Mar 13 09:11:56 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 13 09:11:56 crc restorecon[4686]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 09:11:56 crc restorecon[4686]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc 
restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 09:11:56 crc 
restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 
09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 
crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 
09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 09:11:56 crc 
restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc 
restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc 
restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 09:11:56 crc restorecon[4686]:
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 
crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc 
restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:56 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc 
restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc 
restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 09:11:57 crc 
restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 09:11:57 crc restorecon[4686]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 09:11:57 crc restorecon[4686]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 09:11:57 crc restorecon[4686]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 13 09:11:57 crc kubenswrapper[4841]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 09:11:57 crc kubenswrapper[4841]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 13 09:11:57 crc kubenswrapper[4841]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 09:11:57 crc kubenswrapper[4841]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 13 09:11:57 crc kubenswrapper[4841]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 13 09:11:57 crc kubenswrapper[4841]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.712694 4841 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.725921 4841 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.725959 4841 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.725968 4841 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.725978 4841 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.725986 4841 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.725995 4841 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726006 4841 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726018 4841 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726026 4841 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726035 4841 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726044 4841 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726054 4841 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726065 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726073 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726081 4841 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726090 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726097 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726105 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726113 4841 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726120 4841 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726128 4841 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726136 4841 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726144 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726154 4841 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726163 4841 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726172 4841 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726180 4841 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726187 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726195 4841 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726203 4841 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726225 4841 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726233 4841 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726241 4841 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726249 4841 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726256 4841 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726287 4841 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726296 4841 feature_gate.go:330] unrecognized feature gate: Example
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726304 4841 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726312 4841 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726320 4841 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726327 4841 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726337 4841 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726345 4841 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726354 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726364 4841 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726372 4841 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726380 4841 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726388 4841 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726396 4841 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726403 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726411 4841 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726419 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726426 4841 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726434 4841 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726442 4841 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726449 4841 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726457 4841 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726464 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726471 4841 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726479 4841 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726486 4841 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726494 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726502 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726509 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726517 4841 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726525 4841 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726532 4841 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726540 4841 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726553 4841 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726562 4841 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.726571 4841 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726714 4841 flags.go:64] FLAG: --address="0.0.0.0"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726731 4841 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726747 4841 flags.go:64] FLAG: --anonymous-auth="true"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726758 4841 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726771 4841 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726780 4841 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726792 4841 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726803 4841 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726813 4841 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726821 4841 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726831 4841 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726842 4841 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726852 4841 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726861 4841 flags.go:64] FLAG: --cgroup-root=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726869 4841 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726878 4841 flags.go:64] FLAG: --client-ca-file=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726887 4841 flags.go:64] FLAG: --cloud-config=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726896 4841 flags.go:64] FLAG: --cloud-provider=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726905 4841 flags.go:64] FLAG: --cluster-dns="[]"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726915 4841 flags.go:64] FLAG: --cluster-domain=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726924 4841 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726933 4841 flags.go:64] FLAG: --config-dir=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726942 4841 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726952 4841 flags.go:64] FLAG: --container-log-max-files="5"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726963 4841 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726971 4841 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726980 4841 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726990 4841 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.726999 4841 flags.go:64] FLAG: --contention-profiling="false"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727008 4841 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727017 4841 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727026 4841 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727035 4841 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727046 4841 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727056 4841 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727065 4841 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727073 4841 flags.go:64] FLAG: --enable-load-reader="false"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727082 4841 flags.go:64] FLAG: --enable-server="true"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727091 4841 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727103 4841 flags.go:64] FLAG: --event-burst="100"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727112 4841 flags.go:64] FLAG: --event-qps="50"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727121 4841 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727130 4841 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727139 4841 flags.go:64] FLAG: --eviction-hard=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727150 4841 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727159 4841 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727168 4841 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727177 4841 flags.go:64] FLAG: --eviction-soft=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727187 4841 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727196 4841 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727205 4841 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727215 4841 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727223 4841 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727233 4841 flags.go:64] FLAG: --fail-swap-on="true"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727241 4841 flags.go:64] FLAG: --feature-gates=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727252 4841 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727261 4841 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727293 4841 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727303 4841 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727312 4841 flags.go:64] FLAG: --healthz-port="10248"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727321 4841 flags.go:64] FLAG: --help="false"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727330 4841 flags.go:64] FLAG: --hostname-override=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727338 4841 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727348 4841 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727357 4841 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727366 4841 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727375 4841 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727383 4841 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727392 4841 flags.go:64] FLAG: --image-service-endpoint=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727402 4841 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727411 4841 flags.go:64] FLAG: --kube-api-burst="100"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727420 4841 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727429 4841 flags.go:64] FLAG: --kube-api-qps="50"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727438 4841 flags.go:64] FLAG: --kube-reserved=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727447 4841 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727456 4841 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727465 4841 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727474 4841 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727484 4841 flags.go:64] FLAG: --lock-file=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727492 4841 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727502 4841 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727511 4841 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727533 4841 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727544 4841 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727553 4841 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727563 4841 flags.go:64] FLAG: --logging-format="text"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727571 4841 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727581 4841 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727590 4841 flags.go:64] FLAG: --manifest-url=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727598 4841 flags.go:64] FLAG: --manifest-url-header=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727610 4841 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727618 4841 flags.go:64] FLAG: --max-open-files="1000000"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727629 4841 flags.go:64] FLAG: --max-pods="110"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727638 4841 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727647 4841 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727656 4841 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727665 4841 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727675 4841 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727684 4841 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727694 4841 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727712 4841 flags.go:64] FLAG: --node-status-max-images="50"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727721 4841 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727730 4841 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727739 4841 flags.go:64] FLAG: --pod-cidr=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727748 4841 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727761 4841 flags.go:64] FLAG: --pod-manifest-path=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727770 4841 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727778 4841 flags.go:64] FLAG: --pods-per-core="0"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727787 4841 flags.go:64] FLAG: --port="10250"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727796 4841 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727805 4841 flags.go:64] FLAG: --provider-id=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727814 4841 flags.go:64] FLAG: --qos-reserved=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727823 4841 flags.go:64] FLAG: --read-only-port="10255"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727831 4841 flags.go:64] FLAG: --register-node="true"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727840 4841 flags.go:64] FLAG: --register-schedulable="true"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727850 4841 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727865 4841 flags.go:64] FLAG: --registry-burst="10"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727874 4841 flags.go:64] FLAG: --registry-qps="5"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727883 4841 flags.go:64] FLAG: --reserved-cpus=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727894 4841 flags.go:64] FLAG: --reserved-memory=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727905 4841 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727914 4841 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727923 4841 flags.go:64] FLAG: --rotate-certificates="false"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727932 4841 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727940 4841 flags.go:64] FLAG: --runonce="false"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727949 4841 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727959 4841 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727968 4841 flags.go:64] FLAG: --seccomp-default="false"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727977 4841 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727985 4841 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.727995 4841 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.728005 4841 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.728014 4841 flags.go:64] FLAG: --storage-driver-password="root"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.728023 4841 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.728032 4841 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.728041 4841 flags.go:64] FLAG: --storage-driver-user="root"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.728050 4841 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.728060 4841 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.728069 4841 flags.go:64] FLAG: --system-cgroups=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.728077 4841 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.728091 4841 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.728100 4841 flags.go:64] FLAG: --tls-cert-file=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.728109 4841 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.728120 4841 flags.go:64] FLAG: --tls-min-version=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.728128 4841 flags.go:64] FLAG: --tls-private-key-file=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.728137 4841 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.728146 4841 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.728155 4841 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.728164 4841 flags.go:64] FLAG: --v="2"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.728175 4841 flags.go:64] FLAG: --version="false"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.728186 4841 flags.go:64] FLAG: --vmodule=""
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.728197 4841 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.728207 4841 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728431 4841 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728441 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728452 4841 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728461 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728470 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728478 4841 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728486 4841 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728495 4841 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728503 4841 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728511 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728519 4841 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728527 4841 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728535 4841 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728543 4841 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728551 4841 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728559 4841 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728566 4841 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728576 4841 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728587 4841 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728598 4841 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728607 4841 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728617 4841 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728627 4841 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728636 4841 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728644 4841 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728652 4841 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728660 4841 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728673 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728681 4841 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728688 4841 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728696 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728704 4841 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728711 4841 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728719 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728729 4841 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728736 4841 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728744 4841 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728752 4841 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728762 4841 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728770 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728778 4841 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728786 4841 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728793 4841 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728801 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728809 4841 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728819 4841 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728827 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728836 4841 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728845 4841 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728853 4841 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728862 4841 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728871 4841 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728879 4841 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728888 4841 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728896 4841 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728904 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728912 4841 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728919 4841 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728927 4841 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728937 4841 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728945 4841 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728952 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728960 4841 feature_gate.go:330] unrecognized feature gate: Example
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728968 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728975 4841 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728983 4841 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.728993 4841 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.729001 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.729009 4841 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.729016 4841 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.729024 4841 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.729047 4841 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.741731 4841 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.741787 4841 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.741926 4841 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.741941 4841 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.741951 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.741961 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.741969 4841 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.741977 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.741986 4841 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.741994 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742003 4841 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742011 4841 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742020 4841 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742029 4841 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742037 4841 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742046 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742055 4841 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742063 4841 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742071 4841 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742079 4841 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742088 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742096 4841 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742107 4841 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742118 4841 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742129 4841 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742137 4841 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742147 4841 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742155 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742165 4841 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742176 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742186 4841 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742195 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742204 4841 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742215 4841 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742225 4841 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742233 4841 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742244 4841 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742253 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742286 4841 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742295 4841 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742303 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742312 4841 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742320 4841 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742328 4841 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742337 4841 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742345 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742353 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742361 4841 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742369 4841 
feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742377 4841 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742385 4841 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742394 4841 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742402 4841 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742410 4841 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742417 4841 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742425 4841 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742433 4841 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742441 4841 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742449 4841 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742456 4841 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742464 4841 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742472 4841 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742480 4841 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 09:11:57 crc kubenswrapper[4841]: 
W0313 09:11:57.742487 4841 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742494 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742502 4841 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742509 4841 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742517 4841 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742525 4841 feature_gate.go:330] unrecognized feature gate: Example Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742533 4841 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742540 4841 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742548 4841 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742557 4841 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.742570 4841 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742804 4841 feature_gate.go:330] unrecognized feature gate: 
VSphereDriverConfiguration Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742819 4841 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742829 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742837 4841 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742846 4841 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742855 4841 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742865 4841 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742873 4841 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742881 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742889 4841 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742896 4841 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742904 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742911 4841 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742919 4841 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742927 4841 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 
09:11:57.742936 4841 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742944 4841 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742951 4841 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742958 4841 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742969 4841 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742981 4841 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742989 4841 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.742996 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743004 4841 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743013 4841 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743020 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743028 4841 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743036 4841 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743043 4841 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 
09:11:57.743051 4841 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743058 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743066 4841 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743074 4841 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743081 4841 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743090 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743098 4841 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743105 4841 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743113 4841 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743121 4841 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743129 4841 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743137 4841 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743145 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743152 4841 feature_gate.go:330] unrecognized feature gate: Example Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743160 4841 feature_gate.go:330] unrecognized feature gate: 
MultiArchInstallGCP Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743167 4841 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743175 4841 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743183 4841 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743191 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743199 4841 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743207 4841 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743215 4841 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743223 4841 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743233 4841 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743243 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743253 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743285 4841 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743295 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743306 4841 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743316 4841 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743324 4841 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743332 4841 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743342 4841 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743351 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743359 4841 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743367 4841 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743374 4841 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743382 4841 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743390 4841 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743398 4841 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743406 4841 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.743415 4841 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.743427 4841 feature_gate.go:386] feature gates: 
{map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.743697 4841 server.go:940] "Client rotation is on, will bootstrap in background" Mar 13 09:11:57 crc kubenswrapper[4841]: E0313 09:11:57.751320 4841 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.756418 4841 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.756602 4841 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.758786 4841 server.go:997] "Starting client certificate rotation" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.758846 4841 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.759015 4841 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.786634 4841 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.789242 4841 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 09:11:57 crc kubenswrapper[4841]: E0313 09:11:57.789718 4841 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.808452 4841 log.go:25] "Validated CRI v1 runtime API" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.842114 4841 log.go:25] "Validated CRI v1 image API" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.847215 4841 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.855241 4841 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-13-09-07-49-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.855320 4841 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.882883 4841 manager.go:217] Machine: {Timestamp:2026-03-13 09:11:57.879049028 +0000 UTC m=+0.608949289 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:ea440d89-cc01-4ae5-8bed-355549945eed BootID:256dcd84-6905-44f2-86c8-027636af7678 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:47:26:05 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:47:26:05 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:79:38:0b Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:ba:64:90 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:19:43:3a Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:3c:13:b8 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:0a:29:02:4e:17:63 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:82:f6:00:a0:8b:09 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.883286 4841 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.883453 4841 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.885900 4841 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.886203 4841 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.886258 4841 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],
"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.886653 4841 topology_manager.go:138] "Creating topology manager with none policy" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.886673 4841 container_manager_linux.go:303] "Creating device plugin manager" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.887544 4841 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.887595 4841 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.888638 4841 state_mem.go:36] "Initialized new in-memory state store" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.888771 4841 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.895504 4841 kubelet.go:418] "Attempting to sync node with API server" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.895547 4841 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.895615 4841 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.895687 4841 kubelet.go:324] "Adding apiserver pod source" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.895713 4841 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 
09:11:57.900615 4841 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.902571 4841 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.903786 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.903796 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 13 09:11:57 crc kubenswrapper[4841]: E0313 09:11:57.904024 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 13 09:11:57 crc kubenswrapper[4841]: E0313 09:11:57.903916 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.904589 4841 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 13 09:11:57 
crc kubenswrapper[4841]: I0313 09:11:57.906519 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.906750 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.906911 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.907059 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.907225 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.907427 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.907579 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.907738 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.907918 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.908066 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.908253 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.908450 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.909889 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.910901 4841 server.go:1280] "Started kubelet" Mar 13 09:11:57 crc systemd[1]: Started Kubernetes 
Kubelet. Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.914626 4841 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.915861 4841 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.916450 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.917353 4841 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.921394 4841 server.go:460] "Adding debug handlers to kubelet server" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.929859 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.929917 4841 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.930465 4841 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.930512 4841 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.930703 4841 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 13 09:11:57 crc kubenswrapper[4841]: W0313 09:11:57.931247 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 13 09:11:57 crc kubenswrapper[4841]: E0313 
09:11:57.931351 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:11:57 crc kubenswrapper[4841]: E0313 09:11:57.931359 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.931528 4841 factory.go:55] Registering systemd factory Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.931555 4841 factory.go:221] Registration of the systemd container factory successfully Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.933033 4841 factory.go:153] Registering CRI-O factory Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.933069 4841 factory.go:221] Registration of the crio container factory successfully Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.933160 4841 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.933189 4841 factory.go:103] Registering Raw factory Mar 13 09:11:57 crc kubenswrapper[4841]: E0313 09:11:57.933147 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="200ms" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.933210 4841 manager.go:1196] Started watching for new ooms in manager Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.935391 4841 manager.go:319] Starting 
recovery of all containers Mar 13 09:11:57 crc kubenswrapper[4841]: E0313 09:11:57.933060 4841 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.106:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189c5ba51832d479 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.910848633 +0000 UTC m=+0.640748864,LastTimestamp:2026-03-13 09:11:57.910848633 +0000 UTC m=+0.640748864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.945751 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.945907 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.945998 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946032 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946085 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946171 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946191 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946208 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946227 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946283 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946346 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946376 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946456 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946566 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946622 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946658 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946699 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946716 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946733 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946747 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946787 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946803 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 13 
09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946817 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946877 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946918 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946966 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.946985 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947002 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947016 4841 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947067 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947151 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947230 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947252 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947306 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947321 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947335 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947350 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947418 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947498 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947552 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947564 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947579 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947590 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947628 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947641 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947655 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947751 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947781 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947844 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947860 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947871 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947914 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947952 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.947965 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948004 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948109 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948177 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948189 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948201 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948234 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948246 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948257 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948338 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948427 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948440 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948483 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948495 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948592 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948606 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948659 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948675 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948702 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948734 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948785 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948849 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948906 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948922 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948932 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.948962 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.949012 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.949024 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.949101 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.949173 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.949186 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.949198 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.949211 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.949245 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.949366 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.949438 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" 
seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.949469 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.949489 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.949530 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.949543 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.949555 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.949568 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 
09:11:57.949605 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.949617 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.949648 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.949680 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.949692 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.949705 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.949731 4841 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.951619 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.951668 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.951720 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.951757 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.951792 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.951815 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.951839 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.951861 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.951879 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.951905 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.951928 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.951945 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.951965 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.951979 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.951994 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952014 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952028 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952042 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" 
seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952058 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952073 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952092 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952105 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952118 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952136 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 
09:11:57.952151 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952168 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952181 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952195 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952214 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952228 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952245 4841 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952282 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952298 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952314 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952327 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952343 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952357 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952369 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.952386 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.957902 4841 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.958677 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.958749 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.958773 4841 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.958795 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.958821 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.958849 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.958872 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.958892 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.958913 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.958933 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.958957 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.958977 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959008 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959028 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959048 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959068 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959088 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959160 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959183 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959205 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959228 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959249 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959298 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959320 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959340 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959361 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959384 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" 
seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959403 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959422 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959442 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959464 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959484 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959536 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959556 4841 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959576 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959594 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959614 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959638 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959657 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959676 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959704 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959724 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959745 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959765 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959785 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959805 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959823 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959842 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959861 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959885 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959908 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959928 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" 
seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959948 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959967 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.959986 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.960007 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.960025 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.960045 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.960068 4841 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.960089 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.960109 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.960130 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.960149 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.960169 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.960189 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.960208 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.960227 4841 reconstruct.go:97] "Volume reconstruction finished" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.960241 4841 reconciler.go:26] "Reconciler: start to sync state" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.966452 4841 manager.go:324] Recovery completed Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.980519 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.982658 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.982708 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.982720 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.983396 4841 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.983412 4841 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.983434 4841 state_mem.go:36] "Initialized new in-memory state store" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.990403 4841 kubelet_network_linux.go:50] 
"Initialized iptables rules." protocol="IPv4" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.993631 4841 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.993689 4841 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 13 09:11:57 crc kubenswrapper[4841]: I0313 09:11:57.993747 4841 kubelet.go:2335] "Starting kubelet main sync loop" Mar 13 09:11:57 crc kubenswrapper[4841]: E0313 09:11:57.993828 4841 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 13 09:11:58 crc kubenswrapper[4841]: W0313 09:11:58.000824 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 13 09:11:58 crc kubenswrapper[4841]: E0313 09:11:58.000932 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.005536 4841 policy_none.go:49] "None policy: Start" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.006623 4841 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.006669 4841 state_mem.go:35] "Initializing new in-memory state store" Mar 13 09:11:58 crc kubenswrapper[4841]: E0313 09:11:58.032421 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 
09:11:58.059251 4841 manager.go:334] "Starting Device Plugin manager" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.059348 4841 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.059377 4841 server.go:79] "Starting device plugin registration server" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.059943 4841 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.059992 4841 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.060258 4841 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.060391 4841 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.060414 4841 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 13 09:11:58 crc kubenswrapper[4841]: E0313 09:11:58.073842 4841 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.094241 4841 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.094421 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.096333 4841 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.096394 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.096412 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.096606 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.096795 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.096837 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.097928 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.097985 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.097996 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.098006 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.098064 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.098086 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:11:58 crc kubenswrapper[4841]: 
I0313 09:11:58.098318 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.098338 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.098367 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.099742 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.099778 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.099784 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.099789 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.099817 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.099854 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.100047 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.100085 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.100137 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.101439 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.101474 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.101476 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.101513 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.101533 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.101487 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.101755 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.101898 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.101971 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.102970 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.103021 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.103038 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.103218 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.103309 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.103330 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.103595 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.103668 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.105182 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.105231 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.105252 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:11:58 crc kubenswrapper[4841]: E0313 09:11:58.134404 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="400ms" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.160561 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.161882 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.161906 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.161926 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.161942 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.161944 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.161971 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.161977 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.162004 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.162032 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.162105 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.162157 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.162235 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.162407 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.162460 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.162504 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.162544 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: E0313 09:11:58.162539 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.106:6443: connect: connection refused" node="crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.162603 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.162649 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.162691 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:11:58 crc 
kubenswrapper[4841]: I0313 09:11:58.264438 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.264501 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.264545 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.264587 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.264635 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.264678 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.264691 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.264717 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.264752 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.264909 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.264987 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 09:11:58 crc 
kubenswrapper[4841]: I0313 09:11:58.264807 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.264770 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.264802 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.264810 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.264769 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.265076 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.265161 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.265251 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.265365 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.265457 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.265639 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 
13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.265700 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.265742 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.265811 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.265963 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.265890 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.266032 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.266137 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.266220 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.363513 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.365129 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.365178 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.365195 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.365226 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 09:11:58 crc kubenswrapper[4841]: E0313 09:11:58.365730 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.106:6443: connect: connection refused" node="crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 
09:11:58.429787 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.436966 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.460764 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.467950 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.472983 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:11:58 crc kubenswrapper[4841]: W0313 09:11:58.484991 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-363aedfbff130cfe6390afc42cadb4beb518e0ccc9d7a83d645be5b408d785ae WatchSource:0}: Error finding container 363aedfbff130cfe6390afc42cadb4beb518e0ccc9d7a83d645be5b408d785ae: Status 404 returned error can't find the container with id 363aedfbff130cfe6390afc42cadb4beb518e0ccc9d7a83d645be5b408d785ae Mar 13 09:11:58 crc kubenswrapper[4841]: W0313 09:11:58.485978 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-dfa2095b1379332805539c1d553f3e010bd9cb27419b52c4267f6f1ae2f4efb1 WatchSource:0}: Error finding container dfa2095b1379332805539c1d553f3e010bd9cb27419b52c4267f6f1ae2f4efb1: Status 404 returned error can't find the container with id dfa2095b1379332805539c1d553f3e010bd9cb27419b52c4267f6f1ae2f4efb1 Mar 13 
09:11:58 crc kubenswrapper[4841]: W0313 09:11:58.498492 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-79535b430a3e7414a255bc0ef1306cd42604192b82239b037c5b8790c9920aa7 WatchSource:0}: Error finding container 79535b430a3e7414a255bc0ef1306cd42604192b82239b037c5b8790c9920aa7: Status 404 returned error can't find the container with id 79535b430a3e7414a255bc0ef1306cd42604192b82239b037c5b8790c9920aa7 Mar 13 09:11:58 crc kubenswrapper[4841]: W0313 09:11:58.505456 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-6849324ed814e5324452fd36a30647ada128fe8b5f4cf5d6300cdca96b0c58b4 WatchSource:0}: Error finding container 6849324ed814e5324452fd36a30647ada128fe8b5f4cf5d6300cdca96b0c58b4: Status 404 returned error can't find the container with id 6849324ed814e5324452fd36a30647ada128fe8b5f4cf5d6300cdca96b0c58b4 Mar 13 09:11:58 crc kubenswrapper[4841]: W0313 09:11:58.507651 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-a125b3e0158748e6fe8ea686299a8a175b049db513dece231b1471b309fcaa4a WatchSource:0}: Error finding container a125b3e0158748e6fe8ea686299a8a175b049db513dece231b1471b309fcaa4a: Status 404 returned error can't find the container with id a125b3e0158748e6fe8ea686299a8a175b049db513dece231b1471b309fcaa4a Mar 13 09:11:58 crc kubenswrapper[4841]: E0313 09:11:58.536652 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="800ms" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.766693 4841 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.768082 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.768123 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.768135 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.768159 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 09:11:58 crc kubenswrapper[4841]: E0313 09:11:58.768577 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.106:6443: connect: connection refused" node="crc" Mar 13 09:11:58 crc kubenswrapper[4841]: W0313 09:11:58.793118 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 13 09:11:58 crc kubenswrapper[4841]: E0313 09:11:58.793256 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 13 09:11:58 crc kubenswrapper[4841]: I0313 09:11:58.917558 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 
38.102.83.106:6443: connect: connection refused Mar 13 09:11:59 crc kubenswrapper[4841]: I0313 09:11:59.004390 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a125b3e0158748e6fe8ea686299a8a175b049db513dece231b1471b309fcaa4a"} Mar 13 09:11:59 crc kubenswrapper[4841]: I0313 09:11:59.006159 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6849324ed814e5324452fd36a30647ada128fe8b5f4cf5d6300cdca96b0c58b4"} Mar 13 09:11:59 crc kubenswrapper[4841]: I0313 09:11:59.007412 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"79535b430a3e7414a255bc0ef1306cd42604192b82239b037c5b8790c9920aa7"} Mar 13 09:11:59 crc kubenswrapper[4841]: I0313 09:11:59.008557 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"363aedfbff130cfe6390afc42cadb4beb518e0ccc9d7a83d645be5b408d785ae"} Mar 13 09:11:59 crc kubenswrapper[4841]: I0313 09:11:59.009951 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dfa2095b1379332805539c1d553f3e010bd9cb27419b52c4267f6f1ae2f4efb1"} Mar 13 09:11:59 crc kubenswrapper[4841]: W0313 09:11:59.160186 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 13 
09:11:59 crc kubenswrapper[4841]: E0313 09:11:59.160790 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 13 09:11:59 crc kubenswrapper[4841]: W0313 09:11:59.207518 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 13 09:11:59 crc kubenswrapper[4841]: E0313 09:11:59.207627 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 13 09:11:59 crc kubenswrapper[4841]: E0313 09:11:59.337989 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="1.6s" Mar 13 09:11:59 crc kubenswrapper[4841]: W0313 09:11:59.550629 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 13 09:11:59 crc kubenswrapper[4841]: E0313 09:11:59.550747 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list 
*v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 13 09:11:59 crc kubenswrapper[4841]: I0313 09:11:59.569253 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:11:59 crc kubenswrapper[4841]: I0313 09:11:59.572635 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:11:59 crc kubenswrapper[4841]: I0313 09:11:59.572693 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:11:59 crc kubenswrapper[4841]: I0313 09:11:59.572706 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:11:59 crc kubenswrapper[4841]: I0313 09:11:59.572733 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 09:11:59 crc kubenswrapper[4841]: E0313 09:11:59.573945 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.106:6443: connect: connection refused" node="crc" Mar 13 09:11:59 crc kubenswrapper[4841]: I0313 09:11:59.918027 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 13 09:11:59 crc kubenswrapper[4841]: I0313 09:11:59.968035 4841 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 09:11:59 crc kubenswrapper[4841]: E0313 09:11:59.971130 4841 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot 
create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 13 09:12:00 crc kubenswrapper[4841]: E0313 09:12:00.000408 4841 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.106:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189c5ba51832d479 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.910848633 +0000 UTC m=+0.640748864,LastTimestamp:2026-03-13 09:11:57.910848633 +0000 UTC m=+0.640748864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.018732 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7408253e38e2ed73ba8483cc0114f0e210924d045466866931809cfe30deb850"} Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.018795 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d16119605de1f62b8d63a541d54d65d0b9ece2e285c6012dcf5a56b8aee75482"} Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.018807 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f529bc95349bfcff9cc46f3dd5510b44dd11e4916999c0bcb7367b6b7b2beece"} Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.018817 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"835d958e1267f01e98aac57897a03c29b574dc64de5080416c896cd2f52430be"} Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.018893 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.020988 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.021031 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.021044 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.021918 4841 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="590511c57fa67e49a73d49d811f29ae079b6368c4d0654717bcfe4875adc22f7" exitCode=0 Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.022058 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"590511c57fa67e49a73d49d811f29ae079b6368c4d0654717bcfe4875adc22f7"} Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.022109 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.023975 4841 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.024041 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.024063 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.024675 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c" exitCode=0 Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.024755 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c"} Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.024860 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.025823 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.025864 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.025878 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.028318 4841 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="e7046389acb5adf17631c174269fc8213d96c97b34d7fec4fb806f0694f386a6" exitCode=0 Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.028377 4841 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"e7046389acb5adf17631c174269fc8213d96c97b34d7fec4fb806f0694f386a6"} Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.028462 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.028551 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.029781 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.029802 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.029812 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.032142 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.032212 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.032241 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.041580 4841 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="214377d366263c66ab1db5d541f1cb6d70dda6c79c2b40ed2e051af0d33e7f3f" exitCode=0 Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.041704 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"214377d366263c66ab1db5d541f1cb6d70dda6c79c2b40ed2e051af0d33e7f3f"} Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.041758 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.043338 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.043387 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.043407 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:00 crc kubenswrapper[4841]: W0313 09:12:00.852084 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 13 09:12:00 crc kubenswrapper[4841]: E0313 09:12:00.852146 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 13 09:12:00 crc kubenswrapper[4841]: I0313 09:12:00.917670 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 13 09:12:00 crc kubenswrapper[4841]: E0313 09:12:00.938731 4841 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="3.2s" Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.047218 4841 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0b38bf48d0a30b1d102b2b12b63674ccdce2d5e2930d055c3963682b91d05d93" exitCode=0 Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.047315 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0b38bf48d0a30b1d102b2b12b63674ccdce2d5e2930d055c3963682b91d05d93"} Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.047457 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.048460 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.048490 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.048503 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.055648 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc"} Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.055696 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be"} Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.055716 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c"} Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.055732 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f"} Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.069489 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ffd735570021970d7d0266d4fb83355f1756bc3f8865ccfef1cd9af56e4a27b6"} Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.069526 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.074367 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.074433 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.074454 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.078826 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1b168e9e326c08a472e1f5bd3613b3ef8e04dccd8413c39afec4de7b77acbc90"} Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.078901 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.078924 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2676f12937618d46e754871a788463251dfee0af7f0448de0de7b7edfe9896f9"} Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.078957 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"be538268601c5d42bac857e8afe0c1a236ad85f3c98d4598c3a7e82490be4463"} Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.078873 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.081024 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.081071 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.081085 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.081703 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.082831 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.082865 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.082881 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:01 crc kubenswrapper[4841]: W0313 09:12:01.105005 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 13 09:12:01 crc kubenswrapper[4841]: E0313 09:12:01.105118 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.174775 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.175920 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.175964 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.175977 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:01 crc kubenswrapper[4841]: I0313 09:12:01.176004 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 09:12:01 crc kubenswrapper[4841]: E0313 
09:12:01.176503 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.106:6443: connect: connection refused" node="crc" Mar 13 09:12:01 crc kubenswrapper[4841]: W0313 09:12:01.440955 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Mar 13 09:12:01 crc kubenswrapper[4841]: E0313 09:12:01.441098 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Mar 13 09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.084718 4841 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3fc09bbd89cb0129ffe1030a0a36d661693c956eea136016aa90acb43a6ae684" exitCode=0 Mar 13 09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.084811 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3fc09bbd89cb0129ffe1030a0a36d661693c956eea136016aa90acb43a6ae684"} Mar 13 09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.084880 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.086215 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.086253 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 
09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.086287 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.090349 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ffce8058669c1502c30a545a58c17fe27e2ae06dab7e7b04d87a50370c55aebd"} Mar 13 09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.090434 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.090461 4841 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.090490 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.090503 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.090825 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.091652 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.091677 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.091686 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.091987 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:02 crc 
kubenswrapper[4841]: I0313 09:12:02.092026 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.092042 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.092071 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.092106 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.092122 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.092925 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.092958 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:02 crc kubenswrapper[4841]: I0313 09:12:02.092976 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:03 crc kubenswrapper[4841]: I0313 09:12:03.100531 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fab9bddc78b00dff9b2fa8e6cd866192e0f4fd50674210a02c5b7c1adf80083a"} Mar 13 09:12:03 crc kubenswrapper[4841]: I0313 09:12:03.100595 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"68ec919eef616ef7c0b2294d8dee8897e8a317416ea98f51ec6d0cac4b999aea"} Mar 13 09:12:03 crc kubenswrapper[4841]: I0313 
09:12:03.100615 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a48eb24b3fa4340905330f28451fbef8e2c41bccfe4cad4f4ec8feeaee855f60"} Mar 13 09:12:03 crc kubenswrapper[4841]: I0313 09:12:03.100635 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9b29112f33e33bfd413b43a730aa951f1e55edaf76bf762dce2342b833382567"} Mar 13 09:12:03 crc kubenswrapper[4841]: I0313 09:12:03.100754 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:12:03 crc kubenswrapper[4841]: I0313 09:12:03.100851 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:03 crc kubenswrapper[4841]: I0313 09:12:03.102060 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:03 crc kubenswrapper[4841]: I0313 09:12:03.102112 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:03 crc kubenswrapper[4841]: I0313 09:12:03.102130 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:03 crc kubenswrapper[4841]: I0313 09:12:03.988312 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:12:03 crc kubenswrapper[4841]: I0313 09:12:03.988526 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:03 crc kubenswrapper[4841]: I0313 09:12:03.990042 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:03 crc kubenswrapper[4841]: I0313 
09:12:03.990073 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:03 crc kubenswrapper[4841]: I0313 09:12:03.990081 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:03 crc kubenswrapper[4841]: I0313 09:12:03.994815 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:12:04 crc kubenswrapper[4841]: I0313 09:12:04.081349 4841 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 09:12:04 crc kubenswrapper[4841]: I0313 09:12:04.081438 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 09:12:04 crc kubenswrapper[4841]: I0313 09:12:04.107427 4841 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 09:12:04 crc kubenswrapper[4841]: I0313 09:12:04.108242 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:04 crc kubenswrapper[4841]: I0313 09:12:04.108365 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:04 crc kubenswrapper[4841]: I0313 09:12:04.108372 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:04 crc 
kubenswrapper[4841]: I0313 09:12:04.108686 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"df6f29ab90584cbd7a9f3dd03d31693cefa4b06213817745a3fd8305f08009d1"} Mar 13 09:12:04 crc kubenswrapper[4841]: I0313 09:12:04.109731 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:04 crc kubenswrapper[4841]: I0313 09:12:04.109758 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:04 crc kubenswrapper[4841]: I0313 09:12:04.109767 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:04 crc kubenswrapper[4841]: I0313 09:12:04.109793 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:04 crc kubenswrapper[4841]: I0313 09:12:04.109832 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:04 crc kubenswrapper[4841]: I0313 09:12:04.109856 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:04 crc kubenswrapper[4841]: I0313 09:12:04.110640 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:04 crc kubenswrapper[4841]: I0313 09:12:04.110665 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:04 crc kubenswrapper[4841]: I0313 09:12:04.110673 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:04 crc kubenswrapper[4841]: I0313 09:12:04.244609 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:12:04 crc kubenswrapper[4841]: I0313 09:12:04.377005 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:04 crc kubenswrapper[4841]: I0313 09:12:04.378230 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:04 crc kubenswrapper[4841]: I0313 09:12:04.378296 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:04 crc kubenswrapper[4841]: I0313 09:12:04.378312 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:04 crc kubenswrapper[4841]: I0313 09:12:04.378333 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 09:12:05 crc kubenswrapper[4841]: I0313 09:12:05.111712 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:05 crc kubenswrapper[4841]: I0313 09:12:05.111902 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:05 crc kubenswrapper[4841]: I0313 09:12:05.113090 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:05 crc kubenswrapper[4841]: I0313 09:12:05.113123 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:05 crc kubenswrapper[4841]: I0313 09:12:05.113134 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:05 crc kubenswrapper[4841]: I0313 09:12:05.113214 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:05 crc kubenswrapper[4841]: I0313 09:12:05.113251 4841 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:05 crc kubenswrapper[4841]: I0313 09:12:05.113308 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:05 crc kubenswrapper[4841]: I0313 09:12:05.228941 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:12:05 crc kubenswrapper[4841]: I0313 09:12:05.229068 4841 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 09:12:05 crc kubenswrapper[4841]: I0313 09:12:05.229110 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:05 crc kubenswrapper[4841]: I0313 09:12:05.230485 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:05 crc kubenswrapper[4841]: I0313 09:12:05.230544 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:05 crc kubenswrapper[4841]: I0313 09:12:05.230556 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:05 crc kubenswrapper[4841]: I0313 09:12:05.428554 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:12:06 crc kubenswrapper[4841]: I0313 09:12:06.114623 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:06 crc kubenswrapper[4841]: I0313 09:12:06.115699 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:06 crc kubenswrapper[4841]: I0313 09:12:06.115751 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:06 crc kubenswrapper[4841]: I0313 
09:12:06.115769 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:06 crc kubenswrapper[4841]: I0313 09:12:06.238004 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 09:12:06 crc kubenswrapper[4841]: I0313 09:12:06.238231 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:06 crc kubenswrapper[4841]: I0313 09:12:06.239790 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:06 crc kubenswrapper[4841]: I0313 09:12:06.239869 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:06 crc kubenswrapper[4841]: I0313 09:12:06.239897 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:07 crc kubenswrapper[4841]: I0313 09:12:07.403367 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:12:07 crc kubenswrapper[4841]: I0313 09:12:07.403523 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:07 crc kubenswrapper[4841]: I0313 09:12:07.404814 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:07 crc kubenswrapper[4841]: I0313 09:12:07.404874 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:07 crc kubenswrapper[4841]: I0313 09:12:07.404886 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:08 crc kubenswrapper[4841]: E0313 09:12:08.074015 4841 eviction_manager.go:285] "Eviction manager: failed to 
get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 09:12:08 crc kubenswrapper[4841]: I0313 09:12:08.863951 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 13 09:12:08 crc kubenswrapper[4841]: I0313 09:12:08.864252 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:08 crc kubenswrapper[4841]: I0313 09:12:08.865756 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:08 crc kubenswrapper[4841]: I0313 09:12:08.865851 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:08 crc kubenswrapper[4841]: I0313 09:12:08.865869 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:11 crc kubenswrapper[4841]: W0313 09:12:11.893439 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 13 09:12:11 crc kubenswrapper[4841]: I0313 09:12:11.893607 4841 trace.go:236] Trace[1151123430]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Mar-2026 09:12:01.891) (total time: 10001ms): Mar 13 09:12:11 crc kubenswrapper[4841]: Trace[1151123430]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:12:11.893) Mar 13 09:12:11 crc kubenswrapper[4841]: Trace[1151123430]: [10.001944051s] [10.001944051s] END Mar 13 09:12:11 crc kubenswrapper[4841]: E0313 09:12:11.893655 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 13 09:12:11 crc kubenswrapper[4841]: I0313 09:12:11.918233 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 13 09:12:12 crc kubenswrapper[4841]: W0313 09:12:12.009441 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:12Z is after 2026-02-23T05:33:13Z Mar 13 09:12:12 crc kubenswrapper[4841]: E0313 09:12:12.009522 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:12Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 09:12:12 crc kubenswrapper[4841]: I0313 09:12:12.011609 4841 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 09:12:12 crc kubenswrapper[4841]: I0313 09:12:12.011687 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 13 09:12:12 crc kubenswrapper[4841]: E0313 09:12:12.014915 4841 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:12Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 09:12:12 crc kubenswrapper[4841]: E0313 09:12:12.019189 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:12Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 13 09:12:12 crc kubenswrapper[4841]: E0313 09:12:12.019591 4841 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:12Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c5ba51832d479 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.910848633 +0000 UTC m=+0.640748864,LastTimestamp:2026-03-13 09:11:57.910848633 +0000 UTC m=+0.640748864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:12:12 crc kubenswrapper[4841]: I0313 09:12:12.021548 4841 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 09:12:12 crc kubenswrapper[4841]: I0313 09:12:12.021615 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 13 09:12:12 crc kubenswrapper[4841]: W0313 09:12:12.023977 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:12Z is after 2026-02-23T05:33:13Z Mar 13 09:12:12 crc kubenswrapper[4841]: E0313 09:12:12.024097 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:12Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 09:12:12 crc kubenswrapper[4841]: E0313 09:12:12.024554 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:12Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 09:12:12 crc kubenswrapper[4841]: W0313 09:12:12.028936 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:12Z is after 2026-02-23T05:33:13Z Mar 13 09:12:12 crc kubenswrapper[4841]: E0313 09:12:12.029023 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:12Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 09:12:12 crc kubenswrapper[4841]: I0313 09:12:12.132105 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 13 09:12:12 crc kubenswrapper[4841]: I0313 09:12:12.134355 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ffce8058669c1502c30a545a58c17fe27e2ae06dab7e7b04d87a50370c55aebd" exitCode=255 Mar 13 09:12:12 crc kubenswrapper[4841]: I0313 09:12:12.134394 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ffce8058669c1502c30a545a58c17fe27e2ae06dab7e7b04d87a50370c55aebd"} Mar 13 09:12:12 crc kubenswrapper[4841]: I0313 09:12:12.134544 
4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:12 crc kubenswrapper[4841]: I0313 09:12:12.135514 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:12 crc kubenswrapper[4841]: I0313 09:12:12.135547 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:12 crc kubenswrapper[4841]: I0313 09:12:12.135556 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:12 crc kubenswrapper[4841]: I0313 09:12:12.136038 4841 scope.go:117] "RemoveContainer" containerID="ffce8058669c1502c30a545a58c17fe27e2ae06dab7e7b04d87a50370c55aebd" Mar 13 09:12:12 crc kubenswrapper[4841]: I0313 09:12:12.926602 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:12Z is after 2026-02-23T05:33:13Z Mar 13 09:12:13 crc kubenswrapper[4841]: I0313 09:12:13.138129 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 13 09:12:13 crc kubenswrapper[4841]: I0313 09:12:13.140082 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b750bb9d88bbedbd5b0f3bd5b988fa43effea6d30a044679e365485f2286e2bc"} Mar 13 09:12:13 crc kubenswrapper[4841]: I0313 09:12:13.140243 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:13 crc kubenswrapper[4841]: I0313 09:12:13.141207 4841 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:13 crc kubenswrapper[4841]: I0313 09:12:13.141314 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:13 crc kubenswrapper[4841]: I0313 09:12:13.141409 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:13 crc kubenswrapper[4841]: I0313 09:12:13.667742 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 13 09:12:13 crc kubenswrapper[4841]: I0313 09:12:13.668032 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:13 crc kubenswrapper[4841]: I0313 09:12:13.669573 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:13 crc kubenswrapper[4841]: I0313 09:12:13.669618 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:13 crc kubenswrapper[4841]: I0313 09:12:13.669637 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:13 crc kubenswrapper[4841]: I0313 09:12:13.704318 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 13 09:12:13 crc kubenswrapper[4841]: I0313 09:12:13.884886 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 13 09:12:13 crc kubenswrapper[4841]: I0313 09:12:13.923758 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:13Z 
is after 2026-02-23T05:33:13Z Mar 13 09:12:14 crc kubenswrapper[4841]: I0313 09:12:14.082346 4841 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 09:12:14 crc kubenswrapper[4841]: I0313 09:12:14.082462 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 09:12:14 crc kubenswrapper[4841]: I0313 09:12:14.146578 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 13 09:12:14 crc kubenswrapper[4841]: I0313 09:12:14.147620 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 13 09:12:14 crc kubenswrapper[4841]: I0313 09:12:14.150110 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b750bb9d88bbedbd5b0f3bd5b988fa43effea6d30a044679e365485f2286e2bc" exitCode=255 Mar 13 09:12:14 crc kubenswrapper[4841]: I0313 09:12:14.150237 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b750bb9d88bbedbd5b0f3bd5b988fa43effea6d30a044679e365485f2286e2bc"} Mar 13 09:12:14 crc kubenswrapper[4841]: I0313 
09:12:14.150341 4841 scope.go:117] "RemoveContainer" containerID="ffce8058669c1502c30a545a58c17fe27e2ae06dab7e7b04d87a50370c55aebd" Mar 13 09:12:14 crc kubenswrapper[4841]: I0313 09:12:14.150384 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:14 crc kubenswrapper[4841]: I0313 09:12:14.150560 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:14 crc kubenswrapper[4841]: I0313 09:12:14.153927 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:14 crc kubenswrapper[4841]: I0313 09:12:14.153993 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:14 crc kubenswrapper[4841]: I0313 09:12:14.154037 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:14 crc kubenswrapper[4841]: I0313 09:12:14.154045 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:14 crc kubenswrapper[4841]: I0313 09:12:14.154106 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:14 crc kubenswrapper[4841]: I0313 09:12:14.154139 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:14 crc kubenswrapper[4841]: I0313 09:12:14.155491 4841 scope.go:117] "RemoveContainer" containerID="b750bb9d88bbedbd5b0f3bd5b988fa43effea6d30a044679e365485f2286e2bc" Mar 13 09:12:14 crc kubenswrapper[4841]: E0313 09:12:14.157916 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 09:12:14 crc kubenswrapper[4841]: I0313 09:12:14.922990 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:14Z is after 2026-02-23T05:33:13Z Mar 13 09:12:15 crc kubenswrapper[4841]: I0313 09:12:15.154605 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 13 09:12:15 crc kubenswrapper[4841]: I0313 09:12:15.157911 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:15 crc kubenswrapper[4841]: I0313 09:12:15.159233 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:15 crc kubenswrapper[4841]: I0313 09:12:15.159329 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:15 crc kubenswrapper[4841]: I0313 09:12:15.159353 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:15 crc kubenswrapper[4841]: W0313 09:12:15.425883 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:15Z is after 2026-02-23T05:33:13Z Mar 13 09:12:15 crc kubenswrapper[4841]: E0313 09:12:15.425976 4841 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 09:12:15 crc kubenswrapper[4841]: I0313 09:12:15.435071 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:12:15 crc kubenswrapper[4841]: I0313 09:12:15.435349 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:15 crc kubenswrapper[4841]: I0313 09:12:15.437189 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:15 crc kubenswrapper[4841]: I0313 09:12:15.437253 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:15 crc kubenswrapper[4841]: I0313 09:12:15.437307 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:15 crc kubenswrapper[4841]: I0313 09:12:15.438236 4841 scope.go:117] "RemoveContainer" containerID="b750bb9d88bbedbd5b0f3bd5b988fa43effea6d30a044679e365485f2286e2bc" Mar 13 09:12:15 crc kubenswrapper[4841]: E0313 09:12:15.438577 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 09:12:15 crc kubenswrapper[4841]: I0313 09:12:15.440394 4841 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:12:15 crc kubenswrapper[4841]: I0313 09:12:15.481151 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:12:15 crc kubenswrapper[4841]: I0313 09:12:15.921052 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:15Z is after 2026-02-23T05:33:13Z Mar 13 09:12:16 crc kubenswrapper[4841]: I0313 09:12:16.161329 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:16 crc kubenswrapper[4841]: I0313 09:12:16.162553 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:16 crc kubenswrapper[4841]: I0313 09:12:16.162589 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:16 crc kubenswrapper[4841]: I0313 09:12:16.162597 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:16 crc kubenswrapper[4841]: I0313 09:12:16.163129 4841 scope.go:117] "RemoveContainer" containerID="b750bb9d88bbedbd5b0f3bd5b988fa43effea6d30a044679e365485f2286e2bc" Mar 13 09:12:16 crc kubenswrapper[4841]: E0313 09:12:16.163310 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 09:12:16 crc kubenswrapper[4841]: I0313 09:12:16.923008 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:16Z is after 2026-02-23T05:33:13Z Mar 13 09:12:17 crc kubenswrapper[4841]: I0313 09:12:17.163947 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:17 crc kubenswrapper[4841]: I0313 09:12:17.165183 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:17 crc kubenswrapper[4841]: I0313 09:12:17.165350 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:17 crc kubenswrapper[4841]: I0313 09:12:17.165376 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:17 crc kubenswrapper[4841]: I0313 09:12:17.166221 4841 scope.go:117] "RemoveContainer" containerID="b750bb9d88bbedbd5b0f3bd5b988fa43effea6d30a044679e365485f2286e2bc" Mar 13 09:12:17 crc kubenswrapper[4841]: E0313 09:12:17.166532 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 09:12:17 crc kubenswrapper[4841]: I0313 09:12:17.409587 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:12:17 crc 
kubenswrapper[4841]: I0313 09:12:17.409792 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:17 crc kubenswrapper[4841]: I0313 09:12:17.411157 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:17 crc kubenswrapper[4841]: I0313 09:12:17.411206 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:17 crc kubenswrapper[4841]: I0313 09:12:17.411223 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:17 crc kubenswrapper[4841]: I0313 09:12:17.922051 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:17Z is after 2026-02-23T05:33:13Z Mar 13 09:12:18 crc kubenswrapper[4841]: E0313 09:12:18.074183 4841 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 09:12:18 crc kubenswrapper[4841]: I0313 09:12:18.425606 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:18 crc kubenswrapper[4841]: E0313 09:12:18.426676 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:18Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 13 09:12:18 crc kubenswrapper[4841]: I0313 09:12:18.427351 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 13 09:12:18 crc kubenswrapper[4841]: I0313 09:12:18.427415 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:18 crc kubenswrapper[4841]: I0313 09:12:18.427443 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:18 crc kubenswrapper[4841]: I0313 09:12:18.427485 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 09:12:18 crc kubenswrapper[4841]: E0313 09:12:18.432977 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:18Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 09:12:18 crc kubenswrapper[4841]: W0313 09:12:18.439123 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:18Z is after 2026-02-23T05:33:13Z Mar 13 09:12:18 crc kubenswrapper[4841]: E0313 09:12:18.439212 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 09:12:18 crc kubenswrapper[4841]: I0313 09:12:18.919858 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:18Z is after 2026-02-23T05:33:13Z Mar 13 09:12:19 crc kubenswrapper[4841]: I0313 09:12:19.923038 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:19Z is after 2026-02-23T05:33:13Z Mar 13 09:12:20 crc kubenswrapper[4841]: I0313 09:12:20.268961 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:12:20 crc kubenswrapper[4841]: I0313 09:12:20.269236 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:20 crc kubenswrapper[4841]: I0313 09:12:20.271071 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:20 crc kubenswrapper[4841]: I0313 09:12:20.271129 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:20 crc kubenswrapper[4841]: I0313 09:12:20.271146 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:20 crc kubenswrapper[4841]: I0313 09:12:20.272084 4841 scope.go:117] "RemoveContainer" containerID="b750bb9d88bbedbd5b0f3bd5b988fa43effea6d30a044679e365485f2286e2bc" Mar 13 09:12:20 crc kubenswrapper[4841]: E0313 09:12:20.272395 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 09:12:20 crc kubenswrapper[4841]: I0313 09:12:20.462724 4841 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 09:12:20 crc kubenswrapper[4841]: E0313 09:12:20.468699 4841 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 09:12:20 crc kubenswrapper[4841]: I0313 09:12:20.922079 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:20Z is after 2026-02-23T05:33:13Z Mar 13 09:12:21 crc kubenswrapper[4841]: W0313 09:12:21.014082 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:21Z is after 2026-02-23T05:33:13Z Mar 13 09:12:21 crc kubenswrapper[4841]: E0313 09:12:21.014186 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:21Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 09:12:21 crc kubenswrapper[4841]: W0313 09:12:21.316466 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:21Z is after 2026-02-23T05:33:13Z Mar 13 09:12:21 crc kubenswrapper[4841]: E0313 09:12:21.316861 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:21Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 09:12:21 crc kubenswrapper[4841]: I0313 09:12:21.922744 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:21Z is after 2026-02-23T05:33:13Z Mar 13 09:12:22 crc kubenswrapper[4841]: E0313 09:12:22.025414 4841 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:22Z is after 2026-02-23T05:33:13Z" 
event="&Event{ObjectMeta:{crc.189c5ba51832d479 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.910848633 +0000 UTC m=+0.640748864,LastTimestamp:2026-03-13 09:11:57.910848633 +0000 UTC m=+0.640748864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:12:22 crc kubenswrapper[4841]: I0313 09:12:22.922012 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:22Z is after 2026-02-23T05:33:13Z Mar 13 09:12:23 crc kubenswrapper[4841]: I0313 09:12:23.922414 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:23Z is after 2026-02-23T05:33:13Z Mar 13 09:12:23 crc kubenswrapper[4841]: W0313 09:12:23.983153 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:23Z is after 2026-02-23T05:33:13Z Mar 13 09:12:23 crc kubenswrapper[4841]: E0313 09:12:23.983295 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to 
list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 09:12:24 crc kubenswrapper[4841]: I0313 09:12:24.082393 4841 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 09:12:24 crc kubenswrapper[4841]: I0313 09:12:24.082466 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 09:12:24 crc kubenswrapper[4841]: I0313 09:12:24.082534 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:12:24 crc kubenswrapper[4841]: I0313 09:12:24.082704 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:24 crc kubenswrapper[4841]: I0313 09:12:24.084221 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:24 crc kubenswrapper[4841]: I0313 09:12:24.084326 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:24 crc kubenswrapper[4841]: I0313 09:12:24.084347 4841 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:24 crc kubenswrapper[4841]: I0313 09:12:24.085163 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"f529bc95349bfcff9cc46f3dd5510b44dd11e4916999c0bcb7367b6b7b2beece"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 13 09:12:24 crc kubenswrapper[4841]: I0313 09:12:24.085469 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://f529bc95349bfcff9cc46f3dd5510b44dd11e4916999c0bcb7367b6b7b2beece" gracePeriod=30 Mar 13 09:12:24 crc kubenswrapper[4841]: I0313 09:12:24.920625 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:24Z is after 2026-02-23T05:33:13Z Mar 13 09:12:25 crc kubenswrapper[4841]: I0313 09:12:25.189365 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 13 09:12:25 crc kubenswrapper[4841]: I0313 09:12:25.190761 4841 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f529bc95349bfcff9cc46f3dd5510b44dd11e4916999c0bcb7367b6b7b2beece" exitCode=255 Mar 13 09:12:25 crc kubenswrapper[4841]: I0313 09:12:25.190836 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f529bc95349bfcff9cc46f3dd5510b44dd11e4916999c0bcb7367b6b7b2beece"}
Mar 13 09:12:25 crc kubenswrapper[4841]: I0313 09:12:25.190886 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b48a1313ef08ab3065fd803efd4e71b468dc9dc0c887af4c705397d07bc3444f"}
Mar 13 09:12:25 crc kubenswrapper[4841]: I0313 09:12:25.191042 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 09:12:25 crc kubenswrapper[4841]: I0313 09:12:25.192427 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 09:12:25 crc kubenswrapper[4841]: I0313 09:12:25.192485 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 09:12:25 crc kubenswrapper[4841]: I0313 09:12:25.192508 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 09:12:25 crc kubenswrapper[4841]: I0313 09:12:25.228969 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 09:12:25 crc kubenswrapper[4841]: E0313 09:12:25.432415 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:25Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 13 09:12:25 crc kubenswrapper[4841]: I0313 09:12:25.433599 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 09:12:25 crc kubenswrapper[4841]: I0313 09:12:25.435414 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 09:12:25 crc kubenswrapper[4841]: I0313 09:12:25.435486 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 09:12:25 crc kubenswrapper[4841]: I0313 09:12:25.435504 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 09:12:25 crc kubenswrapper[4841]: I0313 09:12:25.435542 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 13 09:12:25 crc kubenswrapper[4841]: E0313 09:12:25.438408 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:25Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 13 09:12:25 crc kubenswrapper[4841]: I0313 09:12:25.924679 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:25Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:26 crc kubenswrapper[4841]: I0313 09:12:26.193594 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 09:12:26 crc kubenswrapper[4841]: I0313 09:12:26.196248 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 09:12:26 crc kubenswrapper[4841]: I0313 09:12:26.196405 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 09:12:26 crc kubenswrapper[4841]: I0313 09:12:26.196426 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 09:12:26 crc kubenswrapper[4841]: I0313 09:12:26.922084 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:26Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:27 crc kubenswrapper[4841]: I0313 09:12:27.196788 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 09:12:27 crc kubenswrapper[4841]: I0313 09:12:27.198035 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 09:12:27 crc kubenswrapper[4841]: I0313 09:12:27.198110 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 09:12:27 crc kubenswrapper[4841]: I0313 09:12:27.198134 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 09:12:27 crc kubenswrapper[4841]: I0313 09:12:27.923103 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:27Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:28 crc kubenswrapper[4841]: E0313 09:12:28.074701 4841 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 13 09:12:28 crc kubenswrapper[4841]: I0313 09:12:28.923592 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:28Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:29 crc kubenswrapper[4841]: I0313 09:12:29.922650 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:29Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:30 crc kubenswrapper[4841]: I0313 09:12:30.922251 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:30Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:31 crc kubenswrapper[4841]: I0313 09:12:31.081662 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 09:12:31 crc kubenswrapper[4841]: I0313 09:12:31.081927 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 09:12:31 crc kubenswrapper[4841]: I0313 09:12:31.083562 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 09:12:31 crc kubenswrapper[4841]: I0313 09:12:31.083662 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 09:12:31 crc kubenswrapper[4841]: I0313 09:12:31.083688 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 09:12:31 crc kubenswrapper[4841]: I0313 09:12:31.920591 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:31Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:32 crc kubenswrapper[4841]: E0313 09:12:32.029142 4841 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:32Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c5ba51832d479 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.910848633 +0000 UTC m=+0.640748864,LastTimestamp:2026-03-13 09:11:57.910848633 +0000 UTC m=+0.640748864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 13 09:12:32 crc kubenswrapper[4841]: I0313 09:12:32.438643 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 09:12:32 crc kubenswrapper[4841]: E0313 09:12:32.439686 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:32Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 13 09:12:32 crc kubenswrapper[4841]: I0313 09:12:32.440473 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 09:12:32 crc kubenswrapper[4841]: I0313 09:12:32.440531 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 09:12:32 crc kubenswrapper[4841]: I0313 09:12:32.440551 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 09:12:32 crc kubenswrapper[4841]: I0313 09:12:32.440585 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 13 09:12:32 crc kubenswrapper[4841]: E0313 09:12:32.445715 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:32Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 13 09:12:32 crc kubenswrapper[4841]: I0313 09:12:32.921226 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:32Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:33 crc kubenswrapper[4841]: I0313 09:12:33.921992 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:33Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:33 crc kubenswrapper[4841]: I0313 09:12:33.994521 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 09:12:33 crc kubenswrapper[4841]: I0313 09:12:33.996014 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 09:12:33 crc kubenswrapper[4841]: I0313 09:12:33.996093 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 09:12:33 crc kubenswrapper[4841]: I0313 09:12:33.996122 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 09:12:33 crc kubenswrapper[4841]: I0313 09:12:33.997195 4841 scope.go:117] "RemoveContainer" containerID="b750bb9d88bbedbd5b0f3bd5b988fa43effea6d30a044679e365485f2286e2bc"
Mar 13 09:12:34 crc kubenswrapper[4841]: I0313 09:12:34.082129 4841 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 09:12:34 crc kubenswrapper[4841]: I0313 09:12:34.082340 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 09:12:34 crc kubenswrapper[4841]: I0313 09:12:34.921798 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:34Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:35 crc kubenswrapper[4841]: I0313 09:12:35.221221 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 13 09:12:35 crc kubenswrapper[4841]: I0313 09:12:35.222172 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 13 09:12:35 crc kubenswrapper[4841]: I0313 09:12:35.224871 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="49595d479a1054602923200c6ced6058330b4e5b074ea8a6994405af9529f64d" exitCode=255
Mar 13 09:12:35 crc kubenswrapper[4841]: I0313 09:12:35.224941 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"49595d479a1054602923200c6ced6058330b4e5b074ea8a6994405af9529f64d"}
Mar 13 09:12:35 crc kubenswrapper[4841]: I0313 09:12:35.224994 4841 scope.go:117] "RemoveContainer" containerID="b750bb9d88bbedbd5b0f3bd5b988fa43effea6d30a044679e365485f2286e2bc"
Mar 13 09:12:35 crc kubenswrapper[4841]: I0313 09:12:35.225207 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 09:12:35 crc kubenswrapper[4841]: I0313 09:12:35.227079 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 09:12:35 crc kubenswrapper[4841]: I0313 09:12:35.227152 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 09:12:35 crc kubenswrapper[4841]: I0313 09:12:35.227177 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 09:12:35 crc kubenswrapper[4841]: I0313 09:12:35.228235 4841 scope.go:117] "RemoveContainer" containerID="49595d479a1054602923200c6ced6058330b4e5b074ea8a6994405af9529f64d"
Mar 13 09:12:35 crc kubenswrapper[4841]: E0313 09:12:35.228605 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 09:12:35 crc kubenswrapper[4841]: I0313 09:12:35.480921 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 09:12:35 crc kubenswrapper[4841]: W0313 09:12:35.850366 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:35Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:35 crc kubenswrapper[4841]: E0313 09:12:35.850472 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:35Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 13 09:12:35 crc kubenswrapper[4841]: I0313 09:12:35.923350 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:35Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:36 crc kubenswrapper[4841]: I0313 09:12:36.231197 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 13 09:12:36 crc kubenswrapper[4841]: I0313 09:12:36.234601 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 09:12:36 crc kubenswrapper[4841]: I0313 09:12:36.236044 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 09:12:36 crc kubenswrapper[4841]: I0313 09:12:36.236200 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 09:12:36 crc kubenswrapper[4841]: I0313 09:12:36.236338 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 09:12:36 crc kubenswrapper[4841]: I0313 09:12:36.237364 4841 scope.go:117] "RemoveContainer" containerID="49595d479a1054602923200c6ced6058330b4e5b074ea8a6994405af9529f64d"
Mar 13 09:12:36 crc kubenswrapper[4841]: E0313 09:12:36.237762 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 09:12:36 crc kubenswrapper[4841]: I0313 09:12:36.922578 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:36Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:37 crc kubenswrapper[4841]: I0313 09:12:37.152720 4841 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 13 09:12:37 crc kubenswrapper[4841]: E0313 09:12:37.159024 4841 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 13 09:12:37 crc kubenswrapper[4841]: E0313 09:12:37.160294 4841 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError"
Mar 13 09:12:37 crc kubenswrapper[4841]: W0313 09:12:37.556619 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:37Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:37 crc kubenswrapper[4841]: E0313 09:12:37.556698 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 13 09:12:37 crc kubenswrapper[4841]: I0313 09:12:37.918759 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:37Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:38 crc kubenswrapper[4841]: E0313 09:12:38.074914 4841 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 13 09:12:38 crc kubenswrapper[4841]: W0313 09:12:38.640022 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:38Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:38 crc kubenswrapper[4841]: E0313 09:12:38.640111 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:38Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 13 09:12:38 crc kubenswrapper[4841]: I0313 09:12:38.922184 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:38Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:39 crc kubenswrapper[4841]: E0313 09:12:39.445379 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:39Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 13 09:12:39 crc kubenswrapper[4841]: I0313 09:12:39.446385 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 09:12:39 crc kubenswrapper[4841]: I0313 09:12:39.448213 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 09:12:39 crc kubenswrapper[4841]: I0313 09:12:39.448321 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 09:12:39 crc kubenswrapper[4841]: I0313 09:12:39.448348 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 09:12:39 crc kubenswrapper[4841]: I0313 09:12:39.448391 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 13 09:12:39 crc kubenswrapper[4841]: E0313 09:12:39.452022 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:39Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 13 09:12:39 crc kubenswrapper[4841]: I0313 09:12:39.921840 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:39Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:40 crc kubenswrapper[4841]: I0313 09:12:40.269155 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 09:12:40 crc kubenswrapper[4841]: I0313 09:12:40.269439 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 09:12:40 crc kubenswrapper[4841]: I0313 09:12:40.270710 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 09:12:40 crc kubenswrapper[4841]: I0313 09:12:40.270758 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 09:12:40 crc kubenswrapper[4841]: I0313 09:12:40.270777 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 09:12:40 crc kubenswrapper[4841]: I0313 09:12:40.271572 4841 scope.go:117] "RemoveContainer" containerID="49595d479a1054602923200c6ced6058330b4e5b074ea8a6994405af9529f64d"
Mar 13 09:12:40 crc kubenswrapper[4841]: E0313 09:12:40.271844 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 09:12:40 crc kubenswrapper[4841]: I0313 09:12:40.922621 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:40Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:41 crc kubenswrapper[4841]: I0313 09:12:41.922344 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:41Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:42 crc kubenswrapper[4841]: E0313 09:12:42.032906 4841 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:42Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c5ba51832d479 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.910848633 +0000 UTC m=+0.640748864,LastTimestamp:2026-03-13 09:11:57.910848633 +0000 UTC m=+0.640748864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 13 09:12:42 crc kubenswrapper[4841]: I0313 09:12:42.922352 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:42Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:43 crc kubenswrapper[4841]: I0313 09:12:43.922761 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:43Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:44 crc kubenswrapper[4841]: I0313 09:12:44.088609 4841 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 09:12:44 crc kubenswrapper[4841]: I0313 09:12:44.088796 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 09:12:44 crc kubenswrapper[4841]: I0313 09:12:44.920831 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:44Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:45 crc kubenswrapper[4841]: I0313 09:12:45.922607 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:45Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:46 crc kubenswrapper[4841]: I0313 09:12:46.245993 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 13 09:12:46 crc kubenswrapper[4841]: I0313 09:12:46.246234 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 09:12:46 crc kubenswrapper[4841]: I0313 09:12:46.247925 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 09:12:46 crc kubenswrapper[4841]: I0313 09:12:46.247989 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 09:12:46 crc kubenswrapper[4841]: I0313 09:12:46.248007 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 09:12:46 crc kubenswrapper[4841]: W0313 09:12:46.376551 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:46Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:46 crc kubenswrapper[4841]: E0313 09:12:46.376736 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 13 09:12:46 crc kubenswrapper[4841]: E0313 09:12:46.448933 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:46Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 13 09:12:46 crc kubenswrapper[4841]: I0313 09:12:46.453224 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 09:12:46 crc kubenswrapper[4841]: I0313 09:12:46.454601 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 09:12:46 crc kubenswrapper[4841]: I0313 09:12:46.454651 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 09:12:46 crc kubenswrapper[4841]: I0313 09:12:46.454668 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 09:12:46 crc kubenswrapper[4841]: I0313 09:12:46.454701 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 13 09:12:46 crc kubenswrapper[4841]: E0313 09:12:46.459615 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:46Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 13 09:12:46 crc kubenswrapper[4841]: I0313 09:12:46.922914 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:46Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:47 crc kubenswrapper[4841]: I0313 09:12:47.922417 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:47Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:48 crc kubenswrapper[4841]: E0313 09:12:48.075018 4841 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 13 09:12:48 crc kubenswrapper[4841]: I0313 09:12:48.922888 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:48Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:49 crc kubenswrapper[4841]: I0313 09:12:49.922780 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:49Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:50 crc kubenswrapper[4841]: I0313 09:12:50.921258 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:50Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:51 crc kubenswrapper[4841]: I0313 09:12:51.920874 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:51Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:52 crc kubenswrapper[4841]: E0313 09:12:52.039482 4841 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:52Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c5ba51832d479 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.910848633 +0000 UTC m=+0.640748864,LastTimestamp:2026-03-13 09:11:57.910848633 +0000 UTC m=+0.640748864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 13 09:12:52 crc kubenswrapper[4841]: I0313 09:12:52.922060 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:52Z is after 2026-02-23T05:33:13Z
Mar 13 09:12:52 crc kubenswrapper[4841]: I0313 09:12:52.994556 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 09:12:52 crc kubenswrapper[4841]: I0313 09:12:52.996168 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 09:12:52 crc kubenswrapper[4841]: I0313 09:12:52.996216 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 09:12:52 crc kubenswrapper[4841]: I0313 09:12:52.996230 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 09:12:52 crc kubenswrapper[4841]: I0313 09:12:52.996942 4841 scope.go:117] "RemoveContainer" containerID="49595d479a1054602923200c6ced6058330b4e5b074ea8a6994405af9529f64d"
Mar 13 09:12:52 crc kubenswrapper[4841]: E0313 09:12:52.997196 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 09:12:53 crc kubenswrapper[4841]: E0313 09:12:53.453685 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:53Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 13 09:12:53 crc kubenswrapper[4841]: I0313 09:12:53.459736 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 09:12:53 crc kubenswrapper[4841]: I0313 09:12:53.461176 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 09:12:53 crc kubenswrapper[4841]: I0313 09:12:53.461239 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 09:12:53 crc kubenswrapper[4841]: I0313 09:12:53.461261 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 09:12:53 crc kubenswrapper[4841]: I0313 09:12:53.461327 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 13 09:12:53 crc kubenswrapper[4841]: E0313 09:12:53.466333 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:53Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 13 09:12:53 crc kubenswrapper[4841]: I0313 09:12:53.920114 4841 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:53Z is after 2026-02-23T05:33:13Z Mar 13 09:12:54 crc kubenswrapper[4841]: I0313 09:12:54.082742 4841 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 09:12:54 crc kubenswrapper[4841]: I0313 09:12:54.082844 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 09:12:54 crc kubenswrapper[4841]: I0313 09:12:54.082904 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:12:54 crc kubenswrapper[4841]: I0313 09:12:54.083038 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:54 crc kubenswrapper[4841]: I0313 09:12:54.084079 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:54 crc kubenswrapper[4841]: I0313 09:12:54.084104 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:54 crc kubenswrapper[4841]: I0313 09:12:54.084116 4841 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:54 crc kubenswrapper[4841]: I0313 09:12:54.084552 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"b48a1313ef08ab3065fd803efd4e71b468dc9dc0c887af4c705397d07bc3444f"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 13 09:12:54 crc kubenswrapper[4841]: I0313 09:12:54.084646 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://b48a1313ef08ab3065fd803efd4e71b468dc9dc0c887af4c705397d07bc3444f" gracePeriod=30 Mar 13 09:12:54 crc kubenswrapper[4841]: I0313 09:12:54.285117 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 09:12:54 crc kubenswrapper[4841]: I0313 09:12:54.287018 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 13 09:12:54 crc kubenswrapper[4841]: I0313 09:12:54.287738 4841 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b48a1313ef08ab3065fd803efd4e71b468dc9dc0c887af4c705397d07bc3444f" exitCode=255 Mar 13 09:12:54 crc kubenswrapper[4841]: I0313 09:12:54.287822 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b48a1313ef08ab3065fd803efd4e71b468dc9dc0c887af4c705397d07bc3444f"} Mar 13 09:12:54 crc kubenswrapper[4841]: I0313 09:12:54.287919 4841 scope.go:117] "RemoveContainer" containerID="f529bc95349bfcff9cc46f3dd5510b44dd11e4916999c0bcb7367b6b7b2beece" Mar 13 09:12:54 crc kubenswrapper[4841]: I0313 09:12:54.923796 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:54Z is after 2026-02-23T05:33:13Z Mar 13 09:12:55 crc kubenswrapper[4841]: I0313 09:12:55.292112 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 09:12:55 crc kubenswrapper[4841]: I0313 09:12:55.293984 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3bb5d2ddd18b198fb4c473b94295b9755e14c4b010c63cb78bd3cec1c51c812a"} Mar 13 09:12:55 crc kubenswrapper[4841]: I0313 09:12:55.294088 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:55 crc kubenswrapper[4841]: I0313 09:12:55.295339 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:55 crc kubenswrapper[4841]: I0313 09:12:55.295374 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:55 crc kubenswrapper[4841]: I0313 09:12:55.295386 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 
09:12:55 crc kubenswrapper[4841]: I0313 09:12:55.922651 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:55Z is after 2026-02-23T05:33:13Z Mar 13 09:12:56 crc kubenswrapper[4841]: I0313 09:12:56.297120 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:12:56 crc kubenswrapper[4841]: I0313 09:12:56.298473 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:12:56 crc kubenswrapper[4841]: I0313 09:12:56.298543 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:12:56 crc kubenswrapper[4841]: I0313 09:12:56.298565 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:12:56 crc kubenswrapper[4841]: I0313 09:12:56.922428 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:56Z is after 2026-02-23T05:33:13Z Mar 13 09:12:57 crc kubenswrapper[4841]: I0313 09:12:57.924189 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 09:12:58 crc kubenswrapper[4841]: E0313 09:12:58.075309 4841 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 09:12:58 crc 
kubenswrapper[4841]: I0313 09:12:58.922530 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 09:12:59 crc kubenswrapper[4841]: I0313 09:12:59.925094 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 09:13:00 crc kubenswrapper[4841]: E0313 09:13:00.458520 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 09:13:00 crc kubenswrapper[4841]: I0313 09:13:00.466676 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:13:00 crc kubenswrapper[4841]: I0313 09:13:00.467961 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:00 crc kubenswrapper[4841]: I0313 09:13:00.468090 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:00 crc kubenswrapper[4841]: I0313 09:13:00.468182 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:00 crc kubenswrapper[4841]: I0313 09:13:00.468305 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 09:13:00 crc kubenswrapper[4841]: E0313 09:13:00.471929 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" 
node="crc" Mar 13 09:13:00 crc kubenswrapper[4841]: I0313 09:13:00.924506 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 09:13:01 crc kubenswrapper[4841]: I0313 09:13:01.081598 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:13:01 crc kubenswrapper[4841]: I0313 09:13:01.082819 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:13:01 crc kubenswrapper[4841]: I0313 09:13:01.084542 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:01 crc kubenswrapper[4841]: I0313 09:13:01.084602 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:01 crc kubenswrapper[4841]: I0313 09:13:01.084612 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:01 crc kubenswrapper[4841]: I0313 09:13:01.922730 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.045711 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51832d479 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.910848633 +0000 UTC m=+0.640748864,LastTimestamp:2026-03-13 09:11:57.910848633 +0000 UTC m=+0.640748864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.050734 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51c7b13b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982692276 +0000 UTC m=+0.712592477,LastTimestamp:2026-03-13 09:11:57.982692276 +0000 UTC m=+0.712592477,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.055938 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51c7b6de7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: 
NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982715367 +0000 UTC m=+0.712615568,LastTimestamp:2026-03-13 09:11:57.982715367 +0000 UTC m=+0.712615568,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.059649 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51c7b9c96 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982727318 +0000 UTC m=+0.712627519,LastTimestamp:2026-03-13 09:11:57.982727318 +0000 UTC m=+0.712627519,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.064586 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba52134c9db default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:58.061971931 +0000 UTC m=+0.791872162,LastTimestamp:2026-03-13 09:11:58.061971931 +0000 UTC 
m=+0.791872162,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.070459 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c5ba51c7b13b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51c7b13b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982692276 +0000 UTC m=+0.712592477,LastTimestamp:2026-03-13 09:11:58.096367675 +0000 UTC m=+0.826267876,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.075857 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c5ba51c7b6de7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51c7b6de7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982715367 +0000 UTC m=+0.712615568,LastTimestamp:2026-03-13 09:11:58.096405417 +0000 UTC m=+0.826305618,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.079500 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c5ba51c7b9c96\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51c7b9c96 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982727318 +0000 UTC m=+0.712627519,LastTimestamp:2026-03-13 09:11:58.096420857 +0000 UTC m=+0.826321068,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.083404 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c5ba51c7b13b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51c7b13b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982692276 +0000 UTC m=+0.712592477,LastTimestamp:2026-03-13 09:11:58.097960945 +0000 UTC m=+0.827861176,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.087754 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c5ba51c7b6de7\" is 
forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51c7b6de7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982715367 +0000 UTC m=+0.712615568,LastTimestamp:2026-03-13 09:11:58.097997116 +0000 UTC m=+0.827897347,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.092183 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c5ba51c7b13b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51c7b13b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982692276 +0000 UTC m=+0.712592477,LastTimestamp:2026-03-13 09:11:58.098042028 +0000 UTC m=+0.827942259,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.096800 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c5ba51c7b9c96\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189c5ba51c7b9c96 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982727318 +0000 UTC m=+0.712627519,LastTimestamp:2026-03-13 09:11:58.098066289 +0000 UTC m=+0.827966490,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.101152 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c5ba51c7b6de7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51c7b6de7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982715367 +0000 UTC m=+0.712615568,LastTimestamp:2026-03-13 09:11:58.098079109 +0000 UTC m=+0.827979350,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.105679 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c5ba51c7b9c96\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51c7b9c96 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982727318 +0000 UTC m=+0.712627519,LastTimestamp:2026-03-13 09:11:58.09809715 +0000 UTC m=+0.827997381,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.109783 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c5ba51c7b13b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51c7b13b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982692276 +0000 UTC m=+0.712592477,LastTimestamp:2026-03-13 09:11:58.099770091 +0000 UTC m=+0.829670292,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.114616 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c5ba51c7b6de7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51c7b6de7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982715367 +0000 UTC m=+0.712615568,LastTimestamp:2026-03-13 09:11:58.099785581 +0000 UTC m=+0.829685782,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.117003 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c5ba51c7b13b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51c7b13b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982692276 +0000 UTC m=+0.712592477,LastTimestamp:2026-03-13 09:11:58.099805962 +0000 UTC m=+0.829706193,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.120353 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c5ba51c7b9c96\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51c7b9c96 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982727318 +0000 UTC 
m=+0.712627519,LastTimestamp:2026-03-13 09:11:58.099828293 +0000 UTC m=+0.829728494,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.123911 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c5ba51c7b6de7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51c7b6de7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982715367 +0000 UTC m=+0.712615568,LastTimestamp:2026-03-13 09:11:58.099841773 +0000 UTC m=+0.829742004,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.127111 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c5ba51c7b9c96\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51c7b9c96 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982727318 +0000 UTC m=+0.712627519,LastTimestamp:2026-03-13 09:11:58.099865194 +0000 UTC m=+0.829765425,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.130444 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c5ba51c7b13b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51c7b13b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982692276 +0000 UTC m=+0.712592477,LastTimestamp:2026-03-13 09:11:58.101462354 +0000 UTC m=+0.831362555,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.133336 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c5ba51c7b6de7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51c7b6de7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982715367 +0000 UTC m=+0.712615568,LastTimestamp:2026-03-13 09:11:58.101483225 +0000 UTC m=+0.831383426,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.136421 4841 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c5ba51c7b13b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51c7b13b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982692276 +0000 UTC m=+0.712592477,LastTimestamp:2026-03-13 09:11:58.101498755 +0000 UTC m=+0.831398986,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.139544 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c5ba51c7b6de7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51c7b6de7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982715367 +0000 UTC m=+0.712615568,LastTimestamp:2026-03-13 09:11:58.101525416 +0000 UTC m=+0.831425647,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.142653 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c5ba51c7b9c96\" is forbidden: User \"system:anonymous\" cannot patch resource 
\"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c5ba51c7b9c96 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:57.982727318 +0000 UTC m=+0.712627519,LastTimestamp:2026-03-13 09:11:58.101544536 +0000 UTC m=+0.831444767,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.149119 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c5ba53b343889 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:58.498142345 +0000 UTC m=+1.228042566,LastTimestamp:2026-03-13 09:11:58.498142345 +0000 UTC m=+1.228042566,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.155347 4841 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c5ba53b39f839 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:58.498519097 +0000 UTC m=+1.228419328,LastTimestamp:2026-03-13 09:11:58.498519097 +0000 UTC m=+1.228419328,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.161383 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c5ba53b7511de openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 
09:11:58.502392286 +0000 UTC m=+1.232292517,LastTimestamp:2026-03-13 09:11:58.502392286 +0000 UTC m=+1.232292517,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.169142 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba53be7d1e5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:58.509912549 +0000 UTC m=+1.239812740,LastTimestamp:2026-03-13 09:11:58.509912549 +0000 UTC m=+1.239812740,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.174372 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba53c262c4d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:58.513998925 +0000 UTC m=+1.243899136,LastTimestamp:2026-03-13 09:11:58.513998925 +0000 UTC m=+1.243899136,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.179444 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c5ba55e8b1443 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:59.091037251 +0000 UTC m=+1.820937442,LastTimestamp:2026-03-13 09:11:59.091037251 +0000 UTC m=+1.820937442,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.183443 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba55e8c43a1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:59.091114913 +0000 UTC m=+1.821015134,LastTimestamp:2026-03-13 09:11:59.091114913 +0000 UTC m=+1.821015134,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.189433 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c5ba55e8ee35e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:59.091286878 +0000 UTC m=+1.821187099,LastTimestamp:2026-03-13 09:11:59.091286878 +0000 UTC m=+1.821187099,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.195451 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c5ba55f1bac7c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:59.100513404 +0000 UTC m=+1.830413635,LastTimestamp:2026-03-13 09:11:59.100513404 +0000 UTC m=+1.830413635,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.199914 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba55f38b0c3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:59.102415043 +0000 UTC m=+1.832315264,LastTimestamp:2026-03-13 09:11:59.102415043 +0000 UTC m=+1.832315264,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.206479 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c5ba55f6505dd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:59.105320413 +0000 UTC m=+1.835220634,LastTimestamp:2026-03-13 09:11:59.105320413 +0000 UTC m=+1.835220634,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.211312 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c5ba55f7fd835 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:59.107078197 +0000 UTC m=+1.836978388,LastTimestamp:2026-03-13 09:11:59.107078197 +0000 UTC 
m=+1.836978388,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.217298 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c5ba55f8dee1a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:59.108001306 +0000 UTC m=+1.837901527,LastTimestamp:2026-03-13 09:11:59.108001306 +0000 UTC m=+1.837901527,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.230658 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba55f9125fe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:59.108212222 +0000 UTC 
m=+1.838112413,LastTimestamp:2026-03-13 09:11:59.108212222 +0000 UTC m=+1.838112413,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.235174 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c5ba56003fd9d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:59.115738525 +0000 UTC m=+1.845638746,LastTimestamp:2026-03-13 09:11:59.115738525 +0000 UTC m=+1.845638746,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.239623 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba5606f42db openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:59.122768603 +0000 UTC 
m=+1.852668794,LastTimestamp:2026-03-13 09:11:59.122768603 +0000 UTC m=+1.852668794,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.244855 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c5ba572211186 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:59.419634054 +0000 UTC m=+2.149534245,LastTimestamp:2026-03-13 09:11:59.419634054 +0000 UTC m=+2.149534245,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.251358 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c5ba572c5e822 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:59.430436898 +0000 UTC m=+2.160337129,LastTimestamp:2026-03-13 09:11:59.430436898 +0000 UTC m=+2.160337129,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.257673 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c5ba572de8422 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:59.432049698 +0000 UTC m=+2.161949889,LastTimestamp:2026-03-13 09:11:59.432049698 +0000 UTC m=+2.161949889,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.264230 4841 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c5ba5814429a2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:59.673592226 +0000 UTC m=+2.403492447,LastTimestamp:2026-03-13 09:11:59.673592226 +0000 UTC m=+2.403492447,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.270631 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c5ba582a7d90d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:59.696902413 +0000 UTC m=+2.426802614,LastTimestamp:2026-03-13 09:11:59.696902413 +0000 UTC m=+2.426802614,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.276931 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c5ba582c198c7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:59.698589895 +0000 UTC m=+2.428490096,LastTimestamp:2026-03-13 09:11:59.698589895 +0000 UTC m=+2.428490096,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.283611 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c5ba590e9e752 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:59.936112466 +0000 UTC m=+2.666012657,LastTimestamp:2026-03-13 09:11:59.936112466 +0000 UTC m=+2.666012657,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.289663 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c5ba591b86c32 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:59.949646898 +0000 UTC m=+2.679547089,LastTimestamp:2026-03-13 09:11:59.949646898 +0000 UTC m=+2.679547089,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.295150 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba59661ae99 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.027848345 +0000 UTC m=+2.757748556,LastTimestamp:2026-03-13 09:12:00.027848345 +0000 UTC m=+2.757748556,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.301332 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba596692944 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.0283385 +0000 UTC m=+2.758238691,LastTimestamp:2026-03-13 09:12:00.0283385 +0000 UTC m=+2.758238691,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.305840 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c5ba5976b5779 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.045258617 +0000 UTC m=+2.775158818,LastTimestamp:2026-03-13 09:12:00.045258617 +0000 UTC m=+2.775158818,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.309585 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c5ba5977508b9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.045893817 +0000 UTC m=+2.775794018,LastTimestamp:2026-03-13 09:12:00.045893817 +0000 UTC m=+2.775794018,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.316692 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba5a6986803 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.299870211 +0000 UTC m=+3.029770402,LastTimestamp:2026-03-13 09:12:00.299870211 +0000 UTC m=+3.029770402,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.320674 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba5a6f04893 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.305629331 +0000 UTC m=+3.035529512,LastTimestamp:2026-03-13 09:12:00.305629331 +0000 UTC m=+3.035529512,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.324766 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c5ba5a6f5a5be openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.305980862 +0000 UTC m=+3.035881053,LastTimestamp:2026-03-13 09:12:00.305980862 +0000 UTC m=+3.035881053,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.328682 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c5ba5a70268fe 
openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.306817278 +0000 UTC m=+3.036717469,LastTimestamp:2026-03-13 09:12:00.306817278 +0000 UTC m=+3.036717469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.332552 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba5a7705bf6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.314022902 +0000 UTC m=+3.043923103,LastTimestamp:2026-03-13 09:12:00.314022902 +0000 UTC m=+3.043923103,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.337382 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba5a7826e98 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.31520732 +0000 UTC m=+3.045107521,LastTimestamp:2026-03-13 09:12:00.31520732 +0000 UTC m=+3.045107521,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.341236 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c5ba5a79b51a6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.31683831 +0000 UTC m=+3.046738491,LastTimestamp:2026-03-13 09:12:00.31683831 +0000 UTC m=+3.046738491,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: 
E0313 09:13:02.347009 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c5ba5a7aca75e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.317974366 +0000 UTC m=+3.047874557,LastTimestamp:2026-03-13 09:12:00.317974366 +0000 UTC m=+3.047874557,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.352747 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c5ba5a8b27a6b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.335133291 +0000 UTC 
m=+3.065033482,LastTimestamp:2026-03-13 09:12:00.335133291 +0000 UTC m=+3.065033482,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.356401 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba5a8fbda0c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.3399419 +0000 UTC m=+3.069842111,LastTimestamp:2026-03-13 09:12:00.3399419 +0000 UTC m=+3.069842111,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.361669 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c5ba5b509704a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container 
kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.542158922 +0000 UTC m=+3.272059153,LastTimestamp:2026-03-13 09:12:00.542158922 +0000 UTC m=+3.272059153,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.367207 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba5b520b1c0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.543683008 +0000 UTC m=+3.273583199,LastTimestamp:2026-03-13 09:12:00.543683008 +0000 UTC m=+3.273583199,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.371120 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c5ba5b5daf509 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.555889929 +0000 UTC m=+3.285790150,LastTimestamp:2026-03-13 09:12:00.555889929 +0000 UTC m=+3.285790150,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.374618 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c5ba5b5ebe33b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.556999483 +0000 UTC m=+3.286899674,LastTimestamp:2026-03-13 09:12:00.556999483 +0000 UTC m=+3.286899674,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.378502 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba5b623e6be openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.560670398 +0000 UTC m=+3.290570589,LastTimestamp:2026-03-13 09:12:00.560670398 +0000 UTC m=+3.290570589,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.382994 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba5b6522d34 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.563703092 +0000 UTC m=+3.293603283,LastTimestamp:2026-03-13 09:12:00.563703092 +0000 UTC m=+3.293603283,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.387171 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c5ba5c154d660 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.748426848 +0000 UTC m=+3.478327039,LastTimestamp:2026-03-13 09:12:00.748426848 +0000 UTC m=+3.478327039,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.391714 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba5c1718881 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.750307457 
+0000 UTC m=+3.480207648,LastTimestamp:2026-03-13 09:12:00.750307457 +0000 UTC m=+3.480207648,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.397353 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c5ba5c1e5ab5b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.757918555 +0000 UTC m=+3.487818746,LastTimestamp:2026-03-13 09:12:00.757918555 +0000 UTC m=+3.487818746,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.402360 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba5c24508e3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started 
container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.764168419 +0000 UTC m=+3.494068610,LastTimestamp:2026-03-13 09:12:00.764168419 +0000 UTC m=+3.494068610,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.406347 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba5c250f5cb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.764949963 +0000 UTC m=+3.494850154,LastTimestamp:2026-03-13 09:12:00.764949963 +0000 UTC m=+3.494850154,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.411807 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba5cce8f5aa openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.942683562 +0000 UTC m=+3.672583763,LastTimestamp:2026-03-13 09:12:00.942683562 +0000 UTC m=+3.672583763,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.416216 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba5cd743966 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.951810406 +0000 UTC m=+3.681710597,LastTimestamp:2026-03-13 09:12:00.951810406 +0000 UTC m=+3.681710597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.419616 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba5cd861ec1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.952983233 +0000 UTC m=+3.682883424,LastTimestamp:2026-03-13 09:12:00.952983233 +0000 UTC m=+3.682883424,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.424932 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba5d34b2130 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:01.049780528 +0000 UTC m=+3.779680739,LastTimestamp:2026-03-13 09:12:01.049780528 +0000 UTC m=+3.779680739,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.430358 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba5dad0466c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:01.17594686 +0000 UTC m=+3.905847071,LastTimestamp:2026-03-13 09:12:01.17594686 +0000 UTC m=+3.905847071,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.434159 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba5db86c205 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:01.187906053 +0000 UTC m=+3.917806254,LastTimestamp:2026-03-13 
09:12:01.187906053 +0000 UTC m=+3.917806254,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.437341 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba5e1d752ce openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:01.293849294 +0000 UTC m=+4.023749495,LastTimestamp:2026-03-13 09:12:01.293849294 +0000 UTC m=+4.023749495,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.439279 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba5e28db804 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:01.305802756 +0000 UTC m=+4.035702967,LastTimestamp:2026-03-13 09:12:01.305802756 +0000 UTC 
m=+4.035702967,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.443478 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba611277c6b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:02.087631979 +0000 UTC m=+4.817532190,LastTimestamp:2026-03-13 09:12:02.087631979 +0000 UTC m=+4.817532190,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.448952 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba61e6dbe98 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:02.310340248 +0000 UTC 
m=+5.040240479,LastTimestamp:2026-03-13 09:12:02.310340248 +0000 UTC m=+5.040240479,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.452986 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba61f3272e4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:02.32323146 +0000 UTC m=+5.053131681,LastTimestamp:2026-03-13 09:12:02.32323146 +0000 UTC m=+5.053131681,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.456563 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba61f50cd72 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:02.325220722 +0000 UTC m=+5.055120923,LastTimestamp:2026-03-13 09:12:02.325220722 +0000 UTC m=+5.055120923,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.460303 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba62e8b24f5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:02.580702453 +0000 UTC m=+5.310602674,LastTimestamp:2026-03-13 09:12:02.580702453 +0000 UTC m=+5.310602674,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.464184 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba62f4db080 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:02.59345216 +0000 UTC 
m=+5.323352391,LastTimestamp:2026-03-13 09:12:02.59345216 +0000 UTC m=+5.323352391,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.468057 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba62f664db6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:02.59506527 +0000 UTC m=+5.324965501,LastTimestamp:2026-03-13 09:12:02.59506527 +0000 UTC m=+5.324965501,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.472139 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba63bc295a7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container 
etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:02.802439591 +0000 UTC m=+5.532339822,LastTimestamp:2026-03-13 09:12:02.802439591 +0000 UTC m=+5.532339822,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.475736 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba63cbc0ebc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:02.818789052 +0000 UTC m=+5.548689283,LastTimestamp:2026-03-13 09:12:02.818789052 +0000 UTC m=+5.548689283,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.479302 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba63cd178bd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:02.820192445 +0000 UTC m=+5.550092676,LastTimestamp:2026-03-13 09:12:02.820192445 +0000 UTC m=+5.550092676,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.482400 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba64b2fdfaf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:03.061260207 +0000 UTC m=+5.791160398,LastTimestamp:2026-03-13 09:12:03.061260207 +0000 UTC m=+5.791160398,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.486433 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba64bdf4d17 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:03.072757015 +0000 UTC m=+5.802657236,LastTimestamp:2026-03-13 09:12:03.072757015 +0000 UTC m=+5.802657236,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.492180 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba64bf4706e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:03.074142318 +0000 UTC m=+5.804042519,LastTimestamp:2026-03-13 09:12:03.074142318 +0000 UTC m=+5.804042519,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.496573 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba65a3dc507 openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:03.313829127 +0000 UTC m=+6.043729318,LastTimestamp:2026-03-13 09:12:03.313829127 +0000 UTC m=+6.043729318,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.500298 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c5ba65b729fd3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:03.334070227 +0000 UTC m=+6.063970468,LastTimestamp:2026-03-13 09:12:03.334070227 +0000 UTC m=+6.063970468,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.504643 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 09:13:02 crc kubenswrapper[4841]: &Event{ObjectMeta:{kube-controller-manager-crc.189c5ba687fe37a9 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 13 09:13:02 crc kubenswrapper[4841]: body: Mar 13 09:13:02 crc kubenswrapper[4841]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:04.081416105 +0000 UTC m=+6.811316296,LastTimestamp:2026-03-13 09:12:04.081416105 +0000 UTC m=+6.811316296,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 09:13:02 crc kubenswrapper[4841]: > Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.507998 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c5ba687ff135b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:04.081472347 +0000 UTC m=+6.811372538,LastTimestamp:2026-03-13 09:12:04.081472347 +0000 UTC 
m=+6.811372538,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.514114 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 13 09:13:02 crc kubenswrapper[4841]: &Event{ObjectMeta:{kube-apiserver-crc.189c5ba860ac3976 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 13 09:13:02 crc kubenswrapper[4841]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 09:13:02 crc kubenswrapper[4841]: Mar 13 09:13:02 crc kubenswrapper[4841]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:12.011665782 +0000 UTC m=+14.741566003,LastTimestamp:2026-03-13 09:12:12.011665782 +0000 UTC m=+14.741566003,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 09:13:02 crc kubenswrapper[4841]: > Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.517167 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba860ad0bf0 openshift-kube-apiserver 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:12.011719664 +0000 UTC m=+14.741619885,LastTimestamp:2026-03-13 09:12:12.011719664 +0000 UTC m=+14.741619885,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.520992 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c5ba860ac3976\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 13 09:13:02 crc kubenswrapper[4841]: &Event{ObjectMeta:{kube-apiserver-crc.189c5ba860ac3976 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 13 09:13:02 crc kubenswrapper[4841]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 09:13:02 crc kubenswrapper[4841]: Mar 13 09:13:02 crc kubenswrapper[4841]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:12.011665782 +0000 UTC m=+14.741566003,LastTimestamp:2026-03-13 09:12:12.021595791 +0000 UTC 
m=+14.751496002,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 09:13:02 crc kubenswrapper[4841]: > Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.524729 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c5ba860ad0bf0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba860ad0bf0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:12.011719664 +0000 UTC m=+14.741619885,LastTimestamp:2026-03-13 09:12:12.021642883 +0000 UTC m=+14.751543084,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.528540 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c5ba5cd861ec1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba5cd861ec1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:00.952983233 +0000 UTC m=+3.682883424,LastTimestamp:2026-03-13 09:12:12.137926707 +0000 UTC m=+14.867826898,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.532441 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c5ba5dad0466c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba5dad0466c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:01.17594686 +0000 UTC m=+3.905847071,LastTimestamp:2026-03-13 09:12:12.331771017 +0000 UTC m=+15.061671218,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.536726 4841 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.189c5ba5db86c205\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c5ba5db86c205 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:01.187906053 +0000 UTC m=+3.917806254,LastTimestamp:2026-03-13 09:12:12.342221152 +0000 UTC m=+15.072121363,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.542237 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 09:13:02 crc kubenswrapper[4841]: &Event{ObjectMeta:{kube-controller-manager-crc.189c5ba8dc199dfb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 09:13:02 crc kubenswrapper[4841]: body: Mar 13 09:13:02 crc kubenswrapper[4841]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:14.082432507 +0000 UTC m=+16.812332748,LastTimestamp:2026-03-13 09:12:14.082432507 +0000 UTC m=+16.812332748,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 09:13:02 crc kubenswrapper[4841]: > Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.546156 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c5ba8dc1ab0cc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:14.08250286 +0000 UTC m=+16.812403091,LastTimestamp:2026-03-13 09:12:14.08250286 +0000 UTC m=+16.812403091,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.553373 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c5ba8dc199dfb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 09:13:02 crc kubenswrapper[4841]: 
&Event{ObjectMeta:{kube-controller-manager-crc.189c5ba8dc199dfb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 09:13:02 crc kubenswrapper[4841]: body: Mar 13 09:13:02 crc kubenswrapper[4841]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:14.082432507 +0000 UTC m=+16.812332748,LastTimestamp:2026-03-13 09:12:24.082447358 +0000 UTC m=+26.812347579,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 09:13:02 crc kubenswrapper[4841]: > Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.559219 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c5ba8dc1ab0cc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c5ba8dc1ab0cc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:14.08250286 +0000 UTC m=+16.812403091,LastTimestamp:2026-03-13 09:12:24.082503579 +0000 UTC m=+26.812403810,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.563953 4841 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c5bab3053634f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:24.085439311 +0000 UTC m=+26.815339562,LastTimestamp:2026-03-13 09:12:24.085439311 +0000 UTC m=+26.815339562,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.568187 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c5ba55f7fd835\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c5ba55f7fd835 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:59.107078197 +0000 UTC m=+1.836978388,LastTimestamp:2026-03-13 09:12:24.206483493 +0000 UTC m=+26.936383714,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.574468 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c5ba572211186\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c5ba572211186 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:59.419634054 +0000 UTC m=+2.149534245,LastTimestamp:2026-03-13 09:12:24.411091409 +0000 UTC m=+27.140991600,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.580056 4841 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189c5ba572c5e822\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c5ba572c5e822 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:11:59.430436898 +0000 UTC m=+2.160337129,LastTimestamp:2026-03-13 09:12:24.421938816 +0000 UTC m=+27.151839037,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.587779 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c5ba8dc199dfb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 09:13:02 crc kubenswrapper[4841]: &Event{ObjectMeta:{kube-controller-manager-crc.189c5ba8dc199dfb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 
13 09:13:02 crc kubenswrapper[4841]: body: Mar 13 09:13:02 crc kubenswrapper[4841]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:14.082432507 +0000 UTC m=+16.812332748,LastTimestamp:2026-03-13 09:12:34.082235754 +0000 UTC m=+36.812135975,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 09:13:02 crc kubenswrapper[4841]: > Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.594537 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c5ba8dc1ab0cc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c5ba8dc1ab0cc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:14.08250286 +0000 UTC m=+16.812403091,LastTimestamp:2026-03-13 09:12:34.082382709 +0000 UTC m=+36.812282930,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:13:02 crc kubenswrapper[4841]: E0313 09:13:02.601405 4841 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c5ba8dc199dfb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" 
in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 09:13:02 crc kubenswrapper[4841]: &Event{ObjectMeta:{kube-controller-manager-crc.189c5ba8dc199dfb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 09:13:02 crc kubenswrapper[4841]: body: Mar 13 09:13:02 crc kubenswrapper[4841]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:12:14.082432507 +0000 UTC m=+16.812332748,LastTimestamp:2026-03-13 09:12:44.088681967 +0000 UTC m=+46.818582198,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 09:13:02 crc kubenswrapper[4841]: > Mar 13 09:13:02 crc kubenswrapper[4841]: I0313 09:13:02.921763 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 09:13:04 crc kubenswrapper[4841]: I0313 09:13:03.922453 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 09:13:04 crc kubenswrapper[4841]: I0313 09:13:04.081980 4841 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe 
status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 09:13:04 crc kubenswrapper[4841]: I0313 09:13:04.082061 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 09:13:04 crc kubenswrapper[4841]: I0313 09:13:04.922859 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 09:13:05 crc kubenswrapper[4841]: I0313 09:13:05.229641 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:13:05 crc kubenswrapper[4841]: I0313 09:13:05.229808 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:13:05 crc kubenswrapper[4841]: I0313 09:13:05.230995 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:05 crc kubenswrapper[4841]: I0313 09:13:05.231064 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:05 crc kubenswrapper[4841]: I0313 09:13:05.231076 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:05 crc kubenswrapper[4841]: I0313 09:13:05.923175 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 09:13:05 crc kubenswrapper[4841]: I0313 09:13:05.994942 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:13:05 crc kubenswrapper[4841]: I0313 09:13:05.996384 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:05 crc kubenswrapper[4841]: I0313 09:13:05.996424 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:05 crc kubenswrapper[4841]: I0313 09:13:05.996435 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:05 crc kubenswrapper[4841]: I0313 09:13:05.996991 4841 scope.go:117] "RemoveContainer" containerID="49595d479a1054602923200c6ced6058330b4e5b074ea8a6994405af9529f64d" Mar 13 09:13:06 crc kubenswrapper[4841]: I0313 09:13:06.824842 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 09:13:06 crc kubenswrapper[4841]: I0313 09:13:06.828142 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9dcf7400db6da22bb5b4994f992f3e3574479c07ff50b2e4bf4988f4041476d5"} Mar 13 09:13:06 crc kubenswrapper[4841]: I0313 09:13:06.828352 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:13:06 crc kubenswrapper[4841]: I0313 09:13:06.829950 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:06 crc kubenswrapper[4841]: I0313 09:13:06.829985 4841 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:06 crc kubenswrapper[4841]: I0313 09:13:06.829996 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:06 crc kubenswrapper[4841]: I0313 09:13:06.924736 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 09:13:07 crc kubenswrapper[4841]: E0313 09:13:07.463502 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 09:13:07 crc kubenswrapper[4841]: I0313 09:13:07.472651 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:13:07 crc kubenswrapper[4841]: I0313 09:13:07.473852 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:07 crc kubenswrapper[4841]: I0313 09:13:07.473894 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:07 crc kubenswrapper[4841]: I0313 09:13:07.473904 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:07 crc kubenswrapper[4841]: I0313 09:13:07.473930 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 09:13:07 crc kubenswrapper[4841]: E0313 09:13:07.477960 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" 
Mar 13 09:13:07 crc kubenswrapper[4841]: I0313 09:13:07.831400 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 09:13:07 crc kubenswrapper[4841]: I0313 09:13:07.831972 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 09:13:07 crc kubenswrapper[4841]: I0313 09:13:07.833419 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9dcf7400db6da22bb5b4994f992f3e3574479c07ff50b2e4bf4988f4041476d5" exitCode=255 Mar 13 09:13:07 crc kubenswrapper[4841]: I0313 09:13:07.833450 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9dcf7400db6da22bb5b4994f992f3e3574479c07ff50b2e4bf4988f4041476d5"} Mar 13 09:13:07 crc kubenswrapper[4841]: I0313 09:13:07.833501 4841 scope.go:117] "RemoveContainer" containerID="49595d479a1054602923200c6ced6058330b4e5b074ea8a6994405af9529f64d" Mar 13 09:13:07 crc kubenswrapper[4841]: I0313 09:13:07.833692 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:13:07 crc kubenswrapper[4841]: I0313 09:13:07.834719 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:07 crc kubenswrapper[4841]: I0313 09:13:07.834749 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:07 crc kubenswrapper[4841]: I0313 09:13:07.834758 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:07 crc kubenswrapper[4841]: I0313 09:13:07.835237 4841 scope.go:117] 
"RemoveContainer" containerID="9dcf7400db6da22bb5b4994f992f3e3574479c07ff50b2e4bf4988f4041476d5" Mar 13 09:13:07 crc kubenswrapper[4841]: E0313 09:13:07.835414 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 09:13:07 crc kubenswrapper[4841]: I0313 09:13:07.921844 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 09:13:08 crc kubenswrapper[4841]: E0313 09:13:08.075710 4841 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 09:13:08 crc kubenswrapper[4841]: I0313 09:13:08.838324 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 09:13:08 crc kubenswrapper[4841]: I0313 09:13:08.926285 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 09:13:09 crc kubenswrapper[4841]: I0313 09:13:09.162089 4841 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 09:13:09 crc kubenswrapper[4841]: I0313 09:13:09.175686 4841 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 13 09:13:09 crc 
kubenswrapper[4841]: W0313 09:13:09.509308 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 13 09:13:09 crc kubenswrapper[4841]: E0313 09:13:09.509660 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 13 09:13:09 crc kubenswrapper[4841]: I0313 09:13:09.921727 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 09:13:10 crc kubenswrapper[4841]: I0313 09:13:10.268557 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:13:10 crc kubenswrapper[4841]: I0313 09:13:10.268854 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:13:10 crc kubenswrapper[4841]: I0313 09:13:10.270251 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:10 crc kubenswrapper[4841]: I0313 09:13:10.270330 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:10 crc kubenswrapper[4841]: I0313 09:13:10.270346 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:10 crc kubenswrapper[4841]: I0313 09:13:10.270931 4841 scope.go:117] "RemoveContainer" containerID="9dcf7400db6da22bb5b4994f992f3e3574479c07ff50b2e4bf4988f4041476d5" Mar 13 09:13:10 crc kubenswrapper[4841]: E0313 
09:13:10.271129 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 09:13:10 crc kubenswrapper[4841]: I0313 09:13:10.924522 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 09:13:11 crc kubenswrapper[4841]: I0313 09:13:11.090631 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:13:11 crc kubenswrapper[4841]: I0313 09:13:11.090837 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:13:11 crc kubenswrapper[4841]: I0313 09:13:11.092464 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:11 crc kubenswrapper[4841]: I0313 09:13:11.092533 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:11 crc kubenswrapper[4841]: I0313 09:13:11.092561 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:11 crc kubenswrapper[4841]: I0313 09:13:11.094705 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:13:11 crc kubenswrapper[4841]: I0313 09:13:11.848397 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:13:11 crc 
kubenswrapper[4841]: I0313 09:13:11.849594 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:11 crc kubenswrapper[4841]: I0313 09:13:11.849652 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:11 crc kubenswrapper[4841]: I0313 09:13:11.849675 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:11 crc kubenswrapper[4841]: I0313 09:13:11.922656 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 09:13:12 crc kubenswrapper[4841]: I0313 09:13:12.923362 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 09:13:13 crc kubenswrapper[4841]: I0313 09:13:13.020553 4841 csr.go:261] certificate signing request csr-82n88 is approved, waiting to be issued Mar 13 09:13:13 crc kubenswrapper[4841]: I0313 09:13:13.029373 4841 csr.go:257] certificate signing request csr-82n88 is issued Mar 13 09:13:13 crc kubenswrapper[4841]: I0313 09:13:13.100645 4841 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 13 09:13:13 crc kubenswrapper[4841]: I0313 09:13:13.759924 4841 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.031793 4841 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-12 20:19:15.716138021 +0000 UTC Mar 13 09:13:14 crc 
kubenswrapper[4841]: I0313 09:13:14.031852 4841 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6587h6m1.68429099s for next certificate rotation Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.478534 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.481738 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.481806 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.481832 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.482207 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.494129 4841 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.494654 4841 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 13 09:13:14 crc kubenswrapper[4841]: E0313 09:13:14.494698 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.500062 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.500124 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.500147 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:14 
crc kubenswrapper[4841]: I0313 09:13:14.500237 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.500258 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:13:14Z","lastTransitionTime":"2026-03-13T09:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 09:13:14 crc kubenswrapper[4841]: E0313 09:13:14.520164 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff54
7002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cd
fc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"nam
es\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.530947 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.531024 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.531053 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.531086 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.531113 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:13:14Z","lastTransitionTime":"2026-03-13T09:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:13:14 crc kubenswrapper[4841]: E0313 09:13:14.549108 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.560775 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.560834 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.560858 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.560889 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.560918 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:13:14Z","lastTransitionTime":"2026-03-13T09:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:13:14 crc kubenswrapper[4841]: E0313 09:13:14.577415 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.588036 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.588089 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.588106 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.588124 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.588143 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:13:14Z","lastTransitionTime":"2026-03-13T09:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:13:14 crc kubenswrapper[4841]: E0313 09:13:14.603018 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:13:14 crc kubenswrapper[4841]: E0313 09:13:14.603259 4841 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 09:13:14 crc kubenswrapper[4841]: E0313 09:13:14.603323 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:14 crc kubenswrapper[4841]: E0313 09:13:14.704429 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:14 crc kubenswrapper[4841]: E0313 09:13:14.805587 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:14 crc kubenswrapper[4841]: E0313 09:13:14.906024 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.994691 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.996451 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.996541 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:14 crc kubenswrapper[4841]: I0313 09:13:14.996560 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:15 crc kubenswrapper[4841]: E0313 09:13:15.006729 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:15 crc kubenswrapper[4841]: E0313 09:13:15.106853 
4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:15 crc kubenswrapper[4841]: E0313 09:13:15.207666 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:15 crc kubenswrapper[4841]: E0313 09:13:15.308663 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:15 crc kubenswrapper[4841]: E0313 09:13:15.409060 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:15 crc kubenswrapper[4841]: I0313 09:13:15.481478 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:13:15 crc kubenswrapper[4841]: I0313 09:13:15.481655 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:13:15 crc kubenswrapper[4841]: I0313 09:13:15.482838 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:15 crc kubenswrapper[4841]: I0313 09:13:15.482900 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:15 crc kubenswrapper[4841]: I0313 09:13:15.482912 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:15 crc kubenswrapper[4841]: I0313 09:13:15.483549 4841 scope.go:117] "RemoveContainer" containerID="9dcf7400db6da22bb5b4994f992f3e3574479c07ff50b2e4bf4988f4041476d5" Mar 13 09:13:15 crc kubenswrapper[4841]: E0313 09:13:15.483753 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 09:13:15 crc kubenswrapper[4841]: E0313 09:13:15.509297 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:15 crc kubenswrapper[4841]: E0313 09:13:15.610110 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:15 crc kubenswrapper[4841]: E0313 09:13:15.710829 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:15 crc kubenswrapper[4841]: E0313 09:13:15.811952 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:15 crc kubenswrapper[4841]: E0313 09:13:15.913175 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:16 crc kubenswrapper[4841]: E0313 09:13:16.013926 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:16 crc kubenswrapper[4841]: E0313 09:13:16.114984 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:16 crc kubenswrapper[4841]: E0313 09:13:16.215975 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:16 crc kubenswrapper[4841]: E0313 09:13:16.316836 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:16 crc kubenswrapper[4841]: E0313 09:13:16.418028 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:16 crc kubenswrapper[4841]: E0313 09:13:16.519181 4841 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 13 09:13:16 crc kubenswrapper[4841]: E0313 09:13:16.619730 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:16 crc kubenswrapper[4841]: E0313 09:13:16.720840 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:16 crc kubenswrapper[4841]: E0313 09:13:16.821256 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:16 crc kubenswrapper[4841]: E0313 09:13:16.922410 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:17 crc kubenswrapper[4841]: E0313 09:13:17.023035 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:17 crc kubenswrapper[4841]: E0313 09:13:17.124227 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:17 crc kubenswrapper[4841]: E0313 09:13:17.224981 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:17 crc kubenswrapper[4841]: E0313 09:13:17.325333 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:17 crc kubenswrapper[4841]: E0313 09:13:17.425540 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:17 crc kubenswrapper[4841]: E0313 09:13:17.526057 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:17 crc kubenswrapper[4841]: E0313 09:13:17.626975 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:17 crc kubenswrapper[4841]: E0313 09:13:17.728073 4841 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:17 crc kubenswrapper[4841]: E0313 09:13:17.828833 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:17 crc kubenswrapper[4841]: E0313 09:13:17.930449 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:18 crc kubenswrapper[4841]: E0313 09:13:18.030854 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:18 crc kubenswrapper[4841]: E0313 09:13:18.077015 4841 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 09:13:18 crc kubenswrapper[4841]: E0313 09:13:18.131037 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:18 crc kubenswrapper[4841]: E0313 09:13:18.231988 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:18 crc kubenswrapper[4841]: I0313 09:13:18.234310 4841 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 13 09:13:18 crc kubenswrapper[4841]: E0313 09:13:18.332932 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:18 crc kubenswrapper[4841]: E0313 09:13:18.433154 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:18 crc kubenswrapper[4841]: E0313 09:13:18.533750 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:18 crc kubenswrapper[4841]: E0313 09:13:18.634810 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 
09:13:18 crc kubenswrapper[4841]: E0313 09:13:18.735927 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:18 crc kubenswrapper[4841]: E0313 09:13:18.836971 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:18 crc kubenswrapper[4841]: E0313 09:13:18.937820 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:19 crc kubenswrapper[4841]: E0313 09:13:19.038338 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:19 crc kubenswrapper[4841]: E0313 09:13:19.138587 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:19 crc kubenswrapper[4841]: E0313 09:13:19.239414 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:19 crc kubenswrapper[4841]: E0313 09:13:19.340584 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:19 crc kubenswrapper[4841]: E0313 09:13:19.441684 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:19 crc kubenswrapper[4841]: E0313 09:13:19.542375 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:19 crc kubenswrapper[4841]: E0313 09:13:19.643486 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:19 crc kubenswrapper[4841]: E0313 09:13:19.743605 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:19 crc kubenswrapper[4841]: E0313 09:13:19.844160 4841 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 13 09:13:19 crc kubenswrapper[4841]: E0313 09:13:19.944851 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:20 crc kubenswrapper[4841]: E0313 09:13:20.045780 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:20 crc kubenswrapper[4841]: E0313 09:13:20.146938 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:20 crc kubenswrapper[4841]: E0313 09:13:20.247833 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:20 crc kubenswrapper[4841]: E0313 09:13:20.348685 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:20 crc kubenswrapper[4841]: E0313 09:13:20.449321 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:20 crc kubenswrapper[4841]: E0313 09:13:20.550231 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:20 crc kubenswrapper[4841]: E0313 09:13:20.650405 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:20 crc kubenswrapper[4841]: E0313 09:13:20.751328 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:20 crc kubenswrapper[4841]: E0313 09:13:20.852374 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:20 crc kubenswrapper[4841]: E0313 09:13:20.953152 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:21 crc kubenswrapper[4841]: E0313 09:13:21.053805 4841 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:21 crc kubenswrapper[4841]: E0313 09:13:21.154204 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:21 crc kubenswrapper[4841]: E0313 09:13:21.255073 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:21 crc kubenswrapper[4841]: E0313 09:13:21.356083 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:21 crc kubenswrapper[4841]: E0313 09:13:21.456348 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:21 crc kubenswrapper[4841]: E0313 09:13:21.556924 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:21 crc kubenswrapper[4841]: E0313 09:13:21.657114 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:21 crc kubenswrapper[4841]: E0313 09:13:21.757968 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:21 crc kubenswrapper[4841]: E0313 09:13:21.859061 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:21 crc kubenswrapper[4841]: E0313 09:13:21.959765 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:22 crc kubenswrapper[4841]: E0313 09:13:22.060628 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:22 crc kubenswrapper[4841]: E0313 09:13:22.161728 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:22 crc 
kubenswrapper[4841]: E0313 09:13:22.262816 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:22 crc kubenswrapper[4841]: E0313 09:13:22.363354 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:22 crc kubenswrapper[4841]: E0313 09:13:22.463573 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:22 crc kubenswrapper[4841]: E0313 09:13:22.563982 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:22 crc kubenswrapper[4841]: E0313 09:13:22.665001 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:22 crc kubenswrapper[4841]: E0313 09:13:22.765852 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:22 crc kubenswrapper[4841]: E0313 09:13:22.866059 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:22 crc kubenswrapper[4841]: E0313 09:13:22.966953 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:23 crc kubenswrapper[4841]: E0313 09:13:23.067357 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:23 crc kubenswrapper[4841]: E0313 09:13:23.168052 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:23 crc kubenswrapper[4841]: E0313 09:13:23.268675 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:23 crc kubenswrapper[4841]: E0313 09:13:23.368822 4841 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 13 09:13:23 crc kubenswrapper[4841]: E0313 09:13:23.469909 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:23 crc kubenswrapper[4841]: E0313 09:13:23.570962 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:23 crc kubenswrapper[4841]: E0313 09:13:23.671327 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:23 crc kubenswrapper[4841]: E0313 09:13:23.771894 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:23 crc kubenswrapper[4841]: E0313 09:13:23.872919 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:23 crc kubenswrapper[4841]: E0313 09:13:23.973894 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:24 crc kubenswrapper[4841]: E0313 09:13:24.074637 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:24 crc kubenswrapper[4841]: E0313 09:13:24.174792 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:24 crc kubenswrapper[4841]: E0313 09:13:24.275326 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:24 crc kubenswrapper[4841]: E0313 09:13:24.376381 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:24 crc kubenswrapper[4841]: E0313 09:13:24.476742 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:24 crc kubenswrapper[4841]: E0313 09:13:24.577834 4841 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:24 crc kubenswrapper[4841]: E0313 09:13:24.654458 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 13 09:13:24 crc kubenswrapper[4841]: I0313 09:13:24.659601 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:24 crc kubenswrapper[4841]: I0313 09:13:24.659651 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:24 crc kubenswrapper[4841]: I0313 09:13:24.659670 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:24 crc kubenswrapper[4841]: I0313 09:13:24.659697 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:13:24 crc kubenswrapper[4841]: I0313 09:13:24.659713 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:13:24Z","lastTransitionTime":"2026-03-13T09:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:13:24 crc kubenswrapper[4841]: E0313 09:13:24.674044 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:13:24 crc kubenswrapper[4841]: I0313 09:13:24.678394 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:24 crc kubenswrapper[4841]: I0313 09:13:24.678435 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:24 crc kubenswrapper[4841]: I0313 09:13:24.678453 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:24 crc kubenswrapper[4841]: I0313 09:13:24.678477 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:13:24 crc kubenswrapper[4841]: I0313 09:13:24.678493 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:13:24Z","lastTransitionTime":"2026-03-13T09:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:13:24 crc kubenswrapper[4841]: E0313 09:13:24.690703 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:13:24 crc kubenswrapper[4841]: I0313 09:13:24.695621 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:24 crc kubenswrapper[4841]: I0313 09:13:24.695662 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:24 crc kubenswrapper[4841]: I0313 09:13:24.695681 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:24 crc kubenswrapper[4841]: I0313 09:13:24.695703 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:13:24 crc kubenswrapper[4841]: I0313 09:13:24.695721 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:13:24Z","lastTransitionTime":"2026-03-13T09:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:13:24 crc kubenswrapper[4841]: E0313 09:13:24.710716 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:13:24 crc kubenswrapper[4841]: I0313 09:13:24.715574 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:24 crc kubenswrapper[4841]: I0313 09:13:24.715619 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:24 crc kubenswrapper[4841]: I0313 09:13:24.715636 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:24 crc kubenswrapper[4841]: I0313 09:13:24.715741 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:13:24 crc kubenswrapper[4841]: I0313 09:13:24.715767 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:13:24Z","lastTransitionTime":"2026-03-13T09:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:13:24 crc kubenswrapper[4841]: E0313 09:13:24.731246 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:13:24 crc kubenswrapper[4841]: E0313 09:13:24.731496 4841 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 09:13:24 crc kubenswrapper[4841]: E0313 09:13:24.731534 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:24 crc kubenswrapper[4841]: E0313 09:13:24.831992 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:24 crc kubenswrapper[4841]: E0313 09:13:24.932854 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:25 crc kubenswrapper[4841]: E0313 09:13:25.033633 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:25 crc kubenswrapper[4841]: E0313 09:13:25.134687 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:25 crc kubenswrapper[4841]: E0313 09:13:25.234779 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:25 crc kubenswrapper[4841]: E0313 09:13:25.334934 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:25 crc kubenswrapper[4841]: E0313 
09:13:25.435814 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:25 crc kubenswrapper[4841]: I0313 09:13:25.499598 4841 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 13 09:13:25 crc kubenswrapper[4841]: E0313 09:13:25.536738 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:25 crc kubenswrapper[4841]: E0313 09:13:25.637653 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:25 crc kubenswrapper[4841]: E0313 09:13:25.738822 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:25 crc kubenswrapper[4841]: E0313 09:13:25.839953 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:25 crc kubenswrapper[4841]: E0313 09:13:25.940802 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:26 crc kubenswrapper[4841]: E0313 09:13:26.041131 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:26 crc kubenswrapper[4841]: E0313 09:13:26.142111 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:26 crc kubenswrapper[4841]: I0313 09:13:26.153689 4841 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 13 09:13:26 crc kubenswrapper[4841]: E0313 09:13:26.243709 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:26 crc kubenswrapper[4841]: E0313 09:13:26.343828 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 
09:13:26 crc kubenswrapper[4841]: E0313 09:13:26.444713 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:26 crc kubenswrapper[4841]: E0313 09:13:26.545634 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:26 crc kubenswrapper[4841]: E0313 09:13:26.646436 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:26 crc kubenswrapper[4841]: E0313 09:13:26.747134 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:26 crc kubenswrapper[4841]: E0313 09:13:26.847675 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:26 crc kubenswrapper[4841]: E0313 09:13:26.948023 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:27 crc kubenswrapper[4841]: E0313 09:13:27.048714 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:27 crc kubenswrapper[4841]: E0313 09:13:27.149535 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:27 crc kubenswrapper[4841]: E0313 09:13:27.250110 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:27 crc kubenswrapper[4841]: E0313 09:13:27.351029 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:27 crc kubenswrapper[4841]: E0313 09:13:27.452144 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:27 crc kubenswrapper[4841]: E0313 09:13:27.552494 4841 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 13 09:13:27 crc kubenswrapper[4841]: E0313 09:13:27.653659 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:27 crc kubenswrapper[4841]: E0313 09:13:27.755100 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:27 crc kubenswrapper[4841]: E0313 09:13:27.856144 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:27 crc kubenswrapper[4841]: E0313 09:13:27.957003 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:27 crc kubenswrapper[4841]: I0313 09:13:27.994779 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:13:27 crc kubenswrapper[4841]: I0313 09:13:27.996313 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:27 crc kubenswrapper[4841]: I0313 09:13:27.996385 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:27 crc kubenswrapper[4841]: I0313 09:13:27.996406 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:27 crc kubenswrapper[4841]: I0313 09:13:27.997428 4841 scope.go:117] "RemoveContainer" containerID="9dcf7400db6da22bb5b4994f992f3e3574479c07ff50b2e4bf4988f4041476d5" Mar 13 09:13:27 crc kubenswrapper[4841]: E0313 09:13:27.997714 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 09:13:28 crc kubenswrapper[4841]: E0313 09:13:28.057779 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:28 crc kubenswrapper[4841]: E0313 09:13:28.077144 4841 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 09:13:28 crc kubenswrapper[4841]: E0313 09:13:28.158290 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:28 crc kubenswrapper[4841]: E0313 09:13:28.259146 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:28 crc kubenswrapper[4841]: E0313 09:13:28.360429 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:28 crc kubenswrapper[4841]: E0313 09:13:28.460577 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:28 crc kubenswrapper[4841]: E0313 09:13:28.561760 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:28 crc kubenswrapper[4841]: E0313 09:13:28.662160 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:28 crc kubenswrapper[4841]: E0313 09:13:28.762637 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:28 crc kubenswrapper[4841]: E0313 09:13:28.863068 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:28 crc kubenswrapper[4841]: E0313 09:13:28.964001 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 
09:13:29 crc kubenswrapper[4841]: E0313 09:13:29.065025 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:29 crc kubenswrapper[4841]: E0313 09:13:29.165522 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:29 crc kubenswrapper[4841]: E0313 09:13:29.265866 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:29 crc kubenswrapper[4841]: E0313 09:13:29.366535 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:29 crc kubenswrapper[4841]: E0313 09:13:29.466952 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:29 crc kubenswrapper[4841]: E0313 09:13:29.568080 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:29 crc kubenswrapper[4841]: E0313 09:13:29.668388 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:29 crc kubenswrapper[4841]: E0313 09:13:29.768891 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:29 crc kubenswrapper[4841]: E0313 09:13:29.869592 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:29 crc kubenswrapper[4841]: E0313 09:13:29.969869 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:30 crc kubenswrapper[4841]: E0313 09:13:30.070645 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:30 crc kubenswrapper[4841]: E0313 09:13:30.171018 4841 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 13 09:13:30 crc kubenswrapper[4841]: E0313 09:13:30.271452 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:30 crc kubenswrapper[4841]: E0313 09:13:30.371895 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:30 crc kubenswrapper[4841]: E0313 09:13:30.472481 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:30 crc kubenswrapper[4841]: E0313 09:13:30.572693 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:30 crc kubenswrapper[4841]: E0313 09:13:30.673832 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:30 crc kubenswrapper[4841]: E0313 09:13:30.774825 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:30 crc kubenswrapper[4841]: E0313 09:13:30.875203 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:30 crc kubenswrapper[4841]: E0313 09:13:30.976147 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:30 crc kubenswrapper[4841]: I0313 09:13:30.994882 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:13:30 crc kubenswrapper[4841]: I0313 09:13:30.996348 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:30 crc kubenswrapper[4841]: I0313 09:13:30.996409 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:30 crc kubenswrapper[4841]: I0313 09:13:30.996427 4841 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:31 crc kubenswrapper[4841]: E0313 09:13:31.076979 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:31 crc kubenswrapper[4841]: E0313 09:13:31.177345 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:31 crc kubenswrapper[4841]: E0313 09:13:31.278581 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:31 crc kubenswrapper[4841]: E0313 09:13:31.380228 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:31 crc kubenswrapper[4841]: E0313 09:13:31.481024 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:31 crc kubenswrapper[4841]: E0313 09:13:31.581869 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:31 crc kubenswrapper[4841]: E0313 09:13:31.682484 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:31 crc kubenswrapper[4841]: E0313 09:13:31.783020 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:31 crc kubenswrapper[4841]: E0313 09:13:31.883687 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:31 crc kubenswrapper[4841]: E0313 09:13:31.984671 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:32 crc kubenswrapper[4841]: E0313 09:13:32.085701 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:32 crc 
kubenswrapper[4841]: E0313 09:13:32.186648 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:32 crc kubenswrapper[4841]: E0313 09:13:32.287063 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:32 crc kubenswrapper[4841]: E0313 09:13:32.388021 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:32 crc kubenswrapper[4841]: E0313 09:13:32.488935 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:32 crc kubenswrapper[4841]: E0313 09:13:32.589871 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:32 crc kubenswrapper[4841]: E0313 09:13:32.690696 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:32 crc kubenswrapper[4841]: E0313 09:13:32.791559 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:32 crc kubenswrapper[4841]: E0313 09:13:32.891869 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:32 crc kubenswrapper[4841]: E0313 09:13:32.992035 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:33 crc kubenswrapper[4841]: E0313 09:13:33.092769 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:33 crc kubenswrapper[4841]: E0313 09:13:33.193644 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:33 crc kubenswrapper[4841]: E0313 09:13:33.293854 4841 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 13 09:13:33 crc kubenswrapper[4841]: E0313 09:13:33.394976 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:33 crc kubenswrapper[4841]: E0313 09:13:33.495634 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:33 crc kubenswrapper[4841]: E0313 09:13:33.596857 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:33 crc kubenswrapper[4841]: E0313 09:13:33.697399 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:33 crc kubenswrapper[4841]: E0313 09:13:33.798214 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:33 crc kubenswrapper[4841]: E0313 09:13:33.899243 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:34 crc kubenswrapper[4841]: E0313 09:13:33.999988 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:34 crc kubenswrapper[4841]: E0313 09:13:34.101036 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:34 crc kubenswrapper[4841]: E0313 09:13:34.201933 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:34 crc kubenswrapper[4841]: E0313 09:13:34.302594 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:34 crc kubenswrapper[4841]: E0313 09:13:34.403027 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:34 crc kubenswrapper[4841]: E0313 09:13:34.504236 4841 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:34 crc kubenswrapper[4841]: E0313 09:13:34.605192 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:34 crc kubenswrapper[4841]: E0313 09:13:34.705616 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:34 crc kubenswrapper[4841]: E0313 09:13:34.805763 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:34 crc kubenswrapper[4841]: E0313 09:13:34.905833 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:35 crc kubenswrapper[4841]: E0313 09:13:35.006231 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:35 crc kubenswrapper[4841]: E0313 09:13:35.105154 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 13 09:13:35 crc kubenswrapper[4841]: I0313 09:13:35.109849 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:35 crc kubenswrapper[4841]: I0313 09:13:35.109888 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:35 crc kubenswrapper[4841]: I0313 09:13:35.109901 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:35 crc kubenswrapper[4841]: I0313 09:13:35.109918 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:13:35 crc kubenswrapper[4841]: I0313 09:13:35.109929 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:13:35Z","lastTransitionTime":"2026-03-13T09:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 09:13:35 crc kubenswrapper[4841]: E0313 09:13:35.124578 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:13:35 crc kubenswrapper[4841]: I0313 09:13:35.128880 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:35 crc kubenswrapper[4841]: I0313 09:13:35.128952 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:35 crc kubenswrapper[4841]: I0313 09:13:35.129164 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:35 crc kubenswrapper[4841]: I0313 09:13:35.129194 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:13:35 crc kubenswrapper[4841]: I0313 09:13:35.129222 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:13:35Z","lastTransitionTime":"2026-03-13T09:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:13:35 crc kubenswrapper[4841]: E0313 09:13:35.144083 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:13:35 crc kubenswrapper[4841]: I0313 09:13:35.148026 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:35 crc kubenswrapper[4841]: I0313 09:13:35.148096 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:35 crc kubenswrapper[4841]: I0313 09:13:35.148117 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:35 crc kubenswrapper[4841]: I0313 09:13:35.148140 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:13:35 crc kubenswrapper[4841]: I0313 09:13:35.148157 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:13:35Z","lastTransitionTime":"2026-03-13T09:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:13:35 crc kubenswrapper[4841]: E0313 09:13:35.161561 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:13:35 crc kubenswrapper[4841]: I0313 09:13:35.165584 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:35 crc kubenswrapper[4841]: I0313 09:13:35.165669 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:35 crc kubenswrapper[4841]: I0313 09:13:35.165685 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:35 crc kubenswrapper[4841]: I0313 09:13:35.165702 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:13:35 crc kubenswrapper[4841]: I0313 09:13:35.165715 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:13:35Z","lastTransitionTime":"2026-03-13T09:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:13:35 crc kubenswrapper[4841]: E0313 09:13:35.176836 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:13:35 crc kubenswrapper[4841]: E0313 09:13:35.176977 4841 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 09:13:35 crc kubenswrapper[4841]: E0313 09:13:35.176999 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:35 crc kubenswrapper[4841]: E0313 09:13:35.277655 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:35 crc kubenswrapper[4841]: E0313 09:13:35.378753 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:35 crc kubenswrapper[4841]: E0313 09:13:35.479447 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:35 crc kubenswrapper[4841]: E0313 09:13:35.580577 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:35 crc kubenswrapper[4841]: E0313 09:13:35.681405 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:35 crc kubenswrapper[4841]: E0313 09:13:35.781571 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:35 crc kubenswrapper[4841]: E0313 09:13:35.881944 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:35 crc kubenswrapper[4841]: E0313 09:13:35.982160 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:36 crc kubenswrapper[4841]: E0313 09:13:36.082846 4841 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:36 crc kubenswrapper[4841]: E0313 09:13:36.183483 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:36 crc kubenswrapper[4841]: E0313 09:13:36.284159 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:36 crc kubenswrapper[4841]: E0313 09:13:36.384425 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:36 crc kubenswrapper[4841]: E0313 09:13:36.485446 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:36 crc kubenswrapper[4841]: E0313 09:13:36.586585 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:36 crc kubenswrapper[4841]: E0313 09:13:36.688002 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:36 crc kubenswrapper[4841]: E0313 09:13:36.788465 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:36 crc kubenswrapper[4841]: E0313 09:13:36.889634 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:36 crc kubenswrapper[4841]: E0313 09:13:36.990414 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:37 crc kubenswrapper[4841]: E0313 09:13:37.091347 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:37 crc kubenswrapper[4841]: E0313 09:13:37.192407 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:37 crc 
kubenswrapper[4841]: E0313 09:13:37.293863 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:37 crc kubenswrapper[4841]: E0313 09:13:37.394363 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:37 crc kubenswrapper[4841]: E0313 09:13:37.495713 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:37 crc kubenswrapper[4841]: E0313 09:13:37.596677 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:37 crc kubenswrapper[4841]: E0313 09:13:37.697355 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:37 crc kubenswrapper[4841]: E0313 09:13:37.798353 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:37 crc kubenswrapper[4841]: E0313 09:13:37.899068 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:38 crc kubenswrapper[4841]: E0313 09:13:38.000040 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:38 crc kubenswrapper[4841]: E0313 09:13:38.077915 4841 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 09:13:38 crc kubenswrapper[4841]: E0313 09:13:38.100438 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:38 crc kubenswrapper[4841]: E0313 09:13:38.200874 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:38 crc kubenswrapper[4841]: E0313 09:13:38.301240 4841 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 13 09:13:38 crc kubenswrapper[4841]: E0313 09:13:38.401988 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:38 crc kubenswrapper[4841]: E0313 09:13:38.502923 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:38 crc kubenswrapper[4841]: E0313 09:13:38.603724 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:38 crc kubenswrapper[4841]: E0313 09:13:38.704243 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:38 crc kubenswrapper[4841]: E0313 09:13:38.804875 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:38 crc kubenswrapper[4841]: E0313 09:13:38.905743 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:38 crc kubenswrapper[4841]: I0313 09:13:38.994944 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:13:38 crc kubenswrapper[4841]: I0313 09:13:38.996676 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:38 crc kubenswrapper[4841]: I0313 09:13:38.996758 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:38 crc kubenswrapper[4841]: I0313 09:13:38.996791 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:38 crc kubenswrapper[4841]: I0313 09:13:38.997961 4841 scope.go:117] "RemoveContainer" containerID="9dcf7400db6da22bb5b4994f992f3e3574479c07ff50b2e4bf4988f4041476d5" Mar 13 09:13:38 crc kubenswrapper[4841]: E0313 09:13:38.998333 4841 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 09:13:39 crc kubenswrapper[4841]: E0313 09:13:39.005885 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:39 crc kubenswrapper[4841]: E0313 09:13:39.106368 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:39 crc kubenswrapper[4841]: E0313 09:13:39.206648 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:39 crc kubenswrapper[4841]: E0313 09:13:39.307672 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:39 crc kubenswrapper[4841]: E0313 09:13:39.408328 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:39 crc kubenswrapper[4841]: E0313 09:13:39.509602 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:39 crc kubenswrapper[4841]: E0313 09:13:39.610734 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:39 crc kubenswrapper[4841]: E0313 09:13:39.710966 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:39 crc kubenswrapper[4841]: E0313 09:13:39.811982 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:39 crc kubenswrapper[4841]: E0313 09:13:39.913393 4841 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:40 crc kubenswrapper[4841]: E0313 09:13:40.014079 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:40 crc kubenswrapper[4841]: E0313 09:13:40.115064 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:40 crc kubenswrapper[4841]: E0313 09:13:40.215607 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:40 crc kubenswrapper[4841]: E0313 09:13:40.316058 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:40 crc kubenswrapper[4841]: E0313 09:13:40.416440 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:40 crc kubenswrapper[4841]: E0313 09:13:40.517360 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:40 crc kubenswrapper[4841]: E0313 09:13:40.618518 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:40 crc kubenswrapper[4841]: E0313 09:13:40.718736 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:40 crc kubenswrapper[4841]: E0313 09:13:40.819818 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:40 crc kubenswrapper[4841]: E0313 09:13:40.920354 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:41 crc kubenswrapper[4841]: E0313 09:13:41.021173 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:41 crc 
kubenswrapper[4841]: E0313 09:13:41.122061 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:41 crc kubenswrapper[4841]: E0313 09:13:41.223416 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:41 crc kubenswrapper[4841]: E0313 09:13:41.324465 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:41 crc kubenswrapper[4841]: E0313 09:13:41.425815 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:41 crc kubenswrapper[4841]: E0313 09:13:41.526537 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:41 crc kubenswrapper[4841]: E0313 09:13:41.627560 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:41 crc kubenswrapper[4841]: E0313 09:13:41.727695 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:41 crc kubenswrapper[4841]: E0313 09:13:41.828369 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:41 crc kubenswrapper[4841]: E0313 09:13:41.928996 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:42 crc kubenswrapper[4841]: E0313 09:13:42.030001 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:42 crc kubenswrapper[4841]: E0313 09:13:42.131233 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:42 crc kubenswrapper[4841]: E0313 09:13:42.231438 4841 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 13 09:13:42 crc kubenswrapper[4841]: E0313 09:13:42.331600 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:42 crc kubenswrapper[4841]: E0313 09:13:42.431816 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:42 crc kubenswrapper[4841]: E0313 09:13:42.532213 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:42 crc kubenswrapper[4841]: E0313 09:13:42.632956 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:42 crc kubenswrapper[4841]: E0313 09:13:42.733090 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:42 crc kubenswrapper[4841]: E0313 09:13:42.833543 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:42 crc kubenswrapper[4841]: E0313 09:13:42.934471 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:43 crc kubenswrapper[4841]: E0313 09:13:43.034768 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:43 crc kubenswrapper[4841]: E0313 09:13:43.134920 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:43 crc kubenswrapper[4841]: E0313 09:13:43.235955 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:43 crc kubenswrapper[4841]: E0313 09:13:43.336978 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:43 crc kubenswrapper[4841]: E0313 09:13:43.438148 4841 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:43 crc kubenswrapper[4841]: E0313 09:13:43.538954 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:43 crc kubenswrapper[4841]: E0313 09:13:43.639932 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:43 crc kubenswrapper[4841]: E0313 09:13:43.740924 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:43 crc kubenswrapper[4841]: E0313 09:13:43.842088 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:43 crc kubenswrapper[4841]: E0313 09:13:43.942829 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:44 crc kubenswrapper[4841]: E0313 09:13:44.043687 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:44 crc kubenswrapper[4841]: E0313 09:13:44.143911 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:44 crc kubenswrapper[4841]: E0313 09:13:44.244658 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:44 crc kubenswrapper[4841]: E0313 09:13:44.345041 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:44 crc kubenswrapper[4841]: E0313 09:13:44.445355 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:44 crc kubenswrapper[4841]: E0313 09:13:44.545898 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:44 crc kubenswrapper[4841]: E0313 
09:13:44.646619 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:44 crc kubenswrapper[4841]: E0313 09:13:44.747382 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:44 crc kubenswrapper[4841]: E0313 09:13:44.848188 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:44 crc kubenswrapper[4841]: E0313 09:13:44.949336 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:45 crc kubenswrapper[4841]: E0313 09:13:45.049722 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:45 crc kubenswrapper[4841]: E0313 09:13:45.150354 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:45 crc kubenswrapper[4841]: E0313 09:13:45.251450 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:45 crc kubenswrapper[4841]: E0313 09:13:45.351756 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:45 crc kubenswrapper[4841]: E0313 09:13:45.389026 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 13 09:13:45 crc kubenswrapper[4841]: I0313 09:13:45.394513 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:45 crc kubenswrapper[4841]: I0313 09:13:45.394580 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:45 crc kubenswrapper[4841]: I0313 09:13:45.394602 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 13 09:13:45 crc kubenswrapper[4841]: I0313 09:13:45.394629 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:13:45 crc kubenswrapper[4841]: I0313 09:13:45.394649 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:13:45Z","lastTransitionTime":"2026-03-13T09:13:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 09:13:45 crc kubenswrapper[4841]: E0313 09:13:45.411322 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:13:45 crc kubenswrapper[4841]: I0313 09:13:45.415165 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:45 crc kubenswrapper[4841]: I0313 09:13:45.415227 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:45 crc kubenswrapper[4841]: I0313 09:13:45.415250 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:45 crc kubenswrapper[4841]: I0313 09:13:45.415369 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:13:45 crc kubenswrapper[4841]: I0313 09:13:45.415481 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:13:45Z","lastTransitionTime":"2026-03-13T09:13:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 09:13:45 crc kubenswrapper[4841]: E0313 09:13:45.428076 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:13:45 crc kubenswrapper[4841]: I0313 09:13:45.431569 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:45 crc kubenswrapper[4841]: I0313 09:13:45.431619 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:45 crc kubenswrapper[4841]: I0313 09:13:45.431638 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:45 crc kubenswrapper[4841]: I0313 09:13:45.431662 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:13:45 crc kubenswrapper[4841]: I0313 09:13:45.431678 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:13:45Z","lastTransitionTime":"2026-03-13T09:13:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:13:45 crc kubenswrapper[4841]: E0313 09:13:45.443694 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:13:45 crc kubenswrapper[4841]: I0313 09:13:45.446922 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:45 crc kubenswrapper[4841]: I0313 09:13:45.446967 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:45 crc kubenswrapper[4841]: I0313 09:13:45.446978 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:45 crc kubenswrapper[4841]: I0313 09:13:45.446995 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:13:45 crc kubenswrapper[4841]: I0313 09:13:45.447005 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:13:45Z","lastTransitionTime":"2026-03-13T09:13:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:13:45 crc kubenswrapper[4841]: E0313 09:13:45.457742 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:13:45 crc kubenswrapper[4841]: E0313 09:13:45.457896 4841 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 09:13:45 crc kubenswrapper[4841]: E0313 09:13:45.457919 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:45 crc kubenswrapper[4841]: E0313 09:13:45.558845 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:45 crc kubenswrapper[4841]: E0313 09:13:45.659619 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:45 crc kubenswrapper[4841]: E0313 09:13:45.760630 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:45 crc kubenswrapper[4841]: E0313 09:13:45.860993 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:45 crc kubenswrapper[4841]: E0313 09:13:45.961948 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:46 crc kubenswrapper[4841]: E0313 09:13:46.062921 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:46 crc kubenswrapper[4841]: E0313 09:13:46.164046 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:46 crc kubenswrapper[4841]: E0313 09:13:46.264780 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:46 crc kubenswrapper[4841]: E0313 09:13:46.364909 4841 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:46 crc kubenswrapper[4841]: E0313 09:13:46.465458 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:46 crc kubenswrapper[4841]: E0313 09:13:46.566129 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:46 crc kubenswrapper[4841]: E0313 09:13:46.666888 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:46 crc kubenswrapper[4841]: E0313 09:13:46.767611 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:46 crc kubenswrapper[4841]: E0313 09:13:46.868293 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:46 crc kubenswrapper[4841]: E0313 09:13:46.968693 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:47 crc kubenswrapper[4841]: E0313 09:13:47.069818 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:47 crc kubenswrapper[4841]: E0313 09:13:47.170989 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:47 crc kubenswrapper[4841]: E0313 09:13:47.271883 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:47 crc kubenswrapper[4841]: E0313 09:13:47.373024 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:47 crc kubenswrapper[4841]: E0313 09:13:47.473510 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:47 crc 
kubenswrapper[4841]: E0313 09:13:47.574597 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:47 crc kubenswrapper[4841]: E0313 09:13:47.675750 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:47 crc kubenswrapper[4841]: E0313 09:13:47.776800 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:47 crc kubenswrapper[4841]: E0313 09:13:47.877230 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:47 crc kubenswrapper[4841]: E0313 09:13:47.977863 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:48 crc kubenswrapper[4841]: E0313 09:13:48.079610 4841 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 09:13:48 crc kubenswrapper[4841]: E0313 09:13:48.079615 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:48 crc kubenswrapper[4841]: E0313 09:13:48.180530 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:48 crc kubenswrapper[4841]: E0313 09:13:48.281428 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:48 crc kubenswrapper[4841]: E0313 09:13:48.382647 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:48 crc kubenswrapper[4841]: E0313 09:13:48.483853 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:48 crc kubenswrapper[4841]: E0313 09:13:48.584894 4841 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 13 09:13:48 crc kubenswrapper[4841]: E0313 09:13:48.685782 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:48 crc kubenswrapper[4841]: E0313 09:13:48.786832 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:48 crc kubenswrapper[4841]: E0313 09:13:48.887566 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:48 crc kubenswrapper[4841]: E0313 09:13:48.988722 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:49 crc kubenswrapper[4841]: E0313 09:13:49.089492 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:49 crc kubenswrapper[4841]: E0313 09:13:49.190524 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:49 crc kubenswrapper[4841]: E0313 09:13:49.291396 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:49 crc kubenswrapper[4841]: E0313 09:13:49.392424 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:49 crc kubenswrapper[4841]: E0313 09:13:49.493616 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:49 crc kubenswrapper[4841]: E0313 09:13:49.594414 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:49 crc kubenswrapper[4841]: E0313 09:13:49.695437 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:49 crc kubenswrapper[4841]: E0313 09:13:49.796556 4841 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:49 crc kubenswrapper[4841]: E0313 09:13:49.897349 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:49 crc kubenswrapper[4841]: E0313 09:13:49.997725 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:50 crc kubenswrapper[4841]: E0313 09:13:50.098879 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:50 crc kubenswrapper[4841]: E0313 09:13:50.199802 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:50 crc kubenswrapper[4841]: E0313 09:13:50.300158 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:50 crc kubenswrapper[4841]: E0313 09:13:50.400537 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:50 crc kubenswrapper[4841]: E0313 09:13:50.500816 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:50 crc kubenswrapper[4841]: E0313 09:13:50.601613 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:50 crc kubenswrapper[4841]: E0313 09:13:50.702336 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:50 crc kubenswrapper[4841]: E0313 09:13:50.802471 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:50 crc kubenswrapper[4841]: E0313 09:13:50.903182 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:51 crc 
kubenswrapper[4841]: E0313 09:13:51.004125 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:51 crc kubenswrapper[4841]: E0313 09:13:51.104562 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:51 crc kubenswrapper[4841]: E0313 09:13:51.205656 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:51 crc kubenswrapper[4841]: E0313 09:13:51.306755 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:51 crc kubenswrapper[4841]: E0313 09:13:51.407672 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:51 crc kubenswrapper[4841]: E0313 09:13:51.508581 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:51 crc kubenswrapper[4841]: E0313 09:13:51.609525 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:51 crc kubenswrapper[4841]: E0313 09:13:51.710198 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:51 crc kubenswrapper[4841]: E0313 09:13:51.810867 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:51 crc kubenswrapper[4841]: E0313 09:13:51.911085 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:52 crc kubenswrapper[4841]: E0313 09:13:52.011442 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:52 crc kubenswrapper[4841]: E0313 09:13:52.112432 4841 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 13 09:13:52 crc kubenswrapper[4841]: E0313 09:13:52.212667 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:52 crc kubenswrapper[4841]: E0313 09:13:52.313863 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:52 crc kubenswrapper[4841]: E0313 09:13:52.414832 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:52 crc kubenswrapper[4841]: E0313 09:13:52.515743 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:52 crc kubenswrapper[4841]: E0313 09:13:52.616681 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:52 crc kubenswrapper[4841]: E0313 09:13:52.717833 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:52 crc kubenswrapper[4841]: E0313 09:13:52.818351 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:52 crc kubenswrapper[4841]: E0313 09:13:52.918572 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:52 crc kubenswrapper[4841]: I0313 09:13:52.994756 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:13:52 crc kubenswrapper[4841]: I0313 09:13:52.995843 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:52 crc kubenswrapper[4841]: I0313 09:13:52.996211 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:52 crc kubenswrapper[4841]: I0313 09:13:52.996476 4841 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:52 crc kubenswrapper[4841]: I0313 09:13:52.997665 4841 scope.go:117] "RemoveContainer" containerID="9dcf7400db6da22bb5b4994f992f3e3574479c07ff50b2e4bf4988f4041476d5" Mar 13 09:13:53 crc kubenswrapper[4841]: E0313 09:13:53.019036 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:53 crc kubenswrapper[4841]: E0313 09:13:53.119880 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:53 crc kubenswrapper[4841]: E0313 09:13:53.220214 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:53 crc kubenswrapper[4841]: E0313 09:13:53.320763 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:53 crc kubenswrapper[4841]: E0313 09:13:53.421171 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:53 crc kubenswrapper[4841]: E0313 09:13:53.522211 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:53 crc kubenswrapper[4841]: E0313 09:13:53.623098 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:53 crc kubenswrapper[4841]: E0313 09:13:53.723400 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:53 crc kubenswrapper[4841]: E0313 09:13:53.824024 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:53 crc kubenswrapper[4841]: E0313 09:13:53.924479 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:53 crc kubenswrapper[4841]: 
I0313 09:13:53.967050 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 09:13:53 crc kubenswrapper[4841]: I0313 09:13:53.969536 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec"} Mar 13 09:13:53 crc kubenswrapper[4841]: I0313 09:13:53.969802 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:13:53 crc kubenswrapper[4841]: I0313 09:13:53.971605 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:53 crc kubenswrapper[4841]: I0313 09:13:53.971634 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:53 crc kubenswrapper[4841]: I0313 09:13:53.971642 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:54 crc kubenswrapper[4841]: E0313 09:13:54.024916 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:54 crc kubenswrapper[4841]: E0313 09:13:54.125557 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:54 crc kubenswrapper[4841]: E0313 09:13:54.226324 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:54 crc kubenswrapper[4841]: E0313 09:13:54.327112 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:54 crc kubenswrapper[4841]: E0313 09:13:54.428000 4841 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 13 09:13:54 crc kubenswrapper[4841]: E0313 09:13:54.528857 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:54 crc kubenswrapper[4841]: E0313 09:13:54.629740 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:54 crc kubenswrapper[4841]: E0313 09:13:54.730857 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:54 crc kubenswrapper[4841]: E0313 09:13:54.831357 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:54 crc kubenswrapper[4841]: E0313 09:13:54.931745 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:54 crc kubenswrapper[4841]: I0313 09:13:54.975845 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 13 09:13:54 crc kubenswrapper[4841]: I0313 09:13:54.976758 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 09:13:54 crc kubenswrapper[4841]: I0313 09:13:54.980109 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec" exitCode=255 Mar 13 09:13:54 crc kubenswrapper[4841]: I0313 09:13:54.980173 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec"} Mar 13 09:13:54 crc kubenswrapper[4841]: 
I0313 09:13:54.980232 4841 scope.go:117] "RemoveContainer" containerID="9dcf7400db6da22bb5b4994f992f3e3574479c07ff50b2e4bf4988f4041476d5" Mar 13 09:13:54 crc kubenswrapper[4841]: I0313 09:13:54.980492 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:13:54 crc kubenswrapper[4841]: I0313 09:13:54.981976 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:54 crc kubenswrapper[4841]: I0313 09:13:54.982032 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:54 crc kubenswrapper[4841]: I0313 09:13:54.982057 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:54 crc kubenswrapper[4841]: I0313 09:13:54.984246 4841 scope.go:117] "RemoveContainer" containerID="b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec" Mar 13 09:13:54 crc kubenswrapper[4841]: E0313 09:13:54.984647 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 09:13:55 crc kubenswrapper[4841]: E0313 09:13:55.031868 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:55 crc kubenswrapper[4841]: E0313 09:13:55.132718 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:55 crc kubenswrapper[4841]: E0313 09:13:55.233521 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:55 crc 
kubenswrapper[4841]: E0313 09:13:55.334045 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:55 crc kubenswrapper[4841]: E0313 09:13:55.435236 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.480852 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:13:55 crc kubenswrapper[4841]: E0313 09:13:55.536677 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:55 crc kubenswrapper[4841]: E0313 09:13:55.637694 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:55 crc kubenswrapper[4841]: E0313 09:13:55.738525 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:55 crc kubenswrapper[4841]: E0313 09:13:55.782125 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.787240 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.787439 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.787526 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.787619 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.787708 4841 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:13:55Z","lastTransitionTime":"2026-03-13T09:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 09:13:55 crc kubenswrapper[4841]: E0313 09:13:55.799823 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.806086 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.806358 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.806564 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.806764 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.806949 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:13:55Z","lastTransitionTime":"2026-03-13T09:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:13:55 crc kubenswrapper[4841]: E0313 09:13:55.822809 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.826071 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.826246 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.826416 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.826567 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.826700 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:13:55Z","lastTransitionTime":"2026-03-13T09:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:13:55 crc kubenswrapper[4841]: E0313 09:13:55.836396 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.840096 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.840132 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.840144 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.840160 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.840171 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:13:55Z","lastTransitionTime":"2026-03-13T09:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:13:55 crc kubenswrapper[4841]: E0313 09:13:55.851838 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:13:55 crc kubenswrapper[4841]: E0313 09:13:55.851949 4841 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 09:13:55 crc kubenswrapper[4841]: E0313 09:13:55.851972 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:55 crc kubenswrapper[4841]: E0313 09:13:55.952038 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.987518 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.990705 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.991954 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.992019 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.992044 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:13:55 crc kubenswrapper[4841]: I0313 09:13:55.993041 4841 scope.go:117] "RemoveContainer" containerID="b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec" Mar 13 09:13:55 crc kubenswrapper[4841]: E0313 09:13:55.993372 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 09:13:56 crc kubenswrapper[4841]: E0313 09:13:56.052648 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:56 crc kubenswrapper[4841]: E0313 09:13:56.153635 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:56 crc kubenswrapper[4841]: E0313 09:13:56.254319 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:56 crc kubenswrapper[4841]: E0313 09:13:56.355419 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:56 crc kubenswrapper[4841]: E0313 09:13:56.456346 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:56 crc kubenswrapper[4841]: E0313 09:13:56.557095 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:56 crc kubenswrapper[4841]: E0313 09:13:56.657953 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:56 crc kubenswrapper[4841]: E0313 09:13:56.758721 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:56 crc kubenswrapper[4841]: E0313 09:13:56.859440 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:56 crc kubenswrapper[4841]: E0313 09:13:56.959973 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Mar 13 09:13:57 crc kubenswrapper[4841]: E0313 09:13:57.060476 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:57 crc kubenswrapper[4841]: E0313 09:13:57.160830 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:57 crc kubenswrapper[4841]: E0313 09:13:57.261930 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:57 crc kubenswrapper[4841]: E0313 09:13:57.362387 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:57 crc kubenswrapper[4841]: E0313 09:13:57.463026 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:57 crc kubenswrapper[4841]: E0313 09:13:57.564043 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:57 crc kubenswrapper[4841]: E0313 09:13:57.664916 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:57 crc kubenswrapper[4841]: E0313 09:13:57.765246 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:57 crc kubenswrapper[4841]: E0313 09:13:57.866037 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 09:13:57 crc kubenswrapper[4841]: E0313 09:13:57.966989 4841 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 13 09:13:58 crc kubenswrapper[4841]: E0313 09:13:58.080447 4841 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 09:13:58 crc kubenswrapper[4841]: E0313 09:13:58.090551 4841 kubelet.go:2916] "Container runtime network 
not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 09:13:59 crc kubenswrapper[4841]: I0313 09:13:59.994799 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:13:59 crc kubenswrapper[4841]: I0313 09:13:59.996368 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:13:59 crc kubenswrapper[4841]: I0313 09:13:59.996431 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:13:59 crc kubenswrapper[4841]: I0313 09:13:59.996456 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:00 crc kubenswrapper[4841]: I0313 09:14:00.268607 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:14:00 crc kubenswrapper[4841]: I0313 09:14:00.268947 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 09:14:00 crc kubenswrapper[4841]: I0313 09:14:00.270521 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:00 crc kubenswrapper[4841]: I0313 09:14:00.270561 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:00 crc kubenswrapper[4841]: I0313 09:14:00.270576 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:00 crc kubenswrapper[4841]: I0313 09:14:00.271462 4841 scope.go:117] "RemoveContainer" containerID="b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec" Mar 13 09:14:00 crc kubenswrapper[4841]: E0313 09:14:00.271719 4841 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.077107 4841 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 13 09:14:03 crc kubenswrapper[4841]: E0313 09:14:03.092407 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.971815 4841 apiserver.go:52] "Watching apiserver" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.979240 4841 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.979897 4841 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/node-ca-zlzg8","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-ovn-kubernetes/ovnkube-node-j5szf","openshift-machine-config-operator/machine-config-daemon-h227v","openshift-multus/multus-qkpgl","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-multus/multus-additional-cni-plugins-948g6","openshift-multus/network-metrics-daemon-5t7sg","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw","openshift-dns/node-resolver-5lxzk","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-operator/iptables-alerter-4ln5h"] Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.980480 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.980684 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.980744 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.980772 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.980834 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:03 crc kubenswrapper[4841]: E0313 09:14:03.981638 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:03 crc kubenswrapper[4841]: E0313 09:14:03.981677 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.981752 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:03 crc kubenswrapper[4841]: E0313 09:14:03.981873 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.982102 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zlzg8" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.982405 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.982533 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qkpgl" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.982862 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.982919 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:03 crc kubenswrapper[4841]: E0313 09:14:03.983021 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.983438 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.983450 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.983483 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5lxzk" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.988155 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.988164 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.988238 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.988487 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.988589 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.988669 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.993164 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.995665 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.997227 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.997301 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.997530 4841 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.997660 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.997907 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.997945 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.998113 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.998140 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.998329 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.998404 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.998437 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.998612 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.998901 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 13 09:14:03 crc 
kubenswrapper[4841]: I0313 09:14:03.998909 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.999021 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.999176 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.999199 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.999331 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.999427 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.999536 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.999654 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.999668 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 13 09:14:03 crc kubenswrapper[4841]: I0313 09:14:03.999759 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.000202 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 13 09:14:04 crc 
kubenswrapper[4841]: I0313 09:14:04.000558 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.000650 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.000894 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.001555 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.001701 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.035186 4841 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.065089 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.078088 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.091082 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.100714 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.111496 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.124988 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125022 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125046 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125063 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125078 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125095 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125111 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125130 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125149 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125218 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125246 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125281 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125236 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125304 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125320 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125339 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125355 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125369 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125385 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125401 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125416 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 13 09:14:04 crc 
kubenswrapper[4841]: I0313 09:14:04.125431 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125446 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125465 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125481 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125496 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125512 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125528 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125543 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125532 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125559 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125668 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125702 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125718 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125730 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125788 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125816 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125853 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125884 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125947 4841 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125967 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.125992 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.126015 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.126011 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.126007 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.126040 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.126122 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.126134 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.126176 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.126221 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.126564 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.126616 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.126661 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.126716 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.127083 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.127243 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.127955 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.128012 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.128064 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.128121 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.128291 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.128357 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.128533 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.128605 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.128661 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.128704 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.128741 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.128781 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.128818 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.128855 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 
13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.128890 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.128926 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.128964 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.128998 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.129042 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.129092 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.129128 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.129218 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.130406 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.130530 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.126336 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.130591 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.126353 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.126358 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.126374 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.126425 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.126850 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.126954 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.126965 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.127042 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.127097 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.127239 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.127330 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.127358 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.127373 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.127535 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.127716 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.127712 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.127778 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.128709 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.128740 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.128790 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.128974 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.129195 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.129260 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.129716 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.129779 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.130157 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.130293 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.130391 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.130512 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.130556 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.130546 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.130864 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.130645 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.131294 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.131322 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.131345 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.131357 4841 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.131368 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.131466 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.131594 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.131653 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.131717 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.131768 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.131808 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.131863 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.131920 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.131961 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " 
Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.132007 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.132055 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.132112 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.132308 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.132361 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.132399 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.132443 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.132490 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.132542 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.132628 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.132675 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.132721 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.132767 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.132822 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.132868 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.132905 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.132957 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.133008 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.133046 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.133092 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.133212 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.133259 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: 
\"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.133716 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.133780 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.133823 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.133908 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.133956 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.134013 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.134052 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.134096 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.134158 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.134196 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.134241 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 
09:14:04.134330 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.134376 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.134421 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.134481 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.134534 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.134586 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.134649 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.134708 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.134835 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.134935 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.135145 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.135249 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.135355 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.135423 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.135513 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.135597 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.135687 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod 
\"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.135773 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.135837 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.135907 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.135988 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.136067 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.136214 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.136373 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.136587 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.136674 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.136755 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.136818 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.137447 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.137716 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.138858 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.139167 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.139957 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.131598 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" 
(OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.146585 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.131701 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.131996 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.132109 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.132530 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.133125 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.133239 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.134603 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.134668 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.134855 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.134890 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.135235 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.135474 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.135621 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.135704 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.135910 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.136076 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.136100 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.136279 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.136956 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.137375 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.137612 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.137799 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.137813 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.138523 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.138540 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.138601 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.138614 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.138703 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.138686 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.139121 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.139212 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.138407 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.139582 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.139798 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.139934 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.139946 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.140063 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.140100 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.140208 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.140343 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.140552 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.140648 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.140654 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.141570 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.141616 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.141890 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.142043 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.142072 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.142553 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.145600 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.145878 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.145939 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.146362 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.146375 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.146379 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.146427 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.146441 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:14:04.642597654 +0000 UTC m=+127.372497845 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.147147 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.147211 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.147401 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.147408 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.146704 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.147327 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.148376 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.148407 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.148489 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.148528 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.148146 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.148882 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.146542 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.146561 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.146556 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.146573 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.146600 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.146610 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.149586 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.146613 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.146850 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.146106 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.146885 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.149133 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.149672 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.149689 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.149401 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.149947 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.150167 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.150174 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.150231 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.150248 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.150347 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.150629 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.150659 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.150720 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.150886 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). 
InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.150897 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.150910 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.150979 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.151033 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.151074 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.151102 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.151125 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.151184 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.151345 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.151443 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.151769 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.151884 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.152376 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.152478 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.152515 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.152768 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.152861 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.152888 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.153003 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.153001 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.153042 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.153310 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.153370 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.153298 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.153615 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.153750 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.153798 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.153882 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.153948 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.153973 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.153993 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154017 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154035 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154053 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154070 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154085 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154101 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154118 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 09:14:04 crc 
kubenswrapper[4841]: I0313 09:14:04.154136 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154159 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154175 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154192 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154173 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154208 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154322 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154344 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154371 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154411 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154397 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154604 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154672 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154722 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154747 4841 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154864 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-host-var-lib-cni-multus\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154982 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/938e7b48-e79d-485d-abe7-d8cf5beeeb4c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-r2tpw\" (UID: \"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.155232 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5388897d-03d2-4551-a457-515f576d4621-cni-binary-copy\") pod \"multus-additional-cni-plugins-948g6\" (UID: \"5388897d-03d2-4551-a457-515f576d4621\") " pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.155324 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l7xt\" (UniqueName: \"kubernetes.io/projected/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-kube-api-access-5l7xt\") pod \"network-metrics-daemon-5t7sg\" (UID: \"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\") " 
pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.155398 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-cnibin\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.155438 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-os-release\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.155504 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-host-run-k8s-cni-cncf-io\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.155565 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjtjn\" (UniqueName: \"kubernetes.io/projected/938e7b48-e79d-485d-abe7-d8cf5beeeb4c-kube-api-access-jjtjn\") pod \"ovnkube-control-plane-749d76644c-r2tpw\" (UID: \"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.155597 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-run-ovn-kubernetes\") pod \"ovnkube-node-j5szf\" (UID: 
\"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.155748 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-cni-netd\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.155828 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-hostroot\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.155857 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-etc-kubernetes\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154600 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.154706 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.155049 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.155327 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.155337 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.155403 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.155744 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.155995 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct29g\" (UniqueName: \"kubernetes.io/projected/5978189d-b3a2-408c-b09e-c2b3de0a91b0-kube-api-access-ct29g\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.156037 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-etc-openvswitch\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.156057 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" 
(OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.156039 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.156101 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db500a1d-2be8-49c1-9c9e-af7623d16b15-ovnkube-script-lib\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.156137 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c2120c73-4dda-4576-bad1-48858477b17c-host\") pod \"node-ca-zlzg8\" (UID: \"c2120c73-4dda-4576-bad1-48858477b17c\") " pod="openshift-image-registry/node-ca-zlzg8" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.156196 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-systemd-units\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.155901 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.156252 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-multus-conf-dir\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.156313 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.156326 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.156356 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.156421 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.156682 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.156724 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.156747 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.156840 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.156928 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.156978 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/41c5e4ef-f068-4d97-b4c9-b2085cc97422-hosts-file\") pod \"node-resolver-5lxzk\" (UID: \"41c5e4ef-f068-4d97-b4c9-b2085cc97422\") " pod="openshift-dns/node-resolver-5lxzk" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157018 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-system-cni-dir\") pod \"multus-qkpgl\" (UID: 
\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157048 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-kubelet\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157079 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db500a1d-2be8-49c1-9c9e-af7623d16b15-ovn-node-metrics-cert\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157111 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e49b836b-f6cf-4cee-b1be-6bd7864fb7f2-mcd-auth-proxy-config\") pod \"machine-config-daemon-h227v\" (UID: \"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\") " pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157143 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-host-var-lib-cni-bin\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157198 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157221 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157244 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5388897d-03d2-4551-a457-515f576d4621-system-cni-dir\") pod \"multus-additional-cni-plugins-948g6\" (UID: \"5388897d-03d2-4551-a457-515f576d4621\") " pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157370 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs\") pod \"network-metrics-daemon-5t7sg\" (UID: \"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\") " pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157436 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-run-openvswitch\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157489 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-run-ovn\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157539 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-node-log\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157606 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157645 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5978189d-b3a2-408c-b09e-c2b3de0a91b0-multus-daemon-config\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157682 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-run-systemd\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157699 
4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157727 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157754 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e49b836b-f6cf-4cee-b1be-6bd7864fb7f2-proxy-tls\") pod \"machine-config-daemon-h227v\" (UID: \"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\") " pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157798 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5388897d-03d2-4551-a457-515f576d4621-tuning-conf-dir\") pod \"multus-additional-cni-plugins-948g6\" (UID: \"5388897d-03d2-4551-a457-515f576d4621\") " pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157823 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5388897d-03d2-4551-a457-515f576d4621-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-948g6\" (UID: \"5388897d-03d2-4551-a457-515f576d4621\") " pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157848 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-multus-cni-dir\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157872 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/938e7b48-e79d-485d-abe7-d8cf5beeeb4c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-r2tpw\" (UID: \"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157895 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-log-socket\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157940 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-cni-bin\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157965 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.157988 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e49b836b-f6cf-4cee-b1be-6bd7864fb7f2-rootfs\") pod \"machine-config-daemon-h227v\" (UID: \"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\") " pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.158012 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-multus-socket-dir-parent\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.158032 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db500a1d-2be8-49c1-9c9e-af7623d16b15-env-overrides\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.158113 4841 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.158182 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-13 09:14:04.658163278 +0000 UTC m=+127.388063469 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.158339 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.158372 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmc42\" (UniqueName: \"kubernetes.io/projected/e49b836b-f6cf-4cee-b1be-6bd7864fb7f2-kube-api-access-lmc42\") pod \"machine-config-daemon-h227v\" (UID: \"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\") " pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.158393 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db500a1d-2be8-49c1-9c9e-af7623d16b15-ovnkube-config\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.158412 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-host-var-lib-kubelet\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.158441 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kj6b\" (UniqueName: \"kubernetes.io/projected/c2120c73-4dda-4576-bad1-48858477b17c-kube-api-access-2kj6b\") pod \"node-ca-zlzg8\" (UID: \"c2120c73-4dda-4576-bad1-48858477b17c\") " pod="openshift-image-registry/node-ca-zlzg8" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.158459 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-slash\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.158477 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvg6k\" (UniqueName: \"kubernetes.io/projected/db500a1d-2be8-49c1-9c9e-af7623d16b15-kube-api-access-qvg6k\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.158511 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.158563 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: 
\"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.158657 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mghth\" (UniqueName: \"kubernetes.io/projected/41c5e4ef-f068-4d97-b4c9-b2085cc97422-kube-api-access-mghth\") pod \"node-resolver-5lxzk\" (UID: \"41c5e4ef-f068-4d97-b4c9-b2085cc97422\") " pod="openshift-dns/node-resolver-5lxzk" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.158761 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.158782 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppb67\" (UniqueName: \"kubernetes.io/projected/5388897d-03d2-4551-a457-515f576d4621-kube-api-access-ppb67\") pod \"multus-additional-cni-plugins-948g6\" (UID: \"5388897d-03d2-4551-a457-515f576d4621\") " pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.158850 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-host-run-multus-certs\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.158882 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/938e7b48-e79d-485d-abe7-d8cf5beeeb4c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-r2tpw\" (UID: \"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.158918 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-run-netns\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.158948 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-var-lib-openvswitch\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.158945 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159021 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159058 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5388897d-03d2-4551-a457-515f576d4621-cnibin\") pod \"multus-additional-cni-plugins-948g6\" (UID: \"5388897d-03d2-4551-a457-515f576d4621\") " pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159130 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5388897d-03d2-4551-a457-515f576d4621-os-release\") pod \"multus-additional-cni-plugins-948g6\" (UID: \"5388897d-03d2-4551-a457-515f576d4621\") " pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159153 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-host-run-netns\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159181 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c2120c73-4dda-4576-bad1-48858477b17c-serviceca\") pod \"node-ca-zlzg8\" (UID: \"c2120c73-4dda-4576-bad1-48858477b17c\") " pod="openshift-image-registry/node-ca-zlzg8" Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.159204 4841 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159221 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159234 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.159284 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:04.659248992 +0000 UTC m=+127.389149193 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159320 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5978189d-b3a2-408c-b09e-c2b3de0a91b0-cni-binary-copy\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159347 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159436 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159577 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159592 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159603 4841 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159614 4841 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159623 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159633 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159642 4841 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 
09:14:04.159658 4841 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159667 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159677 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159686 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159704 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159713 4841 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159723 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159731 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159740 4841 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159749 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159758 4841 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159766 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159776 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159784 4841 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.159795 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" 
Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160513 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160533 4841 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160594 4841 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160606 4841 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160653 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160666 4841 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160677 4841 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160699 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160707 4841 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160717 4841 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160724 4841 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160733 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160741 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160749 4841 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 13 
09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160758 4841 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160767 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160775 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160784 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160798 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160810 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160821 4841 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160835 
4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160859 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160869 4841 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160878 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160886 4841 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160895 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160904 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160912 4841 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160921 4841 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160929 4841 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160938 4841 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160947 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160956 4841 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160970 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160978 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: 
\"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160988 4841 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.160996 4841 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.161004 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.161013 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.161021 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.161029 4841 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.161038 4841 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node 
\"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.161047 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.161081 4841 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.167437 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.169369 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.169728 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.169786 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.169806 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.169896 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.169916 4841 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.169924 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.170349 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.170351 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.170382 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.170414 4841 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.170154 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.170050 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:04.669962286 +0000 UTC m=+127.399862557 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.170192 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.171769 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.171787 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.171800 4841 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.171848 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:04.671832415 +0000 UTC m=+127.401732606 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172082 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172506 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172609 4841 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172626 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172636 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172645 4841 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172654 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172663 4841 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172673 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc 
kubenswrapper[4841]: I0313 09:14:04.172683 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172692 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172702 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172711 4841 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172723 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172732 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172752 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172761 4841 reconciler_common.go:293] "Volume detached 
for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172747 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172770 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172835 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172859 4841 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172879 4841 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172898 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172920 4841 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172940 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.172985 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.173065 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.173005 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.173118 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.173138 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.173158 4841 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.173206 4841 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.173228 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.173246 4841 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 
09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.173283 4841 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.173302 4841 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.173320 4841 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.173327 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.173338 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.173357 4841 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.173374 4841 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.173391 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.173410 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.173430 4841 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.173448 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node 
\"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.173617 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.173836 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.173988 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174014 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174032 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174051 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: 
I0313 09:14:04.174062 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174070 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174129 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174163 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174188 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174198 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174221 4841 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") 
on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174258 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174313 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174325 4841 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174333 4841 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174341 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174350 4841 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174358 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174366 4841 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174374 4841 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174382 4841 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174391 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174399 4841 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174407 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174419 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174431 4841 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" 
DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174444 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174461 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174472 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174482 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174491 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174500 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174509 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 
crc kubenswrapper[4841]: I0313 09:14:04.174518 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174580 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174589 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174598 4841 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174620 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174629 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174637 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174647 4841 reconciler_common.go:293] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174658 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174666 4841 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174675 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174683 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174692 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174702 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174710 4841 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174719 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174728 4841 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174737 4841 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174745 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174754 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174763 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174771 4841 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174781 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174789 4841 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174797 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174880 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174970 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.174978 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.175052 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.175125 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.176329 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.176493 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.176527 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.176750 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.178441 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.179469 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.180301 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.180529 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.181046 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.181621 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.181863 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.192872 4841 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.193675 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.197794 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.200102 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.202623 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.220706 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.230433 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.239413 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.249469 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.263197 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.273694 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.276116 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-host-var-lib-cni-multus\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.276174 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/938e7b48-e79d-485d-abe7-d8cf5beeeb4c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-r2tpw\" (UID: \"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.276212 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5388897d-03d2-4551-a457-515f576d4621-cni-binary-copy\") pod \"multus-additional-cni-plugins-948g6\" (UID: \"5388897d-03d2-4551-a457-515f576d4621\") " pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.276245 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l7xt\" (UniqueName: \"kubernetes.io/projected/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-kube-api-access-5l7xt\") pod \"network-metrics-daemon-5t7sg\" (UID: \"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\") " pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.276304 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-cnibin\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.276336 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-os-release\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.276359 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-host-var-lib-cni-multus\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.276365 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-host-run-k8s-cni-cncf-io\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.276483 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjtjn\" (UniqueName: \"kubernetes.io/projected/938e7b48-e79d-485d-abe7-d8cf5beeeb4c-kube-api-access-jjtjn\") pod \"ovnkube-control-plane-749d76644c-r2tpw\" (UID: \"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.276576 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-run-ovn-kubernetes\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.276610 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-cni-netd\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.276683 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-hostroot\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.276410 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-host-run-k8s-cni-cncf-io\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.276744 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-etc-kubernetes\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.276805 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct29g\" (UniqueName: \"kubernetes.io/projected/5978189d-b3a2-408c-b09e-c2b3de0a91b0-kube-api-access-ct29g\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.276843 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-etc-kubernetes\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.276854 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-etc-openvswitch\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.276845 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-cni-netd\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.276875 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-hostroot\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.276914 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db500a1d-2be8-49c1-9c9e-af7623d16b15-ovnkube-script-lib\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277041 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-etc-openvswitch\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277061 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c2120c73-4dda-4576-bad1-48858477b17c-host\") pod \"node-ca-zlzg8\" (UID: \"c2120c73-4dda-4576-bad1-48858477b17c\") " pod="openshift-image-registry/node-ca-zlzg8" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277104 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c2120c73-4dda-4576-bad1-48858477b17c-host\") pod \"node-ca-zlzg8\" (UID: \"c2120c73-4dda-4576-bad1-48858477b17c\") " pod="openshift-image-registry/node-ca-zlzg8" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277069 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-run-ovn-kubernetes\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277142 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-systemd-units\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277186 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-systemd-units\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277193 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-multus-conf-dir\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277255 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-multus-conf-dir\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277226 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-os-release\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277313 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/41c5e4ef-f068-4d97-b4c9-b2085cc97422-hosts-file\") pod \"node-resolver-5lxzk\" (UID: \"41c5e4ef-f068-4d97-b4c9-b2085cc97422\") " pod="openshift-dns/node-resolver-5lxzk" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277330 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/938e7b48-e79d-485d-abe7-d8cf5beeeb4c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-r2tpw\" (UID: \"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277351 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-system-cni-dir\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277384 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-kubelet\") pod \"ovnkube-node-j5szf\" (UID: 
\"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277416 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db500a1d-2be8-49c1-9c9e-af7623d16b15-ovn-node-metrics-cert\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277426 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-system-cni-dir\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277452 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e49b836b-f6cf-4cee-b1be-6bd7864fb7f2-mcd-auth-proxy-config\") pod \"machine-config-daemon-h227v\" (UID: \"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\") " pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277477 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-kubelet\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277526 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-host-var-lib-cni-bin\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " 
pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277540 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/41c5e4ef-f068-4d97-b4c9-b2085cc97422-hosts-file\") pod \"node-resolver-5lxzk\" (UID: \"41c5e4ef-f068-4d97-b4c9-b2085cc97422\") " pod="openshift-dns/node-resolver-5lxzk" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277563 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5388897d-03d2-4551-a457-515f576d4621-system-cni-dir\") pod \"multus-additional-cni-plugins-948g6\" (UID: \"5388897d-03d2-4551-a457-515f576d4621\") " pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277602 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs\") pod \"network-metrics-daemon-5t7sg\" (UID: \"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\") " pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277610 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-host-var-lib-cni-bin\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277609 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5388897d-03d2-4551-a457-515f576d4621-system-cni-dir\") pod \"multus-additional-cni-plugins-948g6\" (UID: \"5388897d-03d2-4551-a457-515f576d4621\") " pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:04 crc 
kubenswrapper[4841]: I0313 09:14:04.277562 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5388897d-03d2-4551-a457-515f576d4621-cni-binary-copy\") pod \"multus-additional-cni-plugins-948g6\" (UID: \"5388897d-03d2-4551-a457-515f576d4621\") " pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277639 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-run-openvswitch\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277671 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-cnibin\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277680 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-run-openvswitch\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277698 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-run-ovn\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.277727 4841 secret.go:188] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277734 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-run-ovn\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277767 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-node-log\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.277789 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs podName:ea07a392-2a1f-4bae-bb67-db7cd421c1e1 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:04.777768416 +0000 UTC m=+127.507668647 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs") pod "network-metrics-daemon-5t7sg" (UID: "ea07a392-2a1f-4bae-bb67-db7cd421c1e1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277732 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-node-log\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277827 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277839 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db500a1d-2be8-49c1-9c9e-af7623d16b15-ovnkube-script-lib\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277863 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5978189d-b3a2-408c-b09e-c2b3de0a91b0-multus-daemon-config\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277894 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277897 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-run-systemd\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277933 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-run-systemd\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277955 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.277999 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e49b836b-f6cf-4cee-b1be-6bd7864fb7f2-proxy-tls\") pod \"machine-config-daemon-h227v\" (UID: \"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\") " pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278025 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5388897d-03d2-4551-a457-515f576d4621-tuning-conf-dir\") pod \"multus-additional-cni-plugins-948g6\" (UID: \"5388897d-03d2-4551-a457-515f576d4621\") " pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278047 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5388897d-03d2-4551-a457-515f576d4621-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-948g6\" (UID: \"5388897d-03d2-4551-a457-515f576d4621\") " pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278073 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278076 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-multus-cni-dir\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278117 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/938e7b48-e79d-485d-abe7-d8cf5beeeb4c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-r2tpw\" (UID: \"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278147 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-log-socket\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278154 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e49b836b-f6cf-4cee-b1be-6bd7864fb7f2-mcd-auth-proxy-config\") pod \"machine-config-daemon-h227v\" (UID: \"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\") " pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278169 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-cni-bin\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278192 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e49b836b-f6cf-4cee-b1be-6bd7864fb7f2-rootfs\") pod \"machine-config-daemon-h227v\" (UID: \"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\") " pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278214 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-multus-socket-dir-parent\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278240 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db500a1d-2be8-49c1-9c9e-af7623d16b15-env-overrides\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278260 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278302 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmc42\" (UniqueName: \"kubernetes.io/projected/e49b836b-f6cf-4cee-b1be-6bd7864fb7f2-kube-api-access-lmc42\") pod \"machine-config-daemon-h227v\" (UID: \"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\") " pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278305 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-multus-cni-dir\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278324 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db500a1d-2be8-49c1-9c9e-af7623d16b15-ovnkube-config\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278346 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-host-var-lib-kubelet\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278376 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kj6b\" (UniqueName: \"kubernetes.io/projected/c2120c73-4dda-4576-bad1-48858477b17c-kube-api-access-2kj6b\") pod \"node-ca-zlzg8\" (UID: \"c2120c73-4dda-4576-bad1-48858477b17c\") " pod="openshift-image-registry/node-ca-zlzg8" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278387 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278396 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-slash\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278409 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-multus-socket-dir-parent\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278430 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-slash\") pod 
\"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278434 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvg6k\" (UniqueName: \"kubernetes.io/projected/db500a1d-2be8-49c1-9c9e-af7623d16b15-kube-api-access-qvg6k\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278468 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-log-socket\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278517 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-cni-bin\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278522 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mghth\" (UniqueName: \"kubernetes.io/projected/41c5e4ef-f068-4d97-b4c9-b2085cc97422-kube-api-access-mghth\") pod \"node-resolver-5lxzk\" (UID: \"41c5e4ef-f068-4d97-b4c9-b2085cc97422\") " pod="openshift-dns/node-resolver-5lxzk" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278572 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppb67\" (UniqueName: \"kubernetes.io/projected/5388897d-03d2-4551-a457-515f576d4621-kube-api-access-ppb67\") pod \"multus-additional-cni-plugins-948g6\" (UID: 
\"5388897d-03d2-4551-a457-515f576d4621\") " pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278607 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-host-run-multus-certs\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278613 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-host-var-lib-kubelet\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278640 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/938e7b48-e79d-485d-abe7-d8cf5beeeb4c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-r2tpw\" (UID: \"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278671 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-run-netns\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278700 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-var-lib-openvswitch\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278746 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5388897d-03d2-4551-a457-515f576d4621-cnibin\") pod \"multus-additional-cni-plugins-948g6\" (UID: \"5388897d-03d2-4551-a457-515f576d4621\") " pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278774 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5388897d-03d2-4551-a457-515f576d4621-os-release\") pod \"multus-additional-cni-plugins-948g6\" (UID: \"5388897d-03d2-4551-a457-515f576d4621\") " pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278802 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-host-run-netns\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278833 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c2120c73-4dda-4576-bad1-48858477b17c-serviceca\") pod \"node-ca-zlzg8\" (UID: \"c2120c73-4dda-4576-bad1-48858477b17c\") " pod="openshift-image-registry/node-ca-zlzg8" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278862 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5978189d-b3a2-408c-b09e-c2b3de0a91b0-cni-binary-copy\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 
09:14:04.278931 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.278945 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db500a1d-2be8-49c1-9c9e-af7623d16b15-env-overrides\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279099 4841 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279115 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5388897d-03d2-4551-a457-515f576d4621-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-948g6\" (UID: \"5388897d-03d2-4551-a457-515f576d4621\") " pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279175 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-var-lib-openvswitch\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279189 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5388897d-03d2-4551-a457-515f576d4621-cnibin\") pod \"multus-additional-cni-plugins-948g6\" (UID: \"5388897d-03d2-4551-a457-515f576d4621\") " 
pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279204 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-run-netns\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279235 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-host-run-netns\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279240 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e49b836b-f6cf-4cee-b1be-6bd7864fb7f2-rootfs\") pod \"machine-config-daemon-h227v\" (UID: \"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\") " pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279253 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5388897d-03d2-4551-a457-515f576d4621-os-release\") pod \"multus-additional-cni-plugins-948g6\" (UID: \"5388897d-03d2-4551-a457-515f576d4621\") " pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279236 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5978189d-b3a2-408c-b09e-c2b3de0a91b0-host-run-multus-certs\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 
09:14:04.279211 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db500a1d-2be8-49c1-9c9e-af7623d16b15-ovnkube-config\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279212 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/938e7b48-e79d-485d-abe7-d8cf5beeeb4c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-r2tpw\" (UID: \"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279329 4841 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279357 4841 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279378 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279397 4841 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279416 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: 
\"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279435 4841 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279453 4841 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279474 4841 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279492 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279515 4841 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279527 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5978189d-b3a2-408c-b09e-c2b3de0a91b0-cni-binary-copy\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279535 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279572 4841 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279595 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279628 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279645 4841 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279662 4841 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279679 4841 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279702 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279720 4841 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279742 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279760 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279778 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279800 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279817 4841 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279821 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/5978189d-b3a2-408c-b09e-c2b3de0a91b0-multus-daemon-config\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279834 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279853 4841 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.279870 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.280924 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.281015 4841 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.280974 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db500a1d-2be8-49c1-9c9e-af7623d16b15-ovn-node-metrics-cert\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.281121 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5388897d-03d2-4551-a457-515f576d4621-tuning-conf-dir\") pod \"multus-additional-cni-plugins-948g6\" (UID: \"5388897d-03d2-4551-a457-515f576d4621\") " pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.281121 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c2120c73-4dda-4576-bad1-48858477b17c-serviceca\") pod \"node-ca-zlzg8\" (UID: \"c2120c73-4dda-4576-bad1-48858477b17c\") " pod="openshift-image-registry/node-ca-zlzg8" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.283525 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e49b836b-f6cf-4cee-b1be-6bd7864fb7f2-proxy-tls\") pod \"machine-config-daemon-h227v\" (UID: \"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\") " pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.284887 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/938e7b48-e79d-485d-abe7-d8cf5beeeb4c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-r2tpw\" (UID: \"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.294811 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjtjn\" (UniqueName: \"kubernetes.io/projected/938e7b48-e79d-485d-abe7-d8cf5beeeb4c-kube-api-access-jjtjn\") pod \"ovnkube-control-plane-749d76644c-r2tpw\" (UID: \"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" Mar 13 09:14:04 crc kubenswrapper[4841]: 
I0313 09:14:04.296766 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mghth\" (UniqueName: \"kubernetes.io/projected/41c5e4ef-f068-4d97-b4c9-b2085cc97422-kube-api-access-mghth\") pod \"node-resolver-5lxzk\" (UID: \"41c5e4ef-f068-4d97-b4c9-b2085cc97422\") " pod="openshift-dns/node-resolver-5lxzk" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.300444 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvg6k\" (UniqueName: \"kubernetes.io/projected/db500a1d-2be8-49c1-9c9e-af7623d16b15-kube-api-access-qvg6k\") pod \"ovnkube-node-j5szf\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") " pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.301255 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppb67\" (UniqueName: \"kubernetes.io/projected/5388897d-03d2-4551-a457-515f576d4621-kube-api-access-ppb67\") pod \"multus-additional-cni-plugins-948g6\" (UID: \"5388897d-03d2-4551-a457-515f576d4621\") " pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.302210 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.304488 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l7xt\" (UniqueName: \"kubernetes.io/projected/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-kube-api-access-5l7xt\") pod \"network-metrics-daemon-5t7sg\" (UID: \"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\") " pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.304963 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmc42\" (UniqueName: \"kubernetes.io/projected/e49b836b-f6cf-4cee-b1be-6bd7864fb7f2-kube-api-access-lmc42\") pod \"machine-config-daemon-h227v\" (UID: \"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\") " pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.306184 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct29g\" (UniqueName: \"kubernetes.io/projected/5978189d-b3a2-408c-b09e-c2b3de0a91b0-kube-api-access-ct29g\") pod \"multus-qkpgl\" (UID: \"5978189d-b3a2-408c-b09e-c2b3de0a91b0\") " pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.306867 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kj6b\" (UniqueName: \"kubernetes.io/projected/c2120c73-4dda-4576-bad1-48858477b17c-kube-api-access-2kj6b\") pod \"node-ca-zlzg8\" (UID: \"c2120c73-4dda-4576-bad1-48858477b17c\") " pod="openshift-image-registry/node-ca-zlzg8" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.320232 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.332742 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 09:14:04 crc kubenswrapper[4841]: W0313 09:14:04.341012 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-09948003a8540902c962e8ed6fd8490e1eec8ad999c138113f263ecb877b8977 WatchSource:0}: Error finding container 09948003a8540902c962e8ed6fd8490e1eec8ad999c138113f263ecb877b8977: Status 404 returned error can't find the container with id 09948003a8540902c962e8ed6fd8490e1eec8ad999c138113f263ecb877b8977 Mar 13 09:14:04 crc kubenswrapper[4841]: W0313 09:14:04.349924 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-c3b8855b48980a5034378c02bc18c2ecb6c7ae25047c25a99b39fd5eed0dd1b7 WatchSource:0}: Error finding container c3b8855b48980a5034378c02bc18c2ecb6c7ae25047c25a99b39fd5eed0dd1b7: Status 404 returned error can't find the container with id c3b8855b48980a5034378c02bc18c2ecb6c7ae25047c25a99b39fd5eed0dd1b7 Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.370439 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zlzg8" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.375623 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-948g6" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.382291 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qkpgl" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.388252 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" Mar 13 09:14:04 crc kubenswrapper[4841]: W0313 09:14:04.392287 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5388897d_03d2_4551_a457_515f576d4621.slice/crio-80eecebb7116e92141717857e32871fb854c5c080cd26f76774a06d1730999e8 WatchSource:0}: Error finding container 80eecebb7116e92141717857e32871fb854c5c080cd26f76774a06d1730999e8: Status 404 returned error can't find the container with id 80eecebb7116e92141717857e32871fb854c5c080cd26f76774a06d1730999e8 Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.397380 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:04 crc kubenswrapper[4841]: W0313 09:14:04.400677 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5978189d_b3a2_408c_b09e_c2b3de0a91b0.slice/crio-c27fbede8b47e872398ce586c35cb62f67398f4f10dd8991e61a863cd7de1a59 WatchSource:0}: Error finding container c27fbede8b47e872398ce586c35cb62f67398f4f10dd8991e61a863cd7de1a59: Status 404 returned error can't find the container with id c27fbede8b47e872398ce586c35cb62f67398f4f10dd8991e61a863cd7de1a59 Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.406225 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.411152 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5lxzk" Mar 13 09:14:04 crc kubenswrapper[4841]: W0313 09:14:04.411861 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod938e7b48_e79d_485d_abe7_d8cf5beeeb4c.slice/crio-e526ac0e1d00905cd831028d2f90a7b6c39ba3312d67669826dc3afa40a7372f WatchSource:0}: Error finding container e526ac0e1d00905cd831028d2f90a7b6c39ba3312d67669826dc3afa40a7372f: Status 404 returned error can't find the container with id e526ac0e1d00905cd831028d2f90a7b6c39ba3312d67669826dc3afa40a7372f Mar 13 09:14:04 crc kubenswrapper[4841]: W0313 09:14:04.425713 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb500a1d_2be8_49c1_9c9e_af7623d16b15.slice/crio-745bee14f619dd22807f1c648eeb21f404e12b046757a303a2a093bec2b8def5 WatchSource:0}: Error finding container 745bee14f619dd22807f1c648eeb21f404e12b046757a303a2a093bec2b8def5: Status 404 returned error can't find the container with id 745bee14f619dd22807f1c648eeb21f404e12b046757a303a2a093bec2b8def5 Mar 13 09:14:04 crc kubenswrapper[4841]: W0313 09:14:04.463880 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41c5e4ef_f068_4d97_b4c9_b2085cc97422.slice/crio-e5561186207793336c1a415d4fe5a54924c0ce316488d34fa7bfe13ba24797ce WatchSource:0}: Error finding container e5561186207793336c1a415d4fe5a54924c0ce316488d34fa7bfe13ba24797ce: Status 404 returned error can't find the container with id e5561186207793336c1a415d4fe5a54924c0ce316488d34fa7bfe13ba24797ce Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.686644 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.686723 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.686748 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.686781 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.686805 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.686873 4841 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: 
object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.686927 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:05.686912115 +0000 UTC m=+128.416812306 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.687002 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:14:05.686989227 +0000 UTC m=+128.416889418 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.687074 4841 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.687114 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:05.68709585 +0000 UTC m=+128.416996041 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.687133 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.687160 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.687172 4841 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.687210 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.687229 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.687243 4841 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.687216 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:05.687200913 +0000 UTC m=+128.417101104 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.687305 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:05.687295116 +0000 UTC m=+128.417195427 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:04 crc kubenswrapper[4841]: I0313 09:14:04.787214 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs\") pod \"network-metrics-daemon-5t7sg\" (UID: \"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\") " pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.787560 4841 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 09:14:04 crc kubenswrapper[4841]: E0313 09:14:04.787802 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs podName:ea07a392-2a1f-4bae-bb67-db7cd421c1e1 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:05.787785848 +0000 UTC m=+128.517686049 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs") pod "network-metrics-daemon-5t7sg" (UID: "ea07a392-2a1f-4bae-bb67-db7cd421c1e1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.049731 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerStarted","Data":"f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd"} Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.049846 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerStarted","Data":"9a1f9151f552176e4557fbe7ae4dd3ddf3334120c1c0e6086773ec840d42b653"} Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.049876 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerStarted","Data":"fb397eb3f4f8412b41a2a4bf5e34c2da5c7a054f3d7602f99eef6d5dea1e2808"} Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.052625 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f"} Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.052668 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b"} Mar 13 09:14:05 
crc kubenswrapper[4841]: I0313 09:14:05.052683 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c3b8855b48980a5034378c02bc18c2ecb6c7ae25047c25a99b39fd5eed0dd1b7"} Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.055235 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49"} Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.055460 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8a407d7dca09d3515e19fbbc534aaf486a8800d0d69bae31285f5811c4a9b120"} Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.056842 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zlzg8" event={"ID":"c2120c73-4dda-4576-bad1-48858477b17c","Type":"ContainerStarted","Data":"2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b"} Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.056977 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zlzg8" event={"ID":"c2120c73-4dda-4576-bad1-48858477b17c","Type":"ContainerStarted","Data":"304504fcd0bcb38f4fa93664a6e9f29a01009d33d75a5a8ea1a326fa2429d937"} Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.058747 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"09948003a8540902c962e8ed6fd8490e1eec8ad999c138113f263ecb877b8977"} Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 
09:14:05.060151 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5lxzk" event={"ID":"41c5e4ef-f068-4d97-b4c9-b2085cc97422","Type":"ContainerStarted","Data":"7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913"} Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.060216 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5lxzk" event={"ID":"41c5e4ef-f068-4d97-b4c9-b2085cc97422","Type":"ContainerStarted","Data":"e5561186207793336c1a415d4fe5a54924c0ce316488d34fa7bfe13ba24797ce"} Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.061547 4841 generic.go:334] "Generic (PLEG): container finished" podID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerID="8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc" exitCode=0 Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.061611 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerDied","Data":"8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc"} Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.061634 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerStarted","Data":"745bee14f619dd22807f1c648eeb21f404e12b046757a303a2a093bec2b8def5"} Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.063514 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" event={"ID":"5388897d-03d2-4551-a457-515f576d4621","Type":"ContainerStarted","Data":"d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062"} Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.063571 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" 
event={"ID":"5388897d-03d2-4551-a457-515f576d4621","Type":"ContainerStarted","Data":"80eecebb7116e92141717857e32871fb854c5c080cd26f76774a06d1730999e8"} Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.065215 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qkpgl" event={"ID":"5978189d-b3a2-408c-b09e-c2b3de0a91b0","Type":"ContainerStarted","Data":"efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7"} Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.065248 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qkpgl" event={"ID":"5978189d-b3a2-408c-b09e-c2b3de0a91b0","Type":"ContainerStarted","Data":"c27fbede8b47e872398ce586c35cb62f67398f4f10dd8991e61a863cd7de1a59"} Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.069906 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" event={"ID":"938e7b48-e79d-485d-abe7-d8cf5beeeb4c","Type":"ContainerStarted","Data":"3e5c0affead8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f"} Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.069956 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" event={"ID":"938e7b48-e79d-485d-abe7-d8cf5beeeb4c","Type":"ContainerStarted","Data":"a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615"} Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.069977 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" event={"ID":"938e7b48-e79d-485d-abe7-d8cf5beeeb4c","Type":"ContainerStarted","Data":"e526ac0e1d00905cd831028d2f90a7b6c39ba3312d67669826dc3afa40a7372f"} Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.073155 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf
3334120c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.086305 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.099281 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.115061 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.126659 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.139520 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.152228 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.164338 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.175720 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.185131 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.204824 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.233144 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.252924 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.262569 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.307708 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.315441 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.327491 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"na
me\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:05Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.339386 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:05Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.352786 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:05Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.366402 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:05Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.378333 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affe
ad8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:05Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.390094 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:05Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.409458 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:05Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.423351 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:05Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.444740 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:05Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.460695 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:05Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.474206 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:05Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.488642 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:05Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.710975 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.711099 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:05 crc kubenswrapper[4841]: E0313 09:14:05.711167 4841 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 09:14:05 crc kubenswrapper[4841]: E0313 09:14:05.711207 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:14:07.711173711 +0000 UTC m=+130.441073912 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:14:05 crc kubenswrapper[4841]: E0313 09:14:05.711332 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:07.711321506 +0000 UTC m=+130.441221767 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.711362 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.711400 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.711461 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:05 crc kubenswrapper[4841]: E0313 09:14:05.711566 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 09:14:05 crc kubenswrapper[4841]: E0313 09:14:05.711581 4841 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 09:14:05 crc kubenswrapper[4841]: E0313 09:14:05.711591 4841 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:05 crc kubenswrapper[4841]: E0313 09:14:05.711612 4841 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 09:14:05 crc kubenswrapper[4841]: E0313 09:14:05.711621 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:07.711609365 +0000 UTC m=+130.441509556 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:05 crc kubenswrapper[4841]: E0313 09:14:05.711652 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:07.711643966 +0000 UTC m=+130.441544257 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 09:14:05 crc kubenswrapper[4841]: E0313 09:14:05.711675 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 09:14:05 crc kubenswrapper[4841]: E0313 09:14:05.711684 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 09:14:05 crc kubenswrapper[4841]: E0313 09:14:05.711690 4841 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:05 crc kubenswrapper[4841]: E0313 09:14:05.711713 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:07.711704587 +0000 UTC m=+130.441604778 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.813003 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs\") pod \"network-metrics-daemon-5t7sg\" (UID: \"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\") " pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:05 crc kubenswrapper[4841]: E0313 09:14:05.813118 4841 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 09:14:05 crc kubenswrapper[4841]: E0313 09:14:05.813166 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs podName:ea07a392-2a1f-4bae-bb67-db7cd421c1e1 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:07.813153918 +0000 UTC m=+130.543054109 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs") pod "network-metrics-daemon-5t7sg" (UID: "ea07a392-2a1f-4bae-bb67-db7cd421c1e1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.996548 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.998596 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:05 crc kubenswrapper[4841]: E0313 09:14:05.998762 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:05 crc kubenswrapper[4841]: E0313 09:14:05.999008 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.999192 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:05 crc kubenswrapper[4841]: E0313 09:14:05.999316 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:05 crc kubenswrapper[4841]: I0313 09:14:05.999206 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:05 crc kubenswrapper[4841]: E0313 09:14:05.999428 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.002242 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.003464 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.005732 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.007080 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.010581 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.011818 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.013135 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.015174 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.016538 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.018440 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.019463 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.021822 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.022932 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.024050 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.026653 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.027739 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.029637 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.030543 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.031725 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.033830 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.034578 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.035782 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.036373 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.037716 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.038311 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.039141 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.040670 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.041287 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.042579 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.043309 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.044446 4841 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.044580 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.047174 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.048391 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.048996 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.051025 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.051960 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.053152 4841 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.054046 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.055464 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.056100 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.057361 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.058200 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.059596 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.060189 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.061358 4841 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.062077 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.063484 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.064132 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.065226 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.065865 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.067053 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.067857 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.068483 4841 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.076726 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerStarted","Data":"520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6"} Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.076881 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerStarted","Data":"eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2"} Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.076902 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerStarted","Data":"5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14"} Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.076910 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerStarted","Data":"7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f"} Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.078121 4841 generic.go:334] "Generic (PLEG): container finished" podID="5388897d-03d2-4551-a457-515f576d4621" containerID="d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062" exitCode=0 Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.078216 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" 
event={"ID":"5388897d-03d2-4551-a457-515f576d4621","Type":"ContainerDied","Data":"d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062"} Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.081737 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.081766 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.081774 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.081787 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.081796 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:06Z","lastTransitionTime":"2026-03-13T09:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.093776 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:06Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:06 crc kubenswrapper[4841]: E0313 09:14:06.101624 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:06Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.107028 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:06Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:06 crc 
kubenswrapper[4841]: I0313 09:14:06.109402 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.110596 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.110630 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.110654 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.110674 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:06Z","lastTransitionTime":"2026-03-13T09:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:06 crc kubenswrapper[4841]: E0313 09:14:06.127638 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:06Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.138205 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:06Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.139448 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.139482 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.139495 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.139513 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.139525 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:06Z","lastTransitionTime":"2026-03-13T09:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.150811 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:06Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:06 crc kubenswrapper[4841]: E0313 09:14:06.152447 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:06Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.156170 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.156218 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.156231 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.156246 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.156257 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:06Z","lastTransitionTime":"2026-03-13T09:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.164582 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:06Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:06 crc kubenswrapper[4841]: E0313 09:14:06.167802 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:06Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.170467 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.170488 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.170495 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.170507 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.170516 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:06Z","lastTransitionTime":"2026-03-13T09:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.177432 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:06Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:06 crc kubenswrapper[4841]: E0313 09:14:06.184959 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:06Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:06 crc kubenswrapper[4841]: E0313 09:14:06.185185 4841 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.190723 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:06Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.201393 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:06Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.214979 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:06Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.227384 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affead8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:06Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.238745 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:06Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.254281 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:06Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.268576 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:06Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:06 crc kubenswrapper[4841]: I0313 09:14:06.282105 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:06Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.089833 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerStarted","Data":"5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c"} Mar 13 09:14:07 
crc kubenswrapper[4841]: I0313 09:14:07.090352 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerStarted","Data":"96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82"} Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.092978 4841 generic.go:334] "Generic (PLEG): container finished" podID="5388897d-03d2-4551-a457-515f576d4621" containerID="5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81" exitCode=0 Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.093068 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" event={"ID":"5388897d-03d2-4551-a457-515f576d4621","Type":"ContainerDied","Data":"5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81"} Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.106475 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:07Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.126863 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:07Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.143520 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:07Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.157237 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:07Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.169445 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:07Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.182989 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affe
ad8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:07Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.195676 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:07Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.206851 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:07Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:07 crc 
kubenswrapper[4841]: I0313 09:14:07.228743 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:07Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.240571 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:07Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.251051 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:07Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.261205 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:07Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.272567 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:07Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.281794 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:07Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.731705 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.731890 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.731948 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.731991 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.732053 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:07 crc kubenswrapper[4841]: E0313 09:14:07.732198 4841 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 09:14:07 crc kubenswrapper[4841]: E0313 09:14:07.732332 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:11.732249378 +0000 UTC m=+134.462149599 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 09:14:07 crc kubenswrapper[4841]: E0313 09:14:07.732628 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 09:14:11.732586999 +0000 UTC m=+134.462487230 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:14:07 crc kubenswrapper[4841]: E0313 09:14:07.732738 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 09:14:07 crc kubenswrapper[4841]: E0313 09:14:07.732786 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 09:14:07 crc kubenswrapper[4841]: E0313 09:14:07.732813 4841 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:07 crc kubenswrapper[4841]: E0313 09:14:07.732736 4841 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 09:14:07 crc kubenswrapper[4841]: E0313 09:14:07.732847 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 09:14:07 crc kubenswrapper[4841]: E0313 09:14:07.732910 4841 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:11.732882708 +0000 UTC m=+134.462782989 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:07 crc kubenswrapper[4841]: E0313 09:14:07.733046 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 09:14:07 crc kubenswrapper[4841]: E0313 09:14:07.733163 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:11.733138885 +0000 UTC m=+134.463039106 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 09:14:07 crc kubenswrapper[4841]: E0313 09:14:07.733453 4841 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:07 crc kubenswrapper[4841]: E0313 09:14:07.733564 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:11.733546638 +0000 UTC m=+134.463446859 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.833401 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs\") pod \"network-metrics-daemon-5t7sg\" (UID: \"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\") " pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:07 crc kubenswrapper[4841]: E0313 09:14:07.833701 4841 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 09:14:07 crc kubenswrapper[4841]: E0313 09:14:07.833842 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs podName:ea07a392-2a1f-4bae-bb67-db7cd421c1e1 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:11.833807282 +0000 UTC m=+134.563707583 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs") pod "network-metrics-daemon-5t7sg" (UID: "ea07a392-2a1f-4bae-bb67-db7cd421c1e1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.996523 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:07 crc kubenswrapper[4841]: E0313 09:14:07.996734 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.996770 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.996891 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:07 crc kubenswrapper[4841]: I0313 09:14:07.996936 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:07 crc kubenswrapper[4841]: E0313 09:14:07.996978 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:07 crc kubenswrapper[4841]: E0313 09:14:07.997112 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:07 crc kubenswrapper[4841]: E0313 09:14:07.997222 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.029178 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.045429 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc 
kubenswrapper[4841]: I0313 09:14:08.069311 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.093373 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: E0313 09:14:08.093603 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.100153 4841 generic.go:334] "Generic (PLEG): container finished" podID="5388897d-03d2-4551-a457-515f576d4621" containerID="69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94" exitCode=0 Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.100212 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" event={"ID":"5388897d-03d2-4551-a457-515f576d4621","Type":"ContainerDied","Data":"69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94"} Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.105527 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55"} Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.114428 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.127730 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.145908 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.159012 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.169541 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affead8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.184321 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.197726 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.211678 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.222869 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.236683 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.254892 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.267512 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.283632 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.303933 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.314781 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affe
ad8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.327006 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.342336 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 
09:14:08.351914 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc 
kubenswrapper[4841]: I0313 09:14:08.370208 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.383210 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.394916 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.404750 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.415171 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:08 crc kubenswrapper[4841]: I0313 09:14:08.424471 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:09 crc kubenswrapper[4841]: I0313 09:14:09.111805 4841 generic.go:334] "Generic (PLEG): container finished" podID="5388897d-03d2-4551-a457-515f576d4621" containerID="5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf" exitCode=0 Mar 13 09:14:09 crc kubenswrapper[4841]: I0313 09:14:09.111868 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" event={"ID":"5388897d-03d2-4551-a457-515f576d4621","Type":"ContainerDied","Data":"5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf"} Mar 13 09:14:09 crc kubenswrapper[4841]: I0313 09:14:09.121079 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerStarted","Data":"b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407"} Mar 13 09:14:09 crc kubenswrapper[4841]: I0313 09:14:09.134618 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:09Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:09 crc kubenswrapper[4841]: I0313 09:14:09.152378 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:09Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:09 crc kubenswrapper[4841]: I0313 09:14:09.172798 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T09:14:09Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:09 crc kubenswrapper[4841]: I0313 09:14:09.187367 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:09Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:09 crc kubenswrapper[4841]: I0313 09:14:09.200031 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:09Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:09 crc kubenswrapper[4841]: I0313 09:14:09.214747 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affe
ad8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:09Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:09 crc kubenswrapper[4841]: I0313 09:14:09.231658 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:09Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:09 crc kubenswrapper[4841]: I0313 09:14:09.244854 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:09Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:09 crc 
kubenswrapper[4841]: I0313 09:14:09.262584 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:09Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:09 crc kubenswrapper[4841]: I0313 09:14:09.278928 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:09Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:09 crc kubenswrapper[4841]: I0313 09:14:09.293632 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:09Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:09 crc kubenswrapper[4841]: I0313 09:14:09.303474 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:09Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:09 crc kubenswrapper[4841]: I0313 09:14:09.318479 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:09Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:09 crc kubenswrapper[4841]: I0313 09:14:09.331581 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:09Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:09 crc kubenswrapper[4841]: I0313 09:14:09.994915 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:09 crc kubenswrapper[4841]: I0313 09:14:09.994945 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:09 crc kubenswrapper[4841]: I0313 09:14:09.994973 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:09 crc kubenswrapper[4841]: I0313 09:14:09.995015 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:09 crc kubenswrapper[4841]: E0313 09:14:09.995524 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:09 crc kubenswrapper[4841]: E0313 09:14:09.995557 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:09 crc kubenswrapper[4841]: E0313 09:14:09.995805 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:09 crc kubenswrapper[4841]: E0313 09:14:09.995876 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:10 crc kubenswrapper[4841]: I0313 09:14:10.128550 4841 generic.go:334] "Generic (PLEG): container finished" podID="5388897d-03d2-4551-a457-515f576d4621" containerID="8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d" exitCode=0 Mar 13 09:14:10 crc kubenswrapper[4841]: I0313 09:14:10.128605 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" event={"ID":"5388897d-03d2-4551-a457-515f576d4621","Type":"ContainerDied","Data":"8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d"} Mar 13 09:14:10 crc kubenswrapper[4841]: I0313 09:14:10.147714 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:10Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:10 crc kubenswrapper[4841]: I0313 09:14:10.161526 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:10Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:10 crc kubenswrapper[4841]: I0313 09:14:10.176813 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:10Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:10 crc kubenswrapper[4841]: I0313 09:14:10.187934 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:10Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:10 crc kubenswrapper[4841]: I0313 09:14:10.199501 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13
T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:10Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:10 crc kubenswrapper[4841]: I0313 09:14:10.213347 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:10Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:10 crc kubenswrapper[4841]: I0313 09:14:10.226374 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:10Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:10 crc kubenswrapper[4841]: I0313 09:14:10.238219 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:10Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:10 crc kubenswrapper[4841]: I0313 09:14:10.251430 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:10Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:10 crc kubenswrapper[4841]: I0313 09:14:10.262824 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affe
ad8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:10Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:10 crc kubenswrapper[4841]: I0313 09:14:10.274165 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:10Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:10 crc kubenswrapper[4841]: I0313 09:14:10.285080 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:10Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:10 crc 
kubenswrapper[4841]: I0313 09:14:10.301706 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:10Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:10 crc kubenswrapper[4841]: I0313 09:14:10.313691 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120
fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:10Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.140919 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerStarted","Data":"67e393c93011e3c8b0ff2e2d7e0daa9803ba72f804e71d4d351354e971987fb1"} Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.141253 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.141297 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.141315 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 
09:14:11.148810 4841 generic.go:334] "Generic (PLEG): container finished" podID="5388897d-03d2-4551-a457-515f576d4621" containerID="2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a" exitCode=0 Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.148866 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" event={"ID":"5388897d-03d2-4551-a457-515f576d4621","Type":"ContainerDied","Data":"2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a"} Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.158849 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13
T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.175691 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.177750 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.186537 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.187975 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.206595 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.220549 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.232220 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.250820 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.260975 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affe
ad8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.273368 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.285605 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.309405 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e393c93011e3c8b0ff2e2d7e0daa9803ba72f804e71d4d351354e971987fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.321950 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.341864 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.353703 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.363506 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.374564 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.384761 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.399297 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.413525 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.424774 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affe
ad8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.444257 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.460786 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc 
kubenswrapper[4841]: I0313 09:14:11.491594 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e393c93011e3c8b0ff2e2d7e0daa9803ba72f804e71d4d351354e971987fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.511126 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.527872 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.544987 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.564468 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.578236 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:11Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.776589 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.776735 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.776780 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:11 crc kubenswrapper[4841]: E0313 09:14:11.776901 4841 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:14:19.776861532 +0000 UTC m=+142.506761763 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:14:11 crc kubenswrapper[4841]: E0313 09:14:11.776949 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 09:14:11 crc kubenswrapper[4841]: E0313 09:14:11.776976 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 09:14:11 crc kubenswrapper[4841]: E0313 09:14:11.776994 4841 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:11 crc kubenswrapper[4841]: E0313 09:14:11.777018 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 09:14:11 crc kubenswrapper[4841]: E0313 09:14:11.777059 4841 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:19.777037838 +0000 UTC m=+142.506938069 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:11 crc kubenswrapper[4841]: E0313 09:14:11.777062 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 09:14:11 crc kubenswrapper[4841]: E0313 09:14:11.777091 4841 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:11 crc kubenswrapper[4841]: E0313 09:14:11.777131 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:19.77711901 +0000 UTC m=+142.507019231 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.777024 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.777195 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:11 crc kubenswrapper[4841]: E0313 09:14:11.777216 4841 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 09:14:11 crc kubenswrapper[4841]: E0313 09:14:11.777315 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:19.777296106 +0000 UTC m=+142.507196337 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 09:14:11 crc kubenswrapper[4841]: E0313 09:14:11.777321 4841 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 09:14:11 crc kubenswrapper[4841]: E0313 09:14:11.777366 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:19.777354178 +0000 UTC m=+142.507254399 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.908001 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs\") pod \"network-metrics-daemon-5t7sg\" (UID: \"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\") " pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:11 crc kubenswrapper[4841]: E0313 09:14:11.908174 4841 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 09:14:11 crc kubenswrapper[4841]: E0313 09:14:11.908299 4841 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs podName:ea07a392-2a1f-4bae-bb67-db7cd421c1e1 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:19.908245846 +0000 UTC m=+142.638146047 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs") pod "network-metrics-daemon-5t7sg" (UID: "ea07a392-2a1f-4bae-bb67-db7cd421c1e1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.995047 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.995150 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:11 crc kubenswrapper[4841]: E0313 09:14:11.995252 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:11 crc kubenswrapper[4841]: E0313 09:14:11.995452 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.995765 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:11 crc kubenswrapper[4841]: E0313 09:14:11.996137 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:11 crc kubenswrapper[4841]: I0313 09:14:11.995931 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:11 crc kubenswrapper[4841]: E0313 09:14:11.996496 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:12 crc kubenswrapper[4841]: I0313 09:14:12.157108 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" event={"ID":"5388897d-03d2-4551-a457-515f576d4621","Type":"ContainerStarted","Data":"80f14e2678bea6046c0e4b1ce1c43be454e1476b4ea33f8f5a98afd406089958"} Mar 13 09:14:12 crc kubenswrapper[4841]: I0313 09:14:12.174865 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:12Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:12 crc kubenswrapper[4841]: I0313 09:14:12.195975 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:12Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:12 crc kubenswrapper[4841]: I0313 09:14:12.216992 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:12Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:12 crc kubenswrapper[4841]: I0313 09:14:12.232658 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:12Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:12 crc kubenswrapper[4841]: I0313 09:14:12.251601 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:12Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:12 crc kubenswrapper[4841]: I0313 09:14:12.271390 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:12Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:12 crc kubenswrapper[4841]: I0313 09:14:12.286136 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affe
ad8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:12Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:12 crc kubenswrapper[4841]: I0313 09:14:12.308023 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:12Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:12 crc kubenswrapper[4841]: I0313 09:14:12.321776 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:12Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:12 crc 
kubenswrapper[4841]: I0313 09:14:12.347147 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e393c93011e3c8b0ff2e2d7e0daa9803ba72f804e71d4d351354e971987fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:12Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:12 crc kubenswrapper[4841]: I0313 09:14:12.362651 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f14e2678bea6046c0e4b1ce1c43be454e1476b4ea33f8f5a98afd406089958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9
b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52
ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{
\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T
09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:12Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:12 crc kubenswrapper[4841]: I0313 09:14:12.384324 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:12Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:12 crc kubenswrapper[4841]: I0313 09:14:12.400308 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:12Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:12 crc kubenswrapper[4841]: I0313 09:14:12.428012 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:12Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:13 crc kubenswrapper[4841]: E0313 09:14:13.100719 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 09:14:13 crc kubenswrapper[4841]: I0313 09:14:13.998727 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:14 crc kubenswrapper[4841]: E0313 09:14:14.000782 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.002559 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:14 crc kubenswrapper[4841]: E0313 09:14:14.004329 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.007483 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:14 crc kubenswrapper[4841]: E0313 09:14:14.008545 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.008688 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:14 crc kubenswrapper[4841]: E0313 09:14:14.008749 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.024754 4841 scope.go:117] "RemoveContainer" containerID="b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.024916 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 09:14:14 crc kubenswrapper[4841]: E0313 09:14:14.025110 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.168042 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5szf_db500a1d-2be8-49c1-9c9e-af7623d16b15/ovnkube-controller/0.log" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.177420 4841 generic.go:334] "Generic (PLEG): container finished" podID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerID="67e393c93011e3c8b0ff2e2d7e0daa9803ba72f804e71d4d351354e971987fb1" exitCode=1 Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.178506 4841 scope.go:117] "RemoveContainer" containerID="b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec" Mar 13 09:14:14 crc kubenswrapper[4841]: E0313 09:14:14.178812 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.179107 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerDied","Data":"67e393c93011e3c8b0ff2e2d7e0daa9803ba72f804e71d4d351354e971987fb1"} Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.180444 4841 scope.go:117] "RemoveContainer" containerID="67e393c93011e3c8b0ff2e2d7e0daa9803ba72f804e71d4d351354e971987fb1" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.228567 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
etwork-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:14Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.245870 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T09:14:14Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.261672 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:14Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.273738 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:14Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.283920 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affe
ad8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:14Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.296240 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:14Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.311942 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f14e2678bea6046c0e4b1ce1c43be454e1476b4ea33f8f5a98afd406089958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1d
cae457a3bc1a68c146f25b61494d8e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-13T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:14Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.324724 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:14Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.349780 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e393c93011e3c8b0ff2e2d7e0daa9803ba72f804e71d4d351354e971987fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e393c93011e3c8b0ff2e2d7e0daa9803ba72f804e71d4d351354e971987fb1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:13Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 09:14:13.509945 6785 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 09:14:13.509989 6785 handler.go:190] Sending *v1.EgressIP 
event handler 8 for removal\\\\nI0313 09:14:13.510023 6785 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 09:14:13.510033 6785 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 09:14:13.510042 6785 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 09:14:13.510048 6785 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 09:14:13.510063 6785 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 09:14:13.510077 6785 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 09:14:13.510090 6785 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 09:14:13.510125 6785 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 09:14:13.510162 6785 factory.go:656] Stopping watch factory\\\\nI0313 09:14:13.510167 6785 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 09:14:13.510178 6785 ovnkube.go:599] Stopped ovnkube\\\\nI0313 09:14:13.510189 6785 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 09:14:13.510209 6785 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 
09:14:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8
e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:14Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.363882 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:14Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.377799 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:14Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.390607 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:14Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.401669 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:14Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.412306 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:14Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:14 crc kubenswrapper[4841]: I0313 09:14:14.427728 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f667de-6ffe-4f8a-b98f-944f055847c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:13:54Z\\\",\\\"message\\\":\\\"tarting file observer\\\\nW0313 09:13:53.739468 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 09:13:53.739633 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 09:13:53.740630 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-58616/tls.crt::/tmp/serving-cert-58616/tls.key\\\\\\\"\\\\nI0313 09:13:54.290916 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 09:13:54.293830 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 09:13:54.293856 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 09:13:54.293883 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 09:13:54.293893 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 09:13:54.299401 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0313 09:13:54.299409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 09:13:54.299465 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 09:13:54.299505 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 09:13:54.299514 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 09:13:54.299524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 09:13:54.300966 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:13:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:14Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:15 crc kubenswrapper[4841]: I0313 09:14:15.186953 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5szf_db500a1d-2be8-49c1-9c9e-af7623d16b15/ovnkube-controller/0.log" Mar 13 09:14:15 crc kubenswrapper[4841]: I0313 09:14:15.190328 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerStarted","Data":"7e5bf001cb8ad49e1b07b8c20abf64b8e8a238f96f53a8007589c76d281a40b4"} Mar 13 09:14:15 crc kubenswrapper[4841]: I0313 09:14:15.191164 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:15 crc kubenswrapper[4841]: I0313 09:14:15.206993 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f667de-6ffe-4f8a-b98f-944f055847c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:13:54Z\\\",\\\"message\\\":\\\"tarting file observer\\\\nW0313 09:13:53.739468 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 09:13:53.739633 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 09:13:53.740630 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-58616/tls.crt::/tmp/serving-cert-58616/tls.key\\\\\\\"\\\\nI0313 09:13:54.290916 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 09:13:54.293830 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 09:13:54.293856 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 09:13:54.293883 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 09:13:54.293893 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 09:13:54.299401 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0313 09:13:54.299409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 09:13:54.299465 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 
09:13:54.299505 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 09:13:54.299514 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 09:13:54.299524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 09:13:54.300966 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:13:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:15Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:15 crc kubenswrapper[4841]: I0313 09:14:15.226077 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:15Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:15 crc kubenswrapper[4841]: I0313 09:14:15.243857 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:15Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:15 crc kubenswrapper[4841]: I0313 09:14:15.256379 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affead8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:15Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:15 crc kubenswrapper[4841]: I0313 09:14:15.266726 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:15Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:15 crc kubenswrapper[4841]: I0313 09:14:15.279650 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:15Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:15 crc kubenswrapper[4841]: I0313 09:14:15.291944 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:15Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:15 crc kubenswrapper[4841]: I0313 09:14:15.304727 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:15Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:15 crc kubenswrapper[4841]: I0313 09:14:15.324938 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:15Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:15 crc kubenswrapper[4841]: I0313 09:14:15.344023 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f14e2678bea6046c0e4b1ce1c43be454e1476b4ea33f8f5a98afd406089958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a8
30cefbf6a99f062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:
14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:09Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:15Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:15 crc kubenswrapper[4841]: I0313 09:14:15.352357 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:15Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:15 crc 
kubenswrapper[4841]: I0313 09:14:15.369791 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bf001cb8ad49e1b07b8c20abf64b8e8a238f96f53a8007589c76d281a40b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e393c93011e3c8b0ff2e2d7e0daa9803ba72f804e71d4d351354e971987fb1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:13Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 09:14:13.509945 6785 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 09:14:13.509989 6785 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 09:14:13.510023 6785 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 
09:14:13.510033 6785 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 09:14:13.510042 6785 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 09:14:13.510048 6785 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 09:14:13.510063 6785 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 09:14:13.510077 6785 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 09:14:13.510090 6785 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 09:14:13.510125 6785 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 09:14:13.510162 6785 factory.go:656] Stopping watch factory\\\\nI0313 09:14:13.510167 6785 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 09:14:13.510178 6785 ovnkube.go:599] Stopped ovnkube\\\\nI0313 09:14:13.510189 6785 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 09:14:13.510209 6785 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 
09:14:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:15Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:15 crc kubenswrapper[4841]: I0313 09:14:15.383960 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:15Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:15 crc kubenswrapper[4841]: I0313 09:14:15.394368 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:15Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:15 crc kubenswrapper[4841]: I0313 09:14:15.403902 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:15Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:15 crc kubenswrapper[4841]: I0313 09:14:15.994913 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:15 crc kubenswrapper[4841]: I0313 09:14:15.995023 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:15 crc kubenswrapper[4841]: I0313 09:14:15.994914 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:15 crc kubenswrapper[4841]: E0313 09:14:15.995103 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:15 crc kubenswrapper[4841]: I0313 09:14:15.995041 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:15 crc kubenswrapper[4841]: E0313 09:14:15.995227 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:15 crc kubenswrapper[4841]: E0313 09:14:15.995361 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:15 crc kubenswrapper[4841]: E0313 09:14:15.995493 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.196629 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5szf_db500a1d-2be8-49c1-9c9e-af7623d16b15/ovnkube-controller/1.log" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.198825 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5szf_db500a1d-2be8-49c1-9c9e-af7623d16b15/ovnkube-controller/0.log" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.202236 4841 generic.go:334] "Generic (PLEG): container finished" podID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerID="7e5bf001cb8ad49e1b07b8c20abf64b8e8a238f96f53a8007589c76d281a40b4" exitCode=1 Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.202292 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerDied","Data":"7e5bf001cb8ad49e1b07b8c20abf64b8e8a238f96f53a8007589c76d281a40b4"} Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.202323 4841 scope.go:117] "RemoveContainer" containerID="67e393c93011e3c8b0ff2e2d7e0daa9803ba72f804e71d4d351354e971987fb1" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.203985 4841 scope.go:117] "RemoveContainer" containerID="7e5bf001cb8ad49e1b07b8c20abf64b8e8a238f96f53a8007589c76d281a40b4" Mar 13 09:14:16 crc 
kubenswrapper[4841]: E0313 09:14:16.204515 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-j5szf_openshift-ovn-kubernetes(db500a1d-2be8-49c1-9c9e-af7623d16b15)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.226037 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f14e2678bea6046c0e4b1ce1c43be454e1476b4ea33f8f5a98afd406089958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-0
3-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:10Z\\\"}},\
\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:16Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.237465 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:16Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.260933 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bf001cb8ad49e1b07b8c20abf64b8e8a238f96f53a8007589c76d281a40b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e393c93011e3c8b0ff2e2d7e0daa9803ba72f804e71d4d351354e971987fb1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:13Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 09:14:13.509945 6785 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 09:14:13.509989 6785 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 09:14:13.510023 6785 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 
09:14:13.510033 6785 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 09:14:13.510042 6785 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 09:14:13.510048 6785 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 09:14:13.510063 6785 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 09:14:13.510077 6785 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 09:14:13.510090 6785 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 09:14:13.510125 6785 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 09:14:13.510162 6785 factory.go:656] Stopping watch factory\\\\nI0313 09:14:13.510167 6785 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 09:14:13.510178 6785 ovnkube.go:599] Stopped ovnkube\\\\nI0313 09:14:13.510189 6785 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 09:14:13.510209 6785 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 09:14:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e5bf001cb8ad49e1b07b8c20abf64b8e8a238f96f53a8007589c76d281a40b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:15Z\\\",\\\"message\\\":\\\"9:14:15.091998 6944 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092151 6944 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092353 6944 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092413 6944 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092484 6944 
reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092556 6944 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.093217 6944 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0313 09:14:15.093306 6944 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0313 09:14:15.093342 6944 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 09:14:15.093364 6944 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 09:14:15.093376 6944 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 09:14:15.093388 6944 factory.go:656] Stopping watch factory\\\\nI0313 09:14:15.093411 6944 ovnkube.go:599] Stopped ovnkube\\\\nI0313 09:14:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib
/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:16Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.276166 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:16Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.289222 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:16Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.306009 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:16Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.324368 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:16Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.340495 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:16Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.361528 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f667de-6ffe-4f8a-b98f-944f055847c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:13:54Z\\\",\\\"message\\\":\\\"tarting file observer\\\\nW0313 09:13:53.739468 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 09:13:53.739633 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 09:13:53.740630 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-58616/tls.crt::/tmp/serving-cert-58616/tls.key\\\\\\\"\\\\nI0313 09:13:54.290916 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 09:13:54.293830 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 09:13:54.293856 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 09:13:54.293883 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 09:13:54.293893 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 09:13:54.299401 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0313 09:13:54.299409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 09:13:54.299465 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 09:13:54.299505 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 09:13:54.299514 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 09:13:54.299524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 09:13:54.300966 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:13:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:16Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.378293 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:16Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.398526 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T09:14:16Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.416589 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:16Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.434700 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.434743 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.434757 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.434778 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.434794 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:16Z","lastTransitionTime":"2026-03-13T09:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.436884 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:16Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:16 crc kubenswrapper[4841]: E0313 09:14:16.449163 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:16Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.452589 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.452623 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.452636 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.452650 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.452661 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:16Z","lastTransitionTime":"2026-03-13T09:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.455610 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affead8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:16Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:16 crc kubenswrapper[4841]: E0313 09:14:16.466571 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:16Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.470049 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:16Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.470136 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.470410 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.470528 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.470638 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.470736 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:16Z","lastTransitionTime":"2026-03-13T09:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 09:14:16 crc kubenswrapper[4841]: E0313 09:14:16.481850 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:16Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.486062 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.486091 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.486104 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.486121 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.486133 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:16Z","lastTransitionTime":"2026-03-13T09:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:16 crc kubenswrapper[4841]: E0313 09:14:16.503002 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:16Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.506816 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.506940 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.507034 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.507120 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:16 crc kubenswrapper[4841]: I0313 09:14:16.507200 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:16Z","lastTransitionTime":"2026-03-13T09:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:16 crc kubenswrapper[4841]: E0313 09:14:16.520102 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:16Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:16 crc kubenswrapper[4841]: E0313 09:14:16.520501 4841 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 09:14:17 crc kubenswrapper[4841]: I0313 09:14:17.209376 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5szf_db500a1d-2be8-49c1-9c9e-af7623d16b15/ovnkube-controller/1.log" Mar 13 09:14:17 crc kubenswrapper[4841]: I0313 09:14:17.213572 4841 scope.go:117] "RemoveContainer" containerID="7e5bf001cb8ad49e1b07b8c20abf64b8e8a238f96f53a8007589c76d281a40b4" Mar 13 09:14:17 crc kubenswrapper[4841]: E0313 09:14:17.213829 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-j5szf_openshift-ovn-kubernetes(db500a1d-2be8-49c1-9c9e-af7623d16b15)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" Mar 13 09:14:17 crc kubenswrapper[4841]: I0313 09:14:17.234559 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f667de-6ffe-4f8a-b98f-944f055847c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:13:54Z\\\",\\\"message\\\":\\\"tarting file observer\\\\nW0313 09:13:53.739468 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 09:13:53.739633 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 09:13:53.740630 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-58616/tls.crt::/tmp/serving-cert-58616/tls.key\\\\\\\"\\\\nI0313 09:13:54.290916 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 09:13:54.293830 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 09:13:54.293856 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 09:13:54.293883 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 09:13:54.293893 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 09:13:54.299401 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0313 09:13:54.299409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 09:13:54.299465 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 
09:13:54.299505 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 09:13:54.299514 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 09:13:54.299524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 09:13:54.300966 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:13:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:17Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:17 crc kubenswrapper[4841]: I0313 09:14:17.254392 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:17Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:17 crc kubenswrapper[4841]: I0313 09:14:17.272038 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:17Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:17 crc kubenswrapper[4841]: I0313 09:14:17.288981 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:17Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:17 crc kubenswrapper[4841]: I0313 09:14:17.307540 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:17Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:17 crc kubenswrapper[4841]: I0313 09:14:17.321784 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:17Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:17 crc kubenswrapper[4841]: I0313 09:14:17.341283 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:17Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:17 crc kubenswrapper[4841]: I0313 09:14:17.357945 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:17Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:17 crc kubenswrapper[4841]: I0313 09:14:17.374376 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affe
ad8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:17Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:17 crc kubenswrapper[4841]: I0313 09:14:17.395708 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f14e2678bea6046c0e4b1ce1c43be454e1476b4ea33f8f5a98afd406089958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5b7
c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:17Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:17 crc kubenswrapper[4841]: I0313 09:14:17.414799 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:17Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:17 crc 
kubenswrapper[4841]: I0313 09:14:17.448967 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bf001cb8ad49e1b07b8c20abf64b8e8a238f96f53a8007589c76d281a40b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e5bf001cb8ad49e1b07b8c20abf64b8e8a238f96f53a8007589c76d281a40b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:15Z\\\",\\\"message\\\":\\\"9:14:15.091998 6944 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092151 6944 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092353 6944 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092413 6944 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092484 6944 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092556 6944 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.093217 6944 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0313 09:14:15.093306 6944 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0313 09:14:15.093342 6944 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 09:14:15.093364 6944 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 09:14:15.093376 6944 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 09:14:15.093388 6944 factory.go:656] Stopping watch factory\\\\nI0313 09:14:15.093411 6944 ovnkube.go:599] Stopped ovnkube\\\\nI0313 09:14:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5szf_openshift-ovn-kubernetes(db500a1d-2be8-49c1-9c9e-af7623d16b15)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba
6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:17Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:17 crc kubenswrapper[4841]: I0313 09:14:17.463789 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:17Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:17 crc kubenswrapper[4841]: I0313 09:14:17.481542 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:17Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:17 crc kubenswrapper[4841]: I0313 09:14:17.494530 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:17Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:17 crc kubenswrapper[4841]: I0313 09:14:17.994920 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:17 crc kubenswrapper[4841]: I0313 09:14:17.995064 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:17 crc kubenswrapper[4841]: I0313 09:14:17.995064 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:17 crc kubenswrapper[4841]: I0313 09:14:17.995240 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:17 crc kubenswrapper[4841]: E0313 09:14:17.995379 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:17 crc kubenswrapper[4841]: E0313 09:14:17.995247 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:17 crc kubenswrapper[4841]: E0313 09:14:17.995470 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:17 crc kubenswrapper[4841]: E0313 09:14:17.995538 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:18 crc kubenswrapper[4841]: I0313 09:14:18.018725 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:18Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:18 crc kubenswrapper[4841]: I0313 09:14:18.038128 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T09:14:18Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:18 crc kubenswrapper[4841]: I0313 09:14:18.060001 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:18Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:18 crc kubenswrapper[4841]: I0313 09:14:18.086026 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:18Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:18 crc kubenswrapper[4841]: E0313 09:14:18.101519 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 09:14:18 crc kubenswrapper[4841]: I0313 09:14:18.108668 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"o
vn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affead8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:18Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:18 crc kubenswrapper[4841]: I0313 09:14:18.125005 4841 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:18Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:18 crc kubenswrapper[4841]: I0313 09:14:18.143542 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f14e2678bea6046c0e4b1ce1c43be454e1476b4ea33f8f5a98afd406089958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\
\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\
\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"exitCode\\\":0,\\\"
finishedAt\\\":\\\"2026-03-13T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:18Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:18 crc kubenswrapper[4841]: I0313 09:14:18.160022 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:18Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:18 crc 
kubenswrapper[4841]: I0313 09:14:18.193555 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bf001cb8ad49e1b07b8c20abf64b8e8a238f96f53a8007589c76d281a40b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e5bf001cb8ad49e1b07b8c20abf64b8e8a238f96f53a8007589c76d281a40b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:15Z\\\",\\\"message\\\":\\\"9:14:15.091998 6944 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092151 6944 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092353 6944 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092413 6944 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092484 6944 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092556 6944 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.093217 6944 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0313 09:14:15.093306 6944 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0313 09:14:15.093342 6944 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 09:14:15.093364 6944 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 09:14:15.093376 6944 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 09:14:15.093388 6944 factory.go:656] Stopping watch factory\\\\nI0313 09:14:15.093411 6944 ovnkube.go:599] Stopped ovnkube\\\\nI0313 09:14:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5szf_openshift-ovn-kubernetes(db500a1d-2be8-49c1-9c9e-af7623d16b15)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba
6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:18Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:18 crc kubenswrapper[4841]: I0313 09:14:18.213805 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:18Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:18 crc kubenswrapper[4841]: I0313 09:14:18.237151 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:18Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:18 crc kubenswrapper[4841]: I0313 09:14:18.254310 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:18Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:18 crc kubenswrapper[4841]: I0313 09:14:18.269780 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:18Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:18 crc kubenswrapper[4841]: I0313 09:14:18.286105 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:18Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:18 crc kubenswrapper[4841]: I0313 09:14:18.308717 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f667de-6ffe-4f8a-b98f-944f055847c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:13:54Z\\\",\\\"message\\\":\\\"tarting file observer\\\\nW0313 09:13:53.739468 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 09:13:53.739633 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 09:13:53.740630 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-58616/tls.crt::/tmp/serving-cert-58616/tls.key\\\\\\\"\\\\nI0313 09:13:54.290916 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 09:13:54.293830 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 09:13:54.293856 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 09:13:54.293883 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 09:13:54.293893 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 09:13:54.299401 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0313 09:13:54.299409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 09:13:54.299465 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 09:13:54.299505 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 09:13:54.299514 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 09:13:54.299524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 09:13:54.300966 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:13:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:18Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:19 crc kubenswrapper[4841]: I0313 09:14:19.785522 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:14:19 crc kubenswrapper[4841]: E0313 09:14:19.785691 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:14:35.785655282 +0000 UTC m=+158.515555533 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:14:19 crc kubenswrapper[4841]: I0313 09:14:19.786124 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:19 crc kubenswrapper[4841]: I0313 09:14:19.786202 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:19 crc kubenswrapper[4841]: I0313 09:14:19.786328 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:19 crc kubenswrapper[4841]: I0313 09:14:19.786392 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:19 crc kubenswrapper[4841]: E0313 09:14:19.786442 4841 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 09:14:19 crc kubenswrapper[4841]: E0313 09:14:19.786520 4841 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 09:14:19 crc kubenswrapper[4841]: E0313 09:14:19.786581 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:35.786541851 +0000 UTC m=+158.516442072 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 09:14:19 crc kubenswrapper[4841]: E0313 09:14:19.786593 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 09:14:19 crc kubenswrapper[4841]: E0313 09:14:19.786623 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-13 09:14:35.786604263 +0000 UTC m=+158.516504594 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 09:14:19 crc kubenswrapper[4841]: E0313 09:14:19.786632 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 09:14:19 crc kubenswrapper[4841]: E0313 09:14:19.786657 4841 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:19 crc kubenswrapper[4841]: E0313 09:14:19.786725 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 09:14:19 crc kubenswrapper[4841]: E0313 09:14:19.786760 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 09:14:19 crc kubenswrapper[4841]: E0313 09:14:19.786785 4841 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:19 crc kubenswrapper[4841]: E0313 09:14:19.786732 4841 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:35.786709396 +0000 UTC m=+158.516609637 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:19 crc kubenswrapper[4841]: E0313 09:14:19.786882 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:35.786858021 +0000 UTC m=+158.516758352 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:19 crc kubenswrapper[4841]: I0313 09:14:19.988988 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs\") pod \"network-metrics-daemon-5t7sg\" (UID: \"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\") " pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:19 crc kubenswrapper[4841]: E0313 09:14:19.989200 4841 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 09:14:19 crc kubenswrapper[4841]: E0313 09:14:19.989876 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs podName:ea07a392-2a1f-4bae-bb67-db7cd421c1e1 nodeName:}" failed. No retries permitted until 2026-03-13 09:14:35.989839266 +0000 UTC m=+158.719739497 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs") pod "network-metrics-daemon-5t7sg" (UID: "ea07a392-2a1f-4bae-bb67-db7cd421c1e1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 09:14:19 crc kubenswrapper[4841]: I0313 09:14:19.994939 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:19 crc kubenswrapper[4841]: I0313 09:14:19.995204 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:19 crc kubenswrapper[4841]: I0313 09:14:19.995010 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:19 crc kubenswrapper[4841]: I0313 09:14:19.995065 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:19 crc kubenswrapper[4841]: E0313 09:14:19.995846 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:19 crc kubenswrapper[4841]: E0313 09:14:19.996074 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:19 crc kubenswrapper[4841]: E0313 09:14:19.996230 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:19 crc kubenswrapper[4841]: E0313 09:14:19.996082 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:21 crc kubenswrapper[4841]: I0313 09:14:21.994722 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:21 crc kubenswrapper[4841]: E0313 09:14:21.994884 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:21 crc kubenswrapper[4841]: I0313 09:14:21.995429 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:21 crc kubenswrapper[4841]: E0313 09:14:21.995605 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:21 crc kubenswrapper[4841]: I0313 09:14:21.995644 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:21 crc kubenswrapper[4841]: I0313 09:14:21.995699 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:21 crc kubenswrapper[4841]: E0313 09:14:21.995869 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:21 crc kubenswrapper[4841]: E0313 09:14:21.996152 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:23 crc kubenswrapper[4841]: E0313 09:14:23.102982 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 09:14:23 crc kubenswrapper[4841]: I0313 09:14:23.994436 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:23 crc kubenswrapper[4841]: I0313 09:14:23.994519 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:23 crc kubenswrapper[4841]: I0313 09:14:23.994579 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:23 crc kubenswrapper[4841]: E0313 09:14:23.994676 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:23 crc kubenswrapper[4841]: I0313 09:14:23.994705 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:23 crc kubenswrapper[4841]: E0313 09:14:23.994846 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:23 crc kubenswrapper[4841]: E0313 09:14:23.994984 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:23 crc kubenswrapper[4841]: E0313 09:14:23.995330 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:24 crc kubenswrapper[4841]: I0313 09:14:24.995459 4841 scope.go:117] "RemoveContainer" containerID="b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec" Mar 13 09:14:24 crc kubenswrapper[4841]: E0313 09:14:24.995726 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 09:14:25 crc kubenswrapper[4841]: I0313 09:14:25.994941 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:25 crc kubenswrapper[4841]: I0313 09:14:25.995007 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:25 crc kubenswrapper[4841]: I0313 09:14:25.995126 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:25 crc kubenswrapper[4841]: E0313 09:14:25.995131 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:25 crc kubenswrapper[4841]: I0313 09:14:25.995193 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:25 crc kubenswrapper[4841]: E0313 09:14:25.995376 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:25 crc kubenswrapper[4841]: E0313 09:14:25.995503 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:25 crc kubenswrapper[4841]: E0313 09:14:25.995631 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.700260 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.700363 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.700379 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.700407 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.700425 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:26Z","lastTransitionTime":"2026-03-13T09:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:26 crc kubenswrapper[4841]: E0313 09:14:26.719017 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:26Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.724606 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.724877 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.725021 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.725178 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.725352 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:26Z","lastTransitionTime":"2026-03-13T09:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:26 crc kubenswrapper[4841]: E0313 09:14:26.747147 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:26Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.755472 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.755851 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.756015 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.756186 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.756362 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:26Z","lastTransitionTime":"2026-03-13T09:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:26 crc kubenswrapper[4841]: E0313 09:14:26.777676 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:26Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.781811 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.781847 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.781858 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.781876 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.781889 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:26Z","lastTransitionTime":"2026-03-13T09:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:26 crc kubenswrapper[4841]: E0313 09:14:26.802189 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:26Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.807659 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.807700 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.807719 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.807742 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:26 crc kubenswrapper[4841]: I0313 09:14:26.807759 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:26Z","lastTransitionTime":"2026-03-13T09:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:26 crc kubenswrapper[4841]: E0313 09:14:26.827867 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:26Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:26 crc kubenswrapper[4841]: E0313 09:14:26.827976 4841 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 09:14:27 crc kubenswrapper[4841]: I0313 09:14:27.994370 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:27 crc kubenswrapper[4841]: I0313 09:14:27.994397 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:27 crc kubenswrapper[4841]: I0313 09:14:27.994506 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:27 crc kubenswrapper[4841]: I0313 09:14:27.994576 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:27 crc kubenswrapper[4841]: E0313 09:14:27.994570 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:27 crc kubenswrapper[4841]: E0313 09:14:27.994656 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:27 crc kubenswrapper[4841]: E0313 09:14:27.994863 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:27 crc kubenswrapper[4841]: E0313 09:14:27.995076 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:28 crc kubenswrapper[4841]: I0313 09:14:28.017941 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f667de-6ffe-4f8a-b98f-944f055847c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:13:54Z\\\",\\\"message\\\":\\\"tarting file observer\\\\nW0313 09:13:53.739468 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 09:13:53.739633 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 09:13:53.740630 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-58616/tls.crt::/tmp/serving-cert-58616/tls.key\\\\\\\"\\\\nI0313 09:13:54.290916 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 09:13:54.293830 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 09:13:54.293856 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 09:13:54.293883 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 09:13:54.293893 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 09:13:54.299401 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0313 09:13:54.299409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 09:13:54.299465 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 09:13:54.299505 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 09:13:54.299514 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 09:13:54.299524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 09:13:54.300966 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:13:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:28Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:28 crc kubenswrapper[4841]: I0313 09:14:28.032225 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:28Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:28 crc kubenswrapper[4841]: I0313 09:14:28.048956 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:28Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:28 crc kubenswrapper[4841]: I0313 09:14:28.065678 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\
":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:28Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:28 crc kubenswrapper[4841]: I0313 09:14:28.085194 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:28Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:28 crc kubenswrapper[4841]: E0313 09:14:28.103876 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 09:14:28 crc kubenswrapper[4841]: I0313 09:14:28.106362 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"
},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:28Z is after 2025-08-24T17:21:41Z" Mar 13 
09:14:28 crc kubenswrapper[4841]: I0313 09:14:28.125305 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affead8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:28Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:28 crc kubenswrapper[4841]: I0313 09:14:28.140132 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:28Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:28 crc kubenswrapper[4841]: I0313 09:14:28.155499 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:28Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:28 crc kubenswrapper[4841]: I0313 09:14:28.183480 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e5bf001cb8ad49e1b07b8c20abf64b8e8a238f96f53a8007589c76d281a40b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e5bf001cb8ad49e1b07b8c20abf64b8e8a238f96f53a8007589c76d281a40b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:15Z\\\",\\\"message\\\":\\\"9:14:15.091998 6944 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092151 6944 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092353 6944 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092413 6944 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092484 6944 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092556 6944 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.093217 6944 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0313 09:14:15.093306 6944 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0313 09:14:15.093342 6944 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 09:14:15.093364 6944 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 09:14:15.093376 6944 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 09:14:15.093388 6944 factory.go:656] Stopping watch factory\\\\nI0313 09:14:15.093411 6944 ovnkube.go:599] Stopped ovnkube\\\\nI0313 09:14:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5szf_openshift-ovn-kubernetes(db500a1d-2be8-49c1-9c9e-af7623d16b15)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba
6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:28Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:28 crc kubenswrapper[4841]: I0313 09:14:28.205900 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f14e2678bea6046c0e4b1ce1c43be454e1476b4ea33f8f5a98afd406089958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5b7
c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:28Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:28 crc kubenswrapper[4841]: I0313 09:14:28.217800 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:28Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:28 crc 
kubenswrapper[4841]: I0313 09:14:28.233145 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:28Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:28 crc kubenswrapper[4841]: I0313 09:14:28.252510 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:28Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:28 crc kubenswrapper[4841]: I0313 09:14:28.276231 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:28Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:29 crc kubenswrapper[4841]: I0313 09:14:29.994698 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:29 crc kubenswrapper[4841]: I0313 09:14:29.994735 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:29 crc kubenswrapper[4841]: I0313 09:14:29.994837 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:29 crc kubenswrapper[4841]: I0313 09:14:29.994924 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:29 crc kubenswrapper[4841]: E0313 09:14:29.995221 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:29 crc kubenswrapper[4841]: E0313 09:14:29.995260 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:29 crc kubenswrapper[4841]: E0313 09:14:29.995454 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:29 crc kubenswrapper[4841]: E0313 09:14:29.995773 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:30 crc kubenswrapper[4841]: I0313 09:14:30.996543 4841 scope.go:117] "RemoveContainer" containerID="7e5bf001cb8ad49e1b07b8c20abf64b8e8a238f96f53a8007589c76d281a40b4" Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.010982 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.271897 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5szf_db500a1d-2be8-49c1-9c9e-af7623d16b15/ovnkube-controller/1.log" Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.275659 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerStarted","Data":"1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8"} Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.276419 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.290118 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:31Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.301971 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f690f9-80ff-4833-a26b-bf70cd065ddd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bb5d2ddd18b198fb4c473b94295b9755e14c4b010c63cb78bd3cec1c51c812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://b48a1313ef08ab3065fd803efd4e71b468dc9dc0c887af4c705397d07bc3444f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:12:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 09:12:24.486187 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 09:12:24.487491 1 observer_polling.go:159] Starting file observer\\\\nI0313 09:12:24.488994 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 09:12:24.490009 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 09:12:54.089987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 09:12:54.090130 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:53Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:12:24Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://835d958e1267f01e98aac57897a03c29b574dc64de5080416c896cd2f52430be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d16119605de1f62b8d63a541d54d65d0b9ece2e285c6012dcf5a56b8aee75482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7408253e38e2ed73ba8483cc0114f0e210924d045466866931809cfe30deb850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:31Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.317510 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:31Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.335300 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:31Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.349933 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:31Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.365957 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:31Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.377898 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affe
ad8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:31Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.402739 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f14e2678bea6046c0e4b1ce1c43be454e1476b4ea33f8f5a98afd406089958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5b7
c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:31Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.415164 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:31Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:31 crc 
kubenswrapper[4841]: I0313 09:14:31.436502 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e5bf001cb8ad49e1b07b8c20abf64b8e8a238f96f53a8007589c76d281a40b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:15Z\\\",\\\"message\\\":\\\"9:14:15.091998 6944 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092151 6944 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092353 6944 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092413 6944 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092484 6944 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092556 6944 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.093217 6944 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0313 09:14:15.093306 6944 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0313 09:14:15.093342 6944 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 09:14:15.093364 6944 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 09:14:15.093376 6944 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 09:14:15.093388 6944 factory.go:656] Stopping watch factory\\\\nI0313 09:14:15.093411 6944 ovnkube.go:599] Stopped ovnkube\\\\nI0313 
09:14:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:31Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.450931 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:31Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.472933 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:31Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.491351 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:31Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.509386 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f667de-6ffe-4f8a-b98f-944f055847c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:13:54Z\\\",\\\"message\\\":\\\"tarting file observer\\\\nW0313 09:13:53.739468 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 09:13:53.739633 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 09:13:53.740630 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-58616/tls.crt::/tmp/serving-cert-58616/tls.key\\\\\\\"\\\\nI0313 09:13:54.290916 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 09:13:54.293830 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 09:13:54.293856 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 09:13:54.293883 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 09:13:54.293893 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 09:13:54.299401 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0313 09:13:54.299409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 09:13:54.299465 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 
09:13:54.299505 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 09:13:54.299514 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 09:13:54.299524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 09:13:54.300966 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:13:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:31Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.533209 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:31Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.549666 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:31Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.994438 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:31 crc kubenswrapper[4841]: E0313 09:14:31.994757 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.994507 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:31 crc kubenswrapper[4841]: E0313 09:14:31.994986 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.994500 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:31 crc kubenswrapper[4841]: E0313 09:14:31.995277 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:31 crc kubenswrapper[4841]: I0313 09:14:31.994508 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:31 crc kubenswrapper[4841]: E0313 09:14:31.995493 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:32 crc kubenswrapper[4841]: I0313 09:14:32.280801 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5szf_db500a1d-2be8-49c1-9c9e-af7623d16b15/ovnkube-controller/2.log" Mar 13 09:14:32 crc kubenswrapper[4841]: I0313 09:14:32.282119 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5szf_db500a1d-2be8-49c1-9c9e-af7623d16b15/ovnkube-controller/1.log" Mar 13 09:14:32 crc kubenswrapper[4841]: I0313 09:14:32.284653 4841 generic.go:334] "Generic (PLEG): container finished" podID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerID="1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8" exitCode=1 Mar 13 09:14:32 crc kubenswrapper[4841]: I0313 09:14:32.284685 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerDied","Data":"1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8"} Mar 13 09:14:32 crc kubenswrapper[4841]: I0313 09:14:32.284741 4841 scope.go:117] "RemoveContainer" containerID="7e5bf001cb8ad49e1b07b8c20abf64b8e8a238f96f53a8007589c76d281a40b4" Mar 13 09:14:32 crc kubenswrapper[4841]: I0313 09:14:32.285764 4841 scope.go:117] "RemoveContainer" containerID="1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8" Mar 13 09:14:32 crc kubenswrapper[4841]: E0313 09:14:32.286602 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-j5szf_openshift-ovn-kubernetes(db500a1d-2be8-49c1-9c9e-af7623d16b15)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" Mar 13 09:14:32 crc 
kubenswrapper[4841]: I0313 09:14:32.304293 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:32Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:32 crc kubenswrapper[4841]: I0313 09:14:32.318374 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:32Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:32 crc kubenswrapper[4841]: I0313 09:14:32.334721 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:32Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:32 crc kubenswrapper[4841]: I0313 09:14:32.344675 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affe
ad8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:32Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:32 crc kubenswrapper[4841]: I0313 09:14:32.354320 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:32Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:32 crc kubenswrapper[4841]: I0313 09:14:32.364930 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f690f9-80ff-4833-a26b-bf70cd065ddd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bb5d2ddd18b198fb4c473b94295b9755e14c4b010c63cb78bd3cec1c51c812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://b48a1313ef08ab3065fd803efd4e71b468dc9dc0c887af4c705397d07bc3444f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:12:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 09:12:24.486187 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 09:12:24.487491 1 observer_polling.go:159] Starting file observer\\\\nI0313 09:12:24.488994 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 09:12:24.490009 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 09:12:54.089987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 09:12:54.090130 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:53Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:12:24Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://835d958e1267f01e98aac57897a03c29b574dc64de5080416c896cd2f52430be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d16119605de1f62b8d63a541d54d65d0b9ece2e285c6012dcf5a56b8aee75482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7408253e38e2ed73ba8483cc0114f0e210924d045466866931809cfe30deb850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:32Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:32 crc kubenswrapper[4841]: I0313 09:14:32.375306 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:32Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:32 crc kubenswrapper[4841]: I0313 09:14:32.393234 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e5bf001cb8ad49e1b07b8c20abf64b8e8a238f96f53a8007589c76d281a40b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:15Z\\\",\\\"message\\\":\\\"9:14:15.091998 6944 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092151 6944 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092353 6944 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092413 6944 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092484 6944 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.092556 6944 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:15.093217 6944 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0313 09:14:15.093306 6944 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0313 09:14:15.093342 6944 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 09:14:15.093364 6944 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 09:14:15.093376 6944 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 09:14:15.093388 6944 factory.go:656] Stopping watch factory\\\\nI0313 09:14:15.093411 6944 ovnkube.go:599] Stopped ovnkube\\\\nI0313 09:14:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:32Z\\\",\\\"message\\\":\\\"rs/factory.go:160\\\\nI0313 09:14:32.034554 7145 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034701 7145 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034847 7145 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034978 7145 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 09:14:32.035103 7145 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0313 09:14:32.035176 7145 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 09:14:32.035937 7145 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 09:14:32.035968 7145 factory.go:656] Stopping watch factory\\\\nI0313 09:14:32.035980 7145 ovnkube.go:599] Stopped ovnkube\\\\nI0313 09:14:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\
"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-
api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:32Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:32 crc kubenswrapper[4841]: I0313 09:14:32.420625 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f14e2678bea6046c0e4b1ce1c43be454e1476b4ea33f8f5a98afd406089958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:32Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:32 crc kubenswrapper[4841]: I0313 09:14:32.435046 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:32Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:32 crc 
kubenswrapper[4841]: I0313 09:14:32.446615 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:32Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:32 crc kubenswrapper[4841]: I0313 09:14:32.461571 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:32Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:32 crc kubenswrapper[4841]: I0313 09:14:32.474672 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:32Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:32 crc kubenswrapper[4841]: I0313 09:14:32.489525 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f667de-6ffe-4f8a-b98f-944f055847c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:13:54Z\\\",\\\"message\\\":\\\"tarting file observer\\\\nW0313 09:13:53.739468 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 09:13:53.739633 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 09:13:53.740630 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-58616/tls.crt::/tmp/serving-cert-58616/tls.key\\\\\\\"\\\\nI0313 09:13:54.290916 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 09:13:54.293830 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 09:13:54.293856 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 09:13:54.293883 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 09:13:54.293893 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 09:13:54.299401 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0313 09:13:54.299409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 09:13:54.299465 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 
09:13:54.299505 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 09:13:54.299514 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 09:13:54.299524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 09:13:54.300966 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:13:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:32Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:32 crc kubenswrapper[4841]: I0313 09:14:32.503427 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:32Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:32 crc kubenswrapper[4841]: I0313 09:14:32.513428 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:32Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:33 crc kubenswrapper[4841]: E0313 09:14:33.105127 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 09:14:33 crc kubenswrapper[4841]: I0313 09:14:33.289393 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5szf_db500a1d-2be8-49c1-9c9e-af7623d16b15/ovnkube-controller/2.log" Mar 13 09:14:33 crc kubenswrapper[4841]: I0313 09:14:33.293632 4841 scope.go:117] "RemoveContainer" containerID="1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8" Mar 13 09:14:33 crc kubenswrapper[4841]: E0313 09:14:33.293788 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-j5szf_openshift-ovn-kubernetes(db500a1d-2be8-49c1-9c9e-af7623d16b15)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" Mar 13 09:14:33 crc kubenswrapper[4841]: I0313 09:14:33.307533 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:33Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:33 crc kubenswrapper[4841]: I0313 09:14:33.319327 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:33Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:33 crc kubenswrapper[4841]: I0313 09:14:33.330961 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:33Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:33 crc kubenswrapper[4841]: I0313 09:14:33.345132 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:33Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:33 crc kubenswrapper[4841]: I0313 09:14:33.360432 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f667de-6ffe-4f8a-b98f-944f055847c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:13:54Z\\\",\\\"message\\\":\\\"tarting file observer\\\\nW0313 09:13:53.739468 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 09:13:53.739633 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 09:13:53.740630 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-58616/tls.crt::/tmp/serving-cert-58616/tls.key\\\\\\\"\\\\nI0313 09:13:54.290916 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 09:13:54.293830 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 09:13:54.293856 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 09:13:54.293883 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 09:13:54.293893 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 09:13:54.299401 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0313 09:13:54.299409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 09:13:54.299465 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 09:13:54.299505 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 09:13:54.299514 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 09:13:54.299524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 09:13:54.300966 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:13:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:33Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:33 crc kubenswrapper[4841]: I0313 09:14:33.376003 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:33Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:33 crc kubenswrapper[4841]: I0313 09:14:33.389855 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:33Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:33 crc kubenswrapper[4841]: I0313 09:14:33.406461 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:33Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:33 crc kubenswrapper[4841]: I0313 09:14:33.421108 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:33Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:33 crc kubenswrapper[4841]: I0313 09:14:33.437525 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:33Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:33 crc kubenswrapper[4841]: I0313 09:14:33.452249 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affe
ad8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:33Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:33 crc kubenswrapper[4841]: I0313 09:14:33.467202 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:33Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:33 crc kubenswrapper[4841]: I0313 09:14:33.484491 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f690f9-80ff-4833-a26b-bf70cd065ddd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bb5d2ddd18b198fb4c473b94295b9755e14c4b010c63cb78bd3cec1c51c812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://b48a1313ef08ab3065fd803efd4e71b468dc9dc0c887af4c705397d07bc3444f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:12:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 09:12:24.486187 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 09:12:24.487491 1 observer_polling.go:159] Starting file observer\\\\nI0313 09:12:24.488994 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 09:12:24.490009 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 09:12:54.089987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 09:12:54.090130 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:53Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:12:24Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://835d958e1267f01e98aac57897a03c29b574dc64de5080416c896cd2f52430be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d16119605de1f62b8d63a541d54d65d0b9ece2e285c6012dcf5a56b8aee75482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7408253e38e2ed73ba8483cc0114f0e210924d045466866931809cfe30deb850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:33Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:33 crc kubenswrapper[4841]: I0313 09:14:33.499209 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:33Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:33 crc 
kubenswrapper[4841]: I0313 09:14:33.528234 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:32Z\\\",\\\"message\\\":\\\"rs/factory.go:160\\\\nI0313 09:14:32.034554 7145 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034701 7145 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034847 7145 reflector.go:311] Stopping reflector 
*v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034978 7145 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 09:14:32.035103 7145 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0313 09:14:32.035176 7145 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 09:14:32.035937 7145 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 09:14:32.035968 7145 factory.go:656] Stopping watch factory\\\\nI0313 09:14:32.035980 7145 ovnkube.go:599] Stopped ovnkube\\\\nI0313 09:14:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5szf_openshift-ovn-kubernetes(db500a1d-2be8-49c1-9c9e-af7623d16b15)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba
6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:33Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:33 crc kubenswrapper[4841]: I0313 09:14:33.546826 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f14e2678bea6046c0e4b1ce1c43be454e1476b4ea33f8f5a98afd406089958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5b7
c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:33Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:33 crc kubenswrapper[4841]: I0313 09:14:33.994503 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:33 crc kubenswrapper[4841]: I0313 09:14:33.994510 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:33 crc kubenswrapper[4841]: I0313 09:14:33.994613 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:33 crc kubenswrapper[4841]: I0313 09:14:33.994641 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:33 crc kubenswrapper[4841]: E0313 09:14:33.994783 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:33 crc kubenswrapper[4841]: E0313 09:14:33.994955 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:33 crc kubenswrapper[4841]: E0313 09:14:33.995129 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:33 crc kubenswrapper[4841]: E0313 09:14:33.995406 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:35 crc kubenswrapper[4841]: I0313 09:14:35.006412 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 13 09:14:35 crc kubenswrapper[4841]: I0313 09:14:35.858189 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:14:35 crc kubenswrapper[4841]: E0313 09:14:35.858429 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 09:15:07.858394199 +0000 UTC m=+190.588294430 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:14:35 crc kubenswrapper[4841]: I0313 09:14:35.858542 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:35 crc kubenswrapper[4841]: I0313 09:14:35.858574 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:35 crc kubenswrapper[4841]: I0313 09:14:35.858600 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:35 crc kubenswrapper[4841]: I0313 09:14:35.858617 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:35 crc kubenswrapper[4841]: E0313 09:14:35.858696 4841 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 09:14:35 crc kubenswrapper[4841]: E0313 09:14:35.858707 4841 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 09:14:35 crc kubenswrapper[4841]: E0313 09:14:35.858740 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 09:14:35 crc kubenswrapper[4841]: E0313 09:14:35.858753 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 09:14:35 crc kubenswrapper[4841]: E0313 09:14:35.858761 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 09:15:07.858747371 +0000 UTC m=+190.588647562 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 09:14:35 crc kubenswrapper[4841]: E0313 09:14:35.858763 4841 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:35 crc kubenswrapper[4841]: E0313 09:14:35.858800 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 09:15:07.858775091 +0000 UTC m=+190.588675322 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 09:14:35 crc kubenswrapper[4841]: E0313 09:14:35.858836 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 09:15:07.858819753 +0000 UTC m=+190.588720024 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:35 crc kubenswrapper[4841]: E0313 09:14:35.858871 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 09:14:35 crc kubenswrapper[4841]: E0313 09:14:35.858913 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 09:14:35 crc kubenswrapper[4841]: E0313 09:14:35.858937 4841 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:35 crc kubenswrapper[4841]: E0313 09:14:35.859024 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 09:15:07.858998979 +0000 UTC m=+190.588899210 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:14:35 crc kubenswrapper[4841]: I0313 09:14:35.994553 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:35 crc kubenswrapper[4841]: I0313 09:14:35.994633 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:35 crc kubenswrapper[4841]: I0313 09:14:35.994554 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:35 crc kubenswrapper[4841]: E0313 09:14:35.994757 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:35 crc kubenswrapper[4841]: I0313 09:14:35.994591 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:35 crc kubenswrapper[4841]: E0313 09:14:35.994931 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:35 crc kubenswrapper[4841]: E0313 09:14:35.995063 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:35 crc kubenswrapper[4841]: E0313 09:14:35.995224 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:36 crc kubenswrapper[4841]: I0313 09:14:36.061256 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs\") pod \"network-metrics-daemon-5t7sg\" (UID: \"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\") " pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:36 crc kubenswrapper[4841]: E0313 09:14:36.061569 4841 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 09:14:36 crc kubenswrapper[4841]: E0313 09:14:36.061681 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs podName:ea07a392-2a1f-4bae-bb67-db7cd421c1e1 nodeName:}" failed. No retries permitted until 2026-03-13 09:15:08.061652954 +0000 UTC m=+190.791553185 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs") pod "network-metrics-daemon-5t7sg" (UID: "ea07a392-2a1f-4bae-bb67-db7cd421c1e1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.004307 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.004348 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.004387 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.004402 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.004412 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:37Z","lastTransitionTime":"2026-03-13T09:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:37 crc kubenswrapper[4841]: E0313 09:14:37.022559 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:37Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.027494 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.027536 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.027553 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.027575 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.027591 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:37Z","lastTransitionTime":"2026-03-13T09:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:37 crc kubenswrapper[4841]: E0313 09:14:37.071494 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:37Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.076842 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.076882 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.076896 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.076916 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.076931 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:37Z","lastTransitionTime":"2026-03-13T09:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:37 crc kubenswrapper[4841]: E0313 09:14:37.090577 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:37Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.094557 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.094598 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.094614 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.094632 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.094646 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:37Z","lastTransitionTime":"2026-03-13T09:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:37 crc kubenswrapper[4841]: E0313 09:14:37.107418 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:37Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:37 crc kubenswrapper[4841]: E0313 09:14:37.107529 4841 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.994095 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.994231 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:37 crc kubenswrapper[4841]: E0313 09:14:37.994588 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.994612 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:37 crc kubenswrapper[4841]: I0313 09:14:37.994373 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:37 crc kubenswrapper[4841]: E0313 09:14:37.994729 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:37 crc kubenswrapper[4841]: E0313 09:14:37.994887 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:37 crc kubenswrapper[4841]: E0313 09:14:37.995085 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:38 crc kubenswrapper[4841]: I0313 09:14:38.018300 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f667de-6ffe-4f8a-b98f-944f055847c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:13:54Z\\\",\\\"message\\\":\\\"tarting file observer\\\\nW0313 09:13:53.739468 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 09:13:53.739633 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 09:13:53.740630 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-58616/tls.crt::/tmp/serving-cert-58616/tls.key\\\\\\\"\\\\nI0313 09:13:54.290916 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 09:13:54.293830 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 09:13:54.293856 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 09:13:54.293883 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 09:13:54.293893 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 09:13:54.299401 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0313 09:13:54.299409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 09:13:54.299465 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 09:13:54.299505 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 09:13:54.299514 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 09:13:54.299524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 09:13:54.300966 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:13:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:38Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:38 crc kubenswrapper[4841]: I0313 09:14:38.038444 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:38Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:38 crc kubenswrapper[4841]: I0313 09:14:38.057883 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:38Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:38 crc kubenswrapper[4841]: I0313 09:14:38.074025 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb2767
03f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:38Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:38 crc kubenswrapper[4841]: I0313 09:14:38.095288 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f690f9-80ff-4833-a26b-bf70cd065ddd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bb5d2ddd18b198fb4c473b94295b9755e14c4b010c63cb78bd3cec1c51c812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b48a1313ef08ab3065fd803efd4e71b468dc9dc0c887af4c705397d07bc3444f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:12:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0313 09:12:24.486187 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 09:12:24.487491 1 observer_polling.go:159] Starting file observer\\\\nI0313 09:12:24.488994 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 09:12:24.490009 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 09:12:54.089987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 09:12:54.090130 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:53Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:12:24Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://835d958e1267f01e98aac57897a03c29b574dc64de5080416c896cd2f52430be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d16119605de1f62b8d63a541d54d65d0b9ece2e285c6012dcf5a56b8aee75482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7408253e38e2ed73ba8483cc0114f0e210924d045466866931809cfe30deb850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:38Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:38 crc kubenswrapper[4841]: E0313 09:14:38.105825 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 09:14:38 crc kubenswrapper[4841]: I0313 09:14:38.120992 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\"
 for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:38Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:38 crc kubenswrapper[4841]: I0313 09:14:38.139134 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPa
th\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:38Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:38 crc kubenswrapper[4841]: I0313 09:14:38.157806 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:38Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:38 crc kubenswrapper[4841]: I0313 09:14:38.178739 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:38Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:38 crc kubenswrapper[4841]: I0313 09:14:38.194548 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affe
ad8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:38Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:38 crc kubenswrapper[4841]: I0313 09:14:38.211500 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97dccbbf-4ff7-485f-a83d-26b01089e803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd735570021970d7d0266d4fb83355f1756bc3f8865ccfef1cd9af56e4a27b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7046389acb5adf17631c174269fc8213d96c97b34d7fec4fb806f0694f386a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7046389acb5adf17631c174269fc8213d96c97b34d7fec4fb806f0694f386a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:38Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:38 crc kubenswrapper[4841]: I0313 09:14:38.241627 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f14e2678bea6046c0e4b1ce1c43be454e1476b4ea33f8f5a98afd406089958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5b7
c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:38Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:38 crc kubenswrapper[4841]: I0313 09:14:38.256032 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:38Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:38 crc 
kubenswrapper[4841]: I0313 09:14:38.286179 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:32Z\\\",\\\"message\\\":\\\"rs/factory.go:160\\\\nI0313 09:14:32.034554 7145 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034701 7145 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034847 7145 reflector.go:311] Stopping reflector 
*v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034978 7145 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 09:14:32.035103 7145 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0313 09:14:32.035176 7145 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 09:14:32.035937 7145 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 09:14:32.035968 7145 factory.go:656] Stopping watch factory\\\\nI0313 09:14:32.035980 7145 ovnkube.go:599] Stopped ovnkube\\\\nI0313 09:14:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5szf_openshift-ovn-kubernetes(db500a1d-2be8-49c1-9c9e-af7623d16b15)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba
6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:38Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:38 crc kubenswrapper[4841]: I0313 09:14:38.305293 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:38Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:38 crc kubenswrapper[4841]: I0313 09:14:38.321802 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:38Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:38 crc kubenswrapper[4841]: I0313 09:14:38.336863 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:38Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:38 crc kubenswrapper[4841]: I0313 09:14:38.994930 4841 scope.go:117] "RemoveContainer" containerID="b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec" Mar 13 09:14:38 crc kubenswrapper[4841]: E0313 09:14:38.995407 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 09:14:39 crc kubenswrapper[4841]: I0313 09:14:39.994674 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:39 crc kubenswrapper[4841]: I0313 09:14:39.994718 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:39 crc kubenswrapper[4841]: I0313 09:14:39.994751 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:39 crc kubenswrapper[4841]: E0313 09:14:39.994891 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:39 crc kubenswrapper[4841]: E0313 09:14:39.994944 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:39 crc kubenswrapper[4841]: E0313 09:14:39.995010 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:39 crc kubenswrapper[4841]: I0313 09:14:39.995558 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:39 crc kubenswrapper[4841]: E0313 09:14:39.995751 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:41 crc kubenswrapper[4841]: I0313 09:14:41.994854 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:41 crc kubenswrapper[4841]: I0313 09:14:41.994926 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:41 crc kubenswrapper[4841]: E0313 09:14:41.995004 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:41 crc kubenswrapper[4841]: E0313 09:14:41.995163 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:41 crc kubenswrapper[4841]: I0313 09:14:41.994854 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:41 crc kubenswrapper[4841]: E0313 09:14:41.995361 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:41 crc kubenswrapper[4841]: I0313 09:14:41.995433 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:41 crc kubenswrapper[4841]: E0313 09:14:41.995518 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:43 crc kubenswrapper[4841]: E0313 09:14:43.107441 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 09:14:43 crc kubenswrapper[4841]: I0313 09:14:43.994509 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:43 crc kubenswrapper[4841]: I0313 09:14:43.994573 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:43 crc kubenswrapper[4841]: I0313 09:14:43.994602 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:43 crc kubenswrapper[4841]: E0313 09:14:43.995129 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:43 crc kubenswrapper[4841]: I0313 09:14:43.994776 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:43 crc kubenswrapper[4841]: E0313 09:14:43.995220 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:43 crc kubenswrapper[4841]: E0313 09:14:43.995305 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:43 crc kubenswrapper[4841]: E0313 09:14:43.995506 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:45 crc kubenswrapper[4841]: I0313 09:14:45.994336 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:45 crc kubenswrapper[4841]: I0313 09:14:45.994405 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:45 crc kubenswrapper[4841]: I0313 09:14:45.994416 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:45 crc kubenswrapper[4841]: E0313 09:14:45.994579 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:45 crc kubenswrapper[4841]: I0313 09:14:45.994616 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:45 crc kubenswrapper[4841]: E0313 09:14:45.994782 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:45 crc kubenswrapper[4841]: E0313 09:14:45.994995 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:45 crc kubenswrapper[4841]: E0313 09:14:45.995094 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:46 crc kubenswrapper[4841]: I0313 09:14:46.995042 4841 scope.go:117] "RemoveContainer" containerID="1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8" Mar 13 09:14:46 crc kubenswrapper[4841]: E0313 09:14:46.995319 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-j5szf_openshift-ovn-kubernetes(db500a1d-2be8-49c1-9c9e-af7623d16b15)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.373862 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.373933 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.373952 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.373977 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.373996 4841 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:47Z","lastTransitionTime":"2026-03-13T09:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 09:14:47 crc kubenswrapper[4841]: E0313 09:14:47.396021 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:47Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.401306 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.401357 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.401374 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.401394 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.401409 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:47Z","lastTransitionTime":"2026-03-13T09:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:47 crc kubenswrapper[4841]: E0313 09:14:47.417497 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:47Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.429889 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.429954 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.429976 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.430002 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.430022 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:47Z","lastTransitionTime":"2026-03-13T09:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:47 crc kubenswrapper[4841]: E0313 09:14:47.451969 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:47Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.456879 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.456902 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.456910 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.456923 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.456931 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:47Z","lastTransitionTime":"2026-03-13T09:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:47 crc kubenswrapper[4841]: E0313 09:14:47.471401 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:47Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.475127 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.475412 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.475575 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.475823 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.476049 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:47Z","lastTransitionTime":"2026-03-13T09:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:47 crc kubenswrapper[4841]: E0313 09:14:47.489891 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:47Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:47 crc kubenswrapper[4841]: E0313 09:14:47.490008 4841 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.994033 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.994150 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.994082 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:47 crc kubenswrapper[4841]: E0313 09:14:47.994615 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:47 crc kubenswrapper[4841]: E0313 09:14:47.994792 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:47 crc kubenswrapper[4841]: E0313 09:14:47.994861 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:47 crc kubenswrapper[4841]: I0313 09:14:47.995152 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:47 crc kubenswrapper[4841]: E0313 09:14:47.995979 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:48 crc kubenswrapper[4841]: I0313 09:14:48.018317 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97dccbbf-4ff7-485f-a83d-26b01089e803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd735570021970d7d0266d4fb83355f1756bc3f8865ccfef1cd9af56e4a27b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7046389acb5adf17631c174269fc8213d96c97b34d7fec4fb806f0694f386a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7046389acb5adf17631c174269fc8213d96c97b34d7fec4fb806f0694f386a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:48Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:48 crc kubenswrapper[4841]: I0313 09:14:48.045128 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f14e2678bea6046c0e4b1ce1c43be454e1476b4ea33f8f5a98afd406089958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5b7
c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:48Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:48 crc kubenswrapper[4841]: I0313 09:14:48.062344 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:48Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:48 crc 
kubenswrapper[4841]: I0313 09:14:48.095801 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:32Z\\\",\\\"message\\\":\\\"rs/factory.go:160\\\\nI0313 09:14:32.034554 7145 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034701 7145 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034847 7145 reflector.go:311] Stopping reflector 
*v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034978 7145 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 09:14:32.035103 7145 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0313 09:14:32.035176 7145 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 09:14:32.035937 7145 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 09:14:32.035968 7145 factory.go:656] Stopping watch factory\\\\nI0313 09:14:32.035980 7145 ovnkube.go:599] Stopped ovnkube\\\\nI0313 09:14:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5szf_openshift-ovn-kubernetes(db500a1d-2be8-49c1-9c9e-af7623d16b15)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba
6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:48Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:48 crc kubenswrapper[4841]: E0313 09:14:48.108075 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 09:14:48 crc kubenswrapper[4841]: I0313 09:14:48.117402 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:48Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:48 crc kubenswrapper[4841]: I0313 09:14:48.132908 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:48Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:48 crc kubenswrapper[4841]: I0313 09:14:48.147452 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:48Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:48 crc kubenswrapper[4841]: I0313 09:14:48.163509 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f667de-6ffe-4f8a-b98f-944f055847c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:13:54Z\\\",\\\"message\\\":\\\"tarting file observer\\\\nW0313 09:13:53.739468 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 09:13:53.739633 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 09:13:53.740630 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-58616/tls.crt::/tmp/serving-cert-58616/tls.key\\\\\\\"\\\\nI0313 09:13:54.290916 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 09:13:54.293830 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 09:13:54.293856 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 09:13:54.293883 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 09:13:54.293893 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 09:13:54.299401 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0313 09:13:54.299409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 09:13:54.299465 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 
09:13:54.299505 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 09:13:54.299514 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 09:13:54.299524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 09:13:54.300966 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:13:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:48Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:48 crc kubenswrapper[4841]: I0313 09:14:48.187568 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:48Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:48 crc kubenswrapper[4841]: I0313 09:14:48.203375 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:48Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:48 crc kubenswrapper[4841]: I0313 09:14:48.223635 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:48Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:48 crc kubenswrapper[4841]: I0313 09:14:48.244348 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:48Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:48 crc kubenswrapper[4841]: I0313 09:14:48.262181 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affe
ad8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:48Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:48 crc kubenswrapper[4841]: I0313 09:14:48.278243 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:48Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:48 crc kubenswrapper[4841]: I0313 09:14:48.297498 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f690f9-80ff-4833-a26b-bf70cd065ddd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bb5d2ddd18b198fb4c473b94295b9755e14c4b010c63cb78bd3cec1c51c812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://b48a1313ef08ab3065fd803efd4e71b468dc9dc0c887af4c705397d07bc3444f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:12:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 09:12:24.486187 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 09:12:24.487491 1 observer_polling.go:159] Starting file observer\\\\nI0313 09:12:24.488994 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 09:12:24.490009 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 09:12:54.089987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 09:12:54.090130 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:53Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:12:24Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://835d958e1267f01e98aac57897a03c29b574dc64de5080416c896cd2f52430be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d16119605de1f62b8d63a541d54d65d0b9ece2e285c6012dcf5a56b8aee75482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7408253e38e2ed73ba8483cc0114f0e210924d045466866931809cfe30deb850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:48Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:48 crc kubenswrapper[4841]: I0313 09:14:48.320518 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:48Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:48 crc kubenswrapper[4841]: I0313 09:14:48.336125 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:48Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:49 crc kubenswrapper[4841]: I0313 09:14:49.994622 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:49 crc kubenswrapper[4841]: I0313 09:14:49.994776 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:49 crc kubenswrapper[4841]: E0313 09:14:49.994871 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:49 crc kubenswrapper[4841]: I0313 09:14:49.994971 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:49 crc kubenswrapper[4841]: I0313 09:14:49.995047 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:49 crc kubenswrapper[4841]: E0313 09:14:49.995101 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:49 crc kubenswrapper[4841]: E0313 09:14:49.995230 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:49 crc kubenswrapper[4841]: E0313 09:14:49.995331 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:51 crc kubenswrapper[4841]: I0313 09:14:51.994666 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:51 crc kubenswrapper[4841]: E0313 09:14:51.994819 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:51 crc kubenswrapper[4841]: I0313 09:14:51.995031 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:51 crc kubenswrapper[4841]: I0313 09:14:51.995109 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:51 crc kubenswrapper[4841]: E0313 09:14:51.995182 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:51 crc kubenswrapper[4841]: I0313 09:14:51.995313 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:51 crc kubenswrapper[4841]: E0313 09:14:51.995553 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:51 crc kubenswrapper[4841]: E0313 09:14:51.995761 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:51 crc kubenswrapper[4841]: I0313 09:14:51.995982 4841 scope.go:117] "RemoveContainer" containerID="b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec" Mar 13 09:14:51 crc kubenswrapper[4841]: E0313 09:14:51.996216 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 09:14:52 crc kubenswrapper[4841]: I0313 09:14:52.364388 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qkpgl_5978189d-b3a2-408c-b09e-c2b3de0a91b0/kube-multus/0.log" Mar 13 09:14:52 crc kubenswrapper[4841]: I0313 09:14:52.364722 4841 generic.go:334] "Generic (PLEG): container finished" podID="5978189d-b3a2-408c-b09e-c2b3de0a91b0" containerID="efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7" exitCode=1 Mar 13 09:14:52 crc kubenswrapper[4841]: I0313 09:14:52.364758 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qkpgl" 
event={"ID":"5978189d-b3a2-408c-b09e-c2b3de0a91b0","Type":"ContainerDied","Data":"efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7"} Mar 13 09:14:52 crc kubenswrapper[4841]: I0313 09:14:52.365109 4841 scope.go:117] "RemoveContainer" containerID="efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7" Mar 13 09:14:52 crc kubenswrapper[4841]: I0313 09:14:52.395581 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f667de-6ffe-4f8a-b98f-944f055847c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:13:54Z\\\",\\\"message\\\":\\\"tarting file observer\\\\nW0313 09:13:53.739468 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 09:13:53.739633 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 09:13:53.740630 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-58616/tls.crt::/tmp/serving-cert-58616/tls.key\\\\\\\"\\\\nI0313 09:13:54.290916 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 09:13:54.293830 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 09:13:54.293856 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 09:13:54.293883 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 09:13:54.293893 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 09:13:54.299401 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0313 09:13:54.299409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 09:13:54.299465 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 09:13:54.299505 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 09:13:54.299514 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 09:13:54.299524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 09:13:54.300966 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:13:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:52Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:52 crc kubenswrapper[4841]: I0313 09:14:52.418573 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:52Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:52 crc kubenswrapper[4841]: I0313 09:14:52.436085 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:52Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:52 crc kubenswrapper[4841]: I0313 09:14:52.455081 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f690f9-80ff-4833-a26b-bf70cd065ddd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bb5d2ddd18b198fb4c473b94295b9755e14c4b010c63cb78bd3cec1c51c812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b48a1313ef08ab3065fd803efd4e71b468dc9dc0c887af4c705397d07bc3444f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:12:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 09:12:24.486187 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0313 09:12:24.487491 1 observer_polling.go:159] Starting file observer\\\\nI0313 09:12:24.488994 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 09:12:24.490009 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 09:12:54.089987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 09:12:54.090130 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:53Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:12:24Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://835d958e1267f01e98aac57897a03c29b574dc64de5080416c896cd2f52430be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d16119605de1f62b8d63a541d54d65d0b9ece2e285c6012dcf5a56b8aee75482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7408253e38e2ed73ba8483cc0114f0e210924d045466866931809cfe30deb850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:52Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:52 crc kubenswrapper[4841]: I0313 09:14:52.473550 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:52Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:52 crc kubenswrapper[4841]: I0313 09:14:52.490200 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:52Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:52 crc kubenswrapper[4841]: I0313 09:14:52.512808 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:52Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:52 crc kubenswrapper[4841]: I0313 09:14:52.535956 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:51Z\\\",\\\"message\\\":\\\"2026-03-13T09:14:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e56fe8bd-9123-4607-acd6-fac1c21f19b3\\\\n2026-03-13T09:14:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e56fe8bd-9123-4607-acd6-fac1c21f19b3 to /host/opt/cni/bin/\\\\n2026-03-13T09:14:06Z [verbose] multus-daemon started\\\\n2026-03-13T09:14:06Z [verbose] Readiness Indicator file check\\\\n2026-03-13T09:14:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:52Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:52 crc kubenswrapper[4841]: I0313 09:14:52.555740 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affead8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:52Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:52 crc kubenswrapper[4841]: I0313 09:14:52.573180 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\
\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:52Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:52 crc kubenswrapper[4841]: I0313 09:14:52.590555 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97dccbbf-4ff7-485f-a83d-26b01089e803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd735570021970d7d0266d4fb83355f1756bc3f8865ccfef1cd9af56e4a27b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7046389acb5adf17631c174269fc8213d96c97b34d7fec4fb806f0694f386a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7046389acb5adf17631c174269fc8213d96c97b34d7fec4fb806f0694f386a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:52Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:52 crc kubenswrapper[4841]: I0313 09:14:52.614590 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f14e2678bea6046c0e4b1ce1c43be454e1476b4ea33f8f5a98afd406089958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5b7
c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:52Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:52 crc kubenswrapper[4841]: I0313 09:14:52.630983 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:52Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:52 crc 
kubenswrapper[4841]: I0313 09:14:52.659671 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:32Z\\\",\\\"message\\\":\\\"rs/factory.go:160\\\\nI0313 09:14:32.034554 7145 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034701 7145 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034847 7145 reflector.go:311] Stopping reflector 
*v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034978 7145 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 09:14:32.035103 7145 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0313 09:14:32.035176 7145 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 09:14:32.035937 7145 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 09:14:32.035968 7145 factory.go:656] Stopping watch factory\\\\nI0313 09:14:32.035980 7145 ovnkube.go:599] Stopped ovnkube\\\\nI0313 09:14:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5szf_openshift-ovn-kubernetes(db500a1d-2be8-49c1-9c9e-af7623d16b15)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba
6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:52Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:52 crc kubenswrapper[4841]: I0313 09:14:52.675638 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:52Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:52 crc kubenswrapper[4841]: I0313 09:14:52.692791 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:52Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:52 crc kubenswrapper[4841]: I0313 09:14:52.706203 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:52Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:53 crc kubenswrapper[4841]: E0313 09:14:53.109805 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 09:14:53 crc kubenswrapper[4841]: I0313 09:14:53.371758 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qkpgl_5978189d-b3a2-408c-b09e-c2b3de0a91b0/kube-multus/0.log" Mar 13 09:14:53 crc kubenswrapper[4841]: I0313 09:14:53.371835 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qkpgl" event={"ID":"5978189d-b3a2-408c-b09e-c2b3de0a91b0","Type":"ContainerStarted","Data":"1c399741919b7f6d151285e7dc51f48764336f17ec82afe8c58cb97bcbf9e4d9"} Mar 13 09:14:53 crc kubenswrapper[4841]: I0313 09:14:53.392361 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\
\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:53Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:53 crc kubenswrapper[4841]: I0313 09:14:53.405865 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:53Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:53 crc kubenswrapper[4841]: I0313 09:14:53.421912 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c399741919b7f6d151285e7dc51f48764336f17ec82afe8c58cb97bcbf9e4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:51Z\\\",\\\"message\\\":\\\"2026-03-13T09:14:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e56fe8bd-9123-4607-acd6-fac1c21f19b3\\\\n2026-03-13T09:14:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e56fe8bd-9123-4607-acd6-fac1c21f19b3 to /host/opt/cni/bin/\\\\n2026-03-13T09:14:06Z [verbose] multus-daemon started\\\\n2026-03-13T09:14:06Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T09:14:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:53Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:53 crc kubenswrapper[4841]: I0313 09:14:53.435542 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affead8494ade11329b370b4ae74f6bd
c9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:53Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:53 crc kubenswrapper[4841]: I0313 09:14:53.451908 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:53Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:53 crc kubenswrapper[4841]: I0313 09:14:53.473964 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f690f9-80ff-4833-a26b-bf70cd065ddd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bb5d2ddd18b198fb4c473b94295b9755e14c4b010c63cb78bd3cec1c51c812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://b48a1313ef08ab3065fd803efd4e71b468dc9dc0c887af4c705397d07bc3444f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:12:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 09:12:24.486187 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 09:12:24.487491 1 observer_polling.go:159] Starting file observer\\\\nI0313 09:12:24.488994 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 09:12:24.490009 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 09:12:54.089987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 09:12:54.090130 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:53Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:12:24Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://835d958e1267f01e98aac57897a03c29b574dc64de5080416c896cd2f52430be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d16119605de1f62b8d63a541d54d65d0b9ece2e285c6012dcf5a56b8aee75482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7408253e38e2ed73ba8483cc0114f0e210924d045466866931809cfe30deb850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:53Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:53 crc kubenswrapper[4841]: I0313 09:14:53.493051 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:53Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:53 crc kubenswrapper[4841]: I0313 09:14:53.521770 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:32Z\\\",\\\"message\\\":\\\"rs/factory.go:160\\\\nI0313 09:14:32.034554 7145 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034701 7145 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034847 7145 reflector.go:311] Stopping reflector 
*v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034978 7145 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 09:14:32.035103 7145 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0313 09:14:32.035176 7145 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 09:14:32.035937 7145 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 09:14:32.035968 7145 factory.go:656] Stopping watch factory\\\\nI0313 09:14:32.035980 7145 ovnkube.go:599] Stopped ovnkube\\\\nI0313 09:14:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5szf_openshift-ovn-kubernetes(db500a1d-2be8-49c1-9c9e-af7623d16b15)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba
6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:53Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:53 crc kubenswrapper[4841]: I0313 09:14:53.535827 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97dccbbf-4ff7-485f-a83d-26b01089e803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd735570021970d7d0266d4fb83355f1756bc3f8865ccfef1cd9af56e4a27b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7046389acb5adf17631c174269fc8213d96c97b34d7fec4fb806f0694f386a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7046389acb5adf17631c174269fc8213d96c97b34d7fec4fb806f0694f386a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:53Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:53 crc kubenswrapper[4841]: I0313 09:14:53.559886 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f14e2678bea6046c0e4b1ce1c43be454e1476b4ea33f8f5a98afd406089958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5b7
c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:53Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:53 crc kubenswrapper[4841]: I0313 09:14:53.576505 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:53Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:53 crc 
kubenswrapper[4841]: I0313 09:14:53.590369 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:53Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:53 crc kubenswrapper[4841]: I0313 09:14:53.608989 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:53Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:53 crc kubenswrapper[4841]: I0313 09:14:53.627471 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:53Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:53 crc kubenswrapper[4841]: I0313 09:14:53.651684 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f667de-6ffe-4f8a-b98f-944f055847c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:13:54Z\\\",\\\"message\\\":\\\"tarting file observer\\\\nW0313 09:13:53.739468 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 09:13:53.739633 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 09:13:53.740630 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-58616/tls.crt::/tmp/serving-cert-58616/tls.key\\\\\\\"\\\\nI0313 09:13:54.290916 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 09:13:54.293830 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 09:13:54.293856 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 09:13:54.293883 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 09:13:54.293893 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 09:13:54.299401 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0313 09:13:54.299409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 09:13:54.299465 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 
09:13:54.299505 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 09:13:54.299514 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 09:13:54.299524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 09:13:54.300966 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:13:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:53Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:53 crc kubenswrapper[4841]: I0313 09:14:53.670216 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:53Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:53 crc kubenswrapper[4841]: I0313 09:14:53.685753 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:53Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:53 crc kubenswrapper[4841]: I0313 09:14:53.994685 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:53 crc kubenswrapper[4841]: I0313 09:14:53.994710 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:53 crc kubenswrapper[4841]: I0313 09:14:53.994806 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:53 crc kubenswrapper[4841]: I0313 09:14:53.994831 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:53 crc kubenswrapper[4841]: E0313 09:14:53.994952 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:53 crc kubenswrapper[4841]: E0313 09:14:53.995140 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:53 crc kubenswrapper[4841]: E0313 09:14:53.995317 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:53 crc kubenswrapper[4841]: E0313 09:14:53.995412 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:55 crc kubenswrapper[4841]: I0313 09:14:55.994873 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:55 crc kubenswrapper[4841]: I0313 09:14:55.994978 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:55 crc kubenswrapper[4841]: I0313 09:14:55.994927 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:55 crc kubenswrapper[4841]: I0313 09:14:55.994899 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:55 crc kubenswrapper[4841]: E0313 09:14:55.995181 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:55 crc kubenswrapper[4841]: E0313 09:14:55.995341 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:55 crc kubenswrapper[4841]: E0313 09:14:55.995507 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:55 crc kubenswrapper[4841]: E0313 09:14:55.995588 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.009217 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.753077 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.753150 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.753173 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.753206 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.753227 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:57Z","lastTransitionTime":"2026-03-13T09:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:57 crc kubenswrapper[4841]: E0313 09:14:57.781582 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:57Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.787724 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.787816 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.787847 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.787907 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.787931 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:57Z","lastTransitionTime":"2026-03-13T09:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:57 crc kubenswrapper[4841]: E0313 09:14:57.804142 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:57Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.808815 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.808858 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.808870 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.808890 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.808906 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:57Z","lastTransitionTime":"2026-03-13T09:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:57 crc kubenswrapper[4841]: E0313 09:14:57.829560 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:57Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.834030 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.834071 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.834083 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.834127 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.834141 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:57Z","lastTransitionTime":"2026-03-13T09:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:57 crc kubenswrapper[4841]: E0313 09:14:57.855783 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:57Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.860461 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.860522 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.860546 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.860577 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.860604 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:14:57Z","lastTransitionTime":"2026-03-13T09:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:14:57 crc kubenswrapper[4841]: E0313 09:14:57.879832 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:57Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:57 crc kubenswrapper[4841]: E0313 09:14:57.880084 4841 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.994042 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.994121 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.994051 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:57 crc kubenswrapper[4841]: E0313 09:14:57.994238 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:57 crc kubenswrapper[4841]: I0313 09:14:57.994286 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:57 crc kubenswrapper[4841]: E0313 09:14:57.994372 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:14:57 crc kubenswrapper[4841]: E0313 09:14:57.994454 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:57 crc kubenswrapper[4841]: E0313 09:14:57.994589 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:58 crc kubenswrapper[4841]: I0313 09:14:58.016618 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://
3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:58Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:58 crc kubenswrapper[4841]: I0313 09:14:58.031045 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:58Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:58 crc kubenswrapper[4841]: I0313 09:14:58.043580 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f667de-6ffe-4f8a-b98f-944f055847c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:13:54Z\\\",\\\"message\\\":\\\"tarting file observer\\\\nW0313 09:13:53.739468 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 09:13:53.739633 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 09:13:53.740630 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-58616/tls.crt::/tmp/serving-cert-58616/tls.key\\\\\\\"\\\\nI0313 09:13:54.290916 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 09:13:54.293830 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 09:13:54.293856 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 09:13:54.293883 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 09:13:54.293893 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 09:13:54.299401 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0313 09:13:54.299409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 09:13:54.299465 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 09:13:54.299505 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 09:13:54.299514 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 09:13:54.299524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 09:13:54.300966 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:13:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:58Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:58 crc kubenswrapper[4841]: I0313 09:14:58.060888 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9947e55d-81ff-4f4c-a927-f6c26688d4bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a48eb24b3fa4340905330f28451fbef8e2c41bccfe4cad4f4ec8feeaee855f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ec919eef616ef7c0b2294d8dee8897e8a317416ea98f51ec6d0cac4b999aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab9bddc78b00dff9b2fa8e6cd866192e0f4fd50674210a02c5b7c1adf80083a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6f29ab90584cbd7a9f3dd03d31693cefa4b06213817745a3fd8305f08009d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b29112f33e33bfd413b43a730aa951f1e55edaf76bf762dce2342b833382567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590511c57fa67e49a73d49d811f29ae079b6368c4d0654717bcfe4875adc22f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590511c57fa67e49a73d49d811f29ae079b6368c4d0654717bcfe4875adc22f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b38bf48d0a30b1d102b2b12b63674ccdce2d5e2930d055c3963682b91d05d93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b38bf48d0a30b1d102b2b12b63674ccdce2d5e2930d055c3963682b91d05d93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fc09bbd89cb0129ffe1030a0a36d661693c956eea136016aa90acb43a6ae684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67
314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc09bbd89cb0129ffe1030a0a36d661693c956eea136016aa90acb43a6ae684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:58Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:58 crc kubenswrapper[4841]: I0313 09:14:58.072272 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:58Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:58 crc kubenswrapper[4841]: I0313 09:14:58.086754 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:58Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:58 crc kubenswrapper[4841]: I0313 09:14:58.102448 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:58Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:58 crc kubenswrapper[4841]: E0313 09:14:58.110318 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 09:14:58 crc kubenswrapper[4841]: I0313 09:14:58.116058 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c399741919b7f6d151285e7dc51f48764336f17ec82afe8c58cb97bcbf9e4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:51Z\\\",\\\"message\\\":\\\"2026-03-13T09:14:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e56fe8bd-9123-4607-acd6-fac1c21f19b3\\\\n2026-03-13T09:14:06+00:00 [cnibincopy] Successfully moved files 
in /host/opt/cni/bin/upgrade_e56fe8bd-9123-4607-acd6-fac1c21f19b3 to /host/opt/cni/bin/\\\\n2026-03-13T09:14:06Z [verbose] multus-daemon started\\\\n2026-03-13T09:14:06Z [verbose] Readiness Indicator file check\\\\n2026-03-13T09:14:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:58Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:58 crc kubenswrapper[4841]: I0313 09:14:58.129080 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affead8494ade11329b370b4ae74f6bd
c9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:58Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:58 crc kubenswrapper[4841]: I0313 09:14:58.138857 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:58Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:58 crc kubenswrapper[4841]: I0313 09:14:58.151573 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f690f9-80ff-4833-a26b-bf70cd065ddd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bb5d2ddd18b198fb4c473b94295b9755e14c4b010c63cb78bd3cec1c51c812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://b48a1313ef08ab3065fd803efd4e71b468dc9dc0c887af4c705397d07bc3444f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:12:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 09:12:24.486187 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 09:12:24.487491 1 observer_polling.go:159] Starting file observer\\\\nI0313 09:12:24.488994 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 09:12:24.490009 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 09:12:54.089987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 09:12:54.090130 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:53Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:12:24Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://835d958e1267f01e98aac57897a03c29b574dc64de5080416c896cd2f52430be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d16119605de1f62b8d63a541d54d65d0b9ece2e285c6012dcf5a56b8aee75482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7408253e38e2ed73ba8483cc0114f0e210924d045466866931809cfe30deb850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:58Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:58 crc kubenswrapper[4841]: I0313 09:14:58.171552 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f14e2678bea6046c0e4b1ce1c43be454e1476b4ea33f8f5a98afd406089958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5b7
c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:58Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:58 crc kubenswrapper[4841]: I0313 09:14:58.182175 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:58Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:58 crc 
kubenswrapper[4841]: I0313 09:14:58.210846 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:32Z\\\",\\\"message\\\":\\\"rs/factory.go:160\\\\nI0313 09:14:32.034554 7145 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034701 7145 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034847 7145 reflector.go:311] Stopping reflector 
*v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034978 7145 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 09:14:32.035103 7145 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0313 09:14:32.035176 7145 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 09:14:32.035937 7145 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 09:14:32.035968 7145 factory.go:656] Stopping watch factory\\\\nI0313 09:14:32.035980 7145 ovnkube.go:599] Stopped ovnkube\\\\nI0313 09:14:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5szf_openshift-ovn-kubernetes(db500a1d-2be8-49c1-9c9e-af7623d16b15)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba
6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:58Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:58 crc kubenswrapper[4841]: I0313 09:14:58.223838 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97dccbbf-4ff7-485f-a83d-26b01089e803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd735570021970d7d0266d4fb83355f1756bc3f8865ccfef1cd9af56e4a27b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7046389acb5adf17631c174269fc8213d96c97b34d7fec4fb806f0694f386a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7046389acb5adf17631c174269fc8213d96c97b34d7fec4fb806f0694f386a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:58Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:58 crc kubenswrapper[4841]: I0313 09:14:58.239740 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:58Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:58 crc kubenswrapper[4841]: I0313 09:14:58.255294 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:58Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:58 crc kubenswrapper[4841]: I0313 09:14:58.271119 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:58Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:58 crc kubenswrapper[4841]: I0313 09:14:58.996356 4841 scope.go:117] "RemoveContainer" containerID="1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.395066 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5szf_db500a1d-2be8-49c1-9c9e-af7623d16b15/ovnkube-controller/2.log" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.398560 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerStarted","Data":"46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf"} Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.398908 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.414101 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affead8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:59Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.429576 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\
\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:59Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.445664 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f690f9-80ff-4833-a26b-bf70cd065ddd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bb5d2ddd18b198fb4c473b94295b9755e14c4b010c63cb78bd3cec1c51c812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b48a1313ef08ab3065fd803efd4e71b468dc9dc0c887af4c705397d07bc3444f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:12:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0313 09:12:24.486187 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 09:12:24.487491 1 observer_polling.go:159] Starting file observer\\\\nI0313 09:12:24.488994 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 09:12:24.490009 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 09:12:54.089987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 09:12:54.090130 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:53Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:12:24Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://835d958e1267f01e98aac57897a03c29b574dc64de5080416c896cd2f52430be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d16119605de1f62b8d63a541d54d65d0b9ece2e285c6012dcf5a56b8aee75482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7408253e38e2ed73ba8483cc0114f0e210924d045466866931809cfe30deb850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:59Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.469503 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9947e55d-81ff-4f4c-a927-f6c26688d4bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a48eb24b3fa4340905330f28451fbef8e2c41bccfe4cad4f4ec8feeaee855f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ec919eef616ef7c0b2294d8dee8897e8a317416ea98f51ec6d0cac4b999aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab9bddc78b00dff9b2fa8e6cd866192e0f4fd50674210a02c5b7c1adf80083a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6f29ab90584cbd7a9f3dd03d31693cefa4b06213817745a3fd8305f08009d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b29112f33e33bfd413b43a730aa951f1e55edaf76bf762dce2342b833382567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590511c57fa67e49a73d49d811f29ae079b6368c4d0654717bcfe4875adc22f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590511c57fa67e49a73d49d811f29ae079b6368c4d0654717bcfe4875adc22f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b38bf48d0a30b1d102b2b12b63674ccdce2d5e2930d055c3963682b91d05d93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b38bf48d0a30b1d102b2b12b63674ccdce2d5e2930d055c3963682b91d05d93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fc09bbd89cb0129ffe1030a0a36d661693c956eea136016aa90acb43a6ae684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc09bbd89cb0129ffe1030a0a36d661693c956eea136016aa90acb43a6ae684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:59Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.480378 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:59Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.492185 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T09:14:59Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.502253 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:59Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.512456 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c399741919b7f6d151285e7dc51f48764336f17ec82afe8c58cb97bcbf9e4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:51Z\\\",\\\"message\\\":\\\"2026-03-13T09:14:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e56fe8bd-9123-4607-acd6-fac1c21f19b3\\\\n2026-03-13T09:14:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e56fe8bd-9123-4607-acd6-fac1c21f19b3 to /host/opt/cni/bin/\\\\n2026-03-13T09:14:06Z [verbose] multus-daemon started\\\\n2026-03-13T09:14:06Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T09:14:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:59Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.521573 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97dccbbf-4ff7-485f-a83d-26b01089e803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd735570021970d7d0266d4fb83355f1756bc3f8865ccfef1cd9af56e4a27b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7046389acb5adf17631c174269fc8213d96c97b34d7fec4fb806f0694f386a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7046389acb5adf17631c174269fc8213d96c97b34d7fec4fb806f0694f386a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:59Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.533705 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f14e2678bea6046c0e4b1ce1c43be454e1476b4ea33f8f5a98afd406089958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5b7
c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:59Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.545298 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:59Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:59 crc 
kubenswrapper[4841]: I0313 09:14:59.565160 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:32Z\\\",\\\"message\\\":\\\"rs/factory.go:160\\\\nI0313 09:14:32.034554 7145 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034701 7145 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034847 7145 reflector.go:311] Stopping reflector 
*v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034978 7145 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 09:14:32.035103 7145 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0313 09:14:32.035176 7145 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 09:14:32.035937 7145 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 09:14:32.035968 7145 factory.go:656] Stopping watch factory\\\\nI0313 09:14:32.035980 7145 ovnkube.go:599] Stopped ovnkube\\\\nI0313 
09:14:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:59Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.577310 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:59Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.590636 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:59Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.614309 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:59Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.627517 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f667de-6ffe-4f8a-b98f-944f055847c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:13:54Z\\\",\\\"message\\\":\\\"tarting file observer\\\\nW0313 09:13:53.739468 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 09:13:53.739633 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 09:13:53.740630 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-58616/tls.crt::/tmp/serving-cert-58616/tls.key\\\\\\\"\\\\nI0313 09:13:54.290916 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 09:13:54.293830 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 09:13:54.293856 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 09:13:54.293883 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 09:13:54.293893 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 09:13:54.299401 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0313 09:13:54.299409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 09:13:54.299465 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 
09:13:54.299505 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 09:13:54.299514 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 09:13:54.299524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 09:13:54.300966 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:13:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:59Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.643155 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:59Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.652568 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:14:59Z is after 2025-08-24T17:21:41Z" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.994022 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.994083 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.994041 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:14:59 crc kubenswrapper[4841]: E0313 09:14:59.994176 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:14:59 crc kubenswrapper[4841]: I0313 09:14:59.994387 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:14:59 crc kubenswrapper[4841]: E0313 09:14:59.994373 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:14:59 crc kubenswrapper[4841]: E0313 09:14:59.994533 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:14:59 crc kubenswrapper[4841]: E0313 09:14:59.994637 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:00 crc kubenswrapper[4841]: I0313 09:15:00.405306 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5szf_db500a1d-2be8-49c1-9c9e-af7623d16b15/ovnkube-controller/3.log" Mar 13 09:15:00 crc kubenswrapper[4841]: I0313 09:15:00.406432 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5szf_db500a1d-2be8-49c1-9c9e-af7623d16b15/ovnkube-controller/2.log" Mar 13 09:15:00 crc kubenswrapper[4841]: I0313 09:15:00.411479 4841 generic.go:334] "Generic (PLEG): container finished" podID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerID="46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf" exitCode=1 Mar 13 09:15:00 crc kubenswrapper[4841]: I0313 09:15:00.411560 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerDied","Data":"46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf"} Mar 13 09:15:00 crc kubenswrapper[4841]: I0313 09:15:00.411636 4841 scope.go:117] "RemoveContainer" containerID="1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8" Mar 13 09:15:00 crc kubenswrapper[4841]: I0313 09:15:00.412511 4841 scope.go:117] "RemoveContainer" containerID="46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf" Mar 13 09:15:00 crc kubenswrapper[4841]: E0313 09:15:00.412773 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j5szf_openshift-ovn-kubernetes(db500a1d-2be8-49c1-9c9e-af7623d16b15)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" Mar 13 09:15:00 crc 
kubenswrapper[4841]: I0313 09:15:00.433066 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:00Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:00 crc kubenswrapper[4841]: I0313 09:15:00.455940 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f690f9-80ff-4833-a26b-bf70cd065ddd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bb5d2ddd18b198fb4c473b94295b9755e14c4b010c63cb78bd3cec1c51c812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b48a1313ef08ab3065fd803efd4e71b468dc9dc0c887af4c705397d07bc3444f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:12:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 09:12:24.486187 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0313 09:12:24.487491 1 observer_polling.go:159] Starting file observer\\\\nI0313 09:12:24.488994 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 09:12:24.490009 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 09:12:54.089987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 09:12:54.090130 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:53Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:12:24Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://835d958e1267f01e98aac57897a03c29b574dc64de5080416c896cd2f52430be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d16119605de1f62b8d63a541d54d65d0b9ece2e285c6012dcf5a56b8aee75482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7408253e38e2ed73ba8483cc0114f0e210924d045466866931809cfe30deb850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:00Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:00 crc kubenswrapper[4841]: I0313 09:15:00.490871 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9947e55d-81ff-4f4c-a927-f6c26688d4bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a48eb24b3fa4340905330f28451fbef8e2c41bccfe4cad4f4ec8feeaee855f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ec919eef616ef7c0b2294d8dee8897e8a317416ea98f51ec6d0cac4b999aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab9bddc78b00dff9b2fa8e6cd866192e0f4fd50674210a02c5b7c1adf80083a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6f29ab90584cbd7a9f3dd03d31693cefa4b06213817745a3fd8305f08009d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b29112f33e33bfd413b43a730aa951f1e55edaf76bf762dce2342b833382567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590511c57fa67e49a73d49d811f29ae079b6368c4d0654717bcfe4875adc22f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590511c57fa67e49a73d49d811f29ae079b6368c4d0654717bcfe4875adc22f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b38bf48d0a30b1d102b2b12b63674ccdce2d5e2930d055c3963682b91d05d93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b38bf48d0a30b1d102b2b12b63674ccdce2d5e2930d055c3963682b91d05d93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fc09bbd89cb0129ffe1030a0a36d661693c956eea136016aa90acb43a6ae684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc09bbd89cb0129ffe1030a0a36d661693c956eea136016aa90acb43a6ae684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:00Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:00 crc kubenswrapper[4841]: I0313 09:15:00.513131 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:00Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:00 crc kubenswrapper[4841]: I0313 09:15:00.530161 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T09:15:00Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:00 crc kubenswrapper[4841]: I0313 09:15:00.549391 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:00Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:00 crc kubenswrapper[4841]: I0313 09:15:00.571577 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c399741919b7f6d151285e7dc51f48764336f17ec82afe8c58cb97bcbf9e4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:51Z\\\",\\\"message\\\":\\\"2026-03-13T09:14:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e56fe8bd-9123-4607-acd6-fac1c21f19b3\\\\n2026-03-13T09:14:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e56fe8bd-9123-4607-acd6-fac1c21f19b3 to /host/opt/cni/bin/\\\\n2026-03-13T09:14:06Z [verbose] multus-daemon started\\\\n2026-03-13T09:14:06Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T09:14:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:00Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:00 crc kubenswrapper[4841]: I0313 09:15:00.587494 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affead8494ade11329b370b4ae74f6bd
c9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:00Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:00 crc kubenswrapper[4841]: I0313 09:15:00.601904 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97dccbbf-4ff7-485f-a83d-26b01089e803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd735570021970d7d0266d4fb83355f1756bc3f8865ccfef1cd9af56e4a27b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7046389acb5adf17631c174269fc8213d96c97b34d7fec4fb806f0694f386a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7046389acb5adf17631c174269fc8213d96c97b34d7fec4fb806f0694f386a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:00Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:00 crc kubenswrapper[4841]: I0313 09:15:00.615918 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f14e2678bea6046c0e4b1ce1c43be454e1476b4ea33f8f5a98afd406089958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5b7
c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:00Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:00 crc kubenswrapper[4841]: I0313 09:15:00.626434 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:00Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:00 crc 
kubenswrapper[4841]: I0313 09:15:00.656985 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e1a04ef43373ded6d11ce40156b4d20761efd5fa0083cea17910ad236e221a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:32Z\\\",\\\"message\\\":\\\"rs/factory.go:160\\\\nI0313 09:14:32.034554 7145 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034701 7145 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034847 7145 reflector.go:311] Stopping reflector 
*v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 09:14:32.034978 7145 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 09:14:32.035103 7145 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0313 09:14:32.035176 7145 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 09:14:32.035937 7145 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 09:14:32.035968 7145 factory.go:656] Stopping watch factory\\\\nI0313 09:14:32.035980 7145 ovnkube.go:599] Stopped ovnkube\\\\nI0313 09:14:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:59Z\\\",\\\"message\\\":\\\"ing *v1.Namespace event handler 5 for removal\\\\nI0313 09:14:59.861868 7476 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 09:14:59.861877 7476 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 09:14:59.861892 7476 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 09:14:59.861906 7476 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 09:14:59.861926 7476 factory.go:656] Stopping watch factory\\\\nI0313 09:14:59.861941 7476 handler.go:208] Removed 
*v1.EgressFirewall event handler 9\\\\nI0313 09:14:59.861773 7476 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 09:14:59.861960 7476 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 09:14:59.861973 7476 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 09:14:59.861979 7476 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 09:14:59.861799 7476 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 09:14:59.861984 7476 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 09:14:59.862180 7476 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/
cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:00Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:00 crc kubenswrapper[4841]: I0313 09:15:00.672524 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:00Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:00 crc kubenswrapper[4841]: I0313 09:15:00.689797 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:00Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:00 crc kubenswrapper[4841]: I0313 09:15:00.706157 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:00Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:00 crc kubenswrapper[4841]: I0313 09:15:00.726924 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f667de-6ffe-4f8a-b98f-944f055847c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:13:54Z\\\",\\\"message\\\":\\\"tarting file observer\\\\nW0313 09:13:53.739468 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 09:13:53.739633 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 09:13:53.740630 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-58616/tls.crt::/tmp/serving-cert-58616/tls.key\\\\\\\"\\\\nI0313 09:13:54.290916 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 09:13:54.293830 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 09:13:54.293856 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 09:13:54.293883 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 09:13:54.293893 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 09:13:54.299401 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0313 09:13:54.299409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 09:13:54.299465 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 
09:13:54.299505 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 09:13:54.299514 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 09:13:54.299524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 09:13:54.300966 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:13:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:00Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:00 crc kubenswrapper[4841]: I0313 09:15:00.746091 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:00Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:00 crc kubenswrapper[4841]: I0313 09:15:00.759780 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:00Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.419075 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5szf_db500a1d-2be8-49c1-9c9e-af7623d16b15/ovnkube-controller/3.log" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.424256 4841 scope.go:117] "RemoveContainer" containerID="46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf" Mar 13 09:15:01 crc kubenswrapper[4841]: E0313 09:15:01.424535 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j5szf_openshift-ovn-kubernetes(db500a1d-2be8-49c1-9c9e-af7623d16b15)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.445196 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:01Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.465124 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:01Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.484740 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:01Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.506603 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f667de-6ffe-4f8a-b98f-944f055847c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:13:54Z\\\",\\\"message\\\":\\\"tarting file observer\\\\nW0313 09:13:53.739468 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 09:13:53.739633 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 09:13:53.740630 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-58616/tls.crt::/tmp/serving-cert-58616/tls.key\\\\\\\"\\\\nI0313 09:13:54.290916 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 09:13:54.293830 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 09:13:54.293856 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 09:13:54.293883 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 09:13:54.293893 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 09:13:54.299401 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0313 09:13:54.299409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 09:13:54.299465 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 
09:13:54.299505 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 09:13:54.299514 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 09:13:54.299524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 09:13:54.300966 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:13:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:01Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.527322 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:01Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.544259 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:01Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.567888 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f690f9-80ff-4833-a26b-bf70cd065ddd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bb5d2ddd18b198fb4c473b94295b9755e14c4b010c63cb78bd3cec1c51c812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b48a1313ef08ab3065fd803efd4e71b468dc9dc0c887af4c705397d07bc3444f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:12:54Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 09:12:24.486187 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 09:12:24.487491 1 observer_polling.go:159] Starting file observer\\\\nI0313 09:12:24.488994 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 09:12:24.490009 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 09:12:54.089987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 09:12:54.090130 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-13T09:12:53Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:12:24Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://835d958e1267f01e98aac57897a03c29b574dc64de5080416c896cd2f52430be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d16119605de1f62b8d63a541d54d65d0b9ece2e285c6012dcf5a56b8aee75482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7408253e38e2ed73ba8483cc0114f0e210924d045466866931809cfe30deb850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:01Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.594698 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9947e55d-81ff-4f4c-a927-f6c26688d4bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a48eb24b3fa4340905330f28451fbef8e2c41bccfe4cad4f4ec8feeaee855f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ec919eef616ef7c0b2294d8dee8897e8a317416ea98f51ec6d0cac4b999aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab9bddc78b00dff9b2fa8e6cd866192e0f4fd50674210a02c5b7c1adf80083a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6f29ab90584cbd7a9f3dd03d31693cefa4b06213817745a3fd8305f08009d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b29112f33e33bfd413b43a730aa951f1e55edaf76bf762dce2342b833382567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590511c57fa67e49a73d49d811f29ae079b6368c4d0654717bcfe4875adc22f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590511c57fa67e49a73d49d811f29ae079b6368c4d0654717bcfe4875adc22f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b38bf48d0a30b1d102b2b12b63674ccdce2d5e2930d055c3963682b91d05d93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b38bf48d0a30b1d102b2b12b63674ccdce2d5e2930d055c3963682b91d05d93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fc09bbd89cb0129ffe1030a0a36d661693c956eea136016aa90acb43a6ae684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc09bbd89cb0129ffe1030a0a36d661693c956eea136016aa90acb43a6ae684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:01Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.605494 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:01Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.614674 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T09:15:01Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.626514 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:01Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.639070 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c399741919b7f6d151285e7dc51f48764336f17ec82afe8c58cb97bcbf9e4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:51Z\\\",\\\"message\\\":\\\"2026-03-13T09:14:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e56fe8bd-9123-4607-acd6-fac1c21f19b3\\\\n2026-03-13T09:14:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e56fe8bd-9123-4607-acd6-fac1c21f19b3 to /host/opt/cni/bin/\\\\n2026-03-13T09:14:06Z [verbose] multus-daemon started\\\\n2026-03-13T09:14:06Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T09:14:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:01Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.651993 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affead8494ade11329b370b4ae74f6bd
c9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:01Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.665846 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:01Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.678220 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97dccbbf-4ff7-485f-a83d-26b01089e803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd735570021970d7d0266d4fb83355f1756bc3f8865ccfef1cd9af56e4a27b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac
-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7046389acb5adf17631c174269fc8213d96c97b34d7fec4fb806f0694f386a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7046389acb5adf17631c174269fc8213d96c97b34d7fec4fb806f0694f386a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:01Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.693547 4841 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f14e2678bea6046c0e4b1ce1c43be454e1476b4ea33f8f5a98afd406089958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"init
ContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae4
57a3bc1a68c146f25b61494d8e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:1
4:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:01Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.706876 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:01Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:01 crc 
kubenswrapper[4841]: I0313 09:15:01.723816 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:59Z\\\",\\\"message\\\":\\\"ing *v1.Namespace event handler 5 for removal\\\\nI0313 09:14:59.861868 7476 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 09:14:59.861877 7476 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 09:14:59.861892 7476 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 
09:14:59.861906 7476 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 09:14:59.861926 7476 factory.go:656] Stopping watch factory\\\\nI0313 09:14:59.861941 7476 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 09:14:59.861773 7476 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 09:14:59.861960 7476 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 09:14:59.861973 7476 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 09:14:59.861979 7476 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 09:14:59.861799 7476 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 09:14:59.861984 7476 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 09:14:59.862180 7476 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5szf_openshift-ovn-kubernetes(db500a1d-2be8-49c1-9c9e-af7623d16b15)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba
6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:01Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.994233 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.994321 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.994337 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:01 crc kubenswrapper[4841]: E0313 09:15:01.994424 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:01 crc kubenswrapper[4841]: I0313 09:15:01.994439 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:01 crc kubenswrapper[4841]: E0313 09:15:01.994591 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:01 crc kubenswrapper[4841]: E0313 09:15:01.998573 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:01 crc kubenswrapper[4841]: E0313 09:15:01.998924 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:02 crc kubenswrapper[4841]: I0313 09:15:02.994905 4841 scope.go:117] "RemoveContainer" containerID="b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec" Mar 13 09:15:02 crc kubenswrapper[4841]: E0313 09:15:02.995248 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 09:15:03 crc kubenswrapper[4841]: E0313 09:15:03.112128 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 09:15:03 crc kubenswrapper[4841]: I0313 09:15:03.994525 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:03 crc kubenswrapper[4841]: I0313 09:15:03.994737 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:03 crc kubenswrapper[4841]: E0313 09:15:03.994810 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:03 crc kubenswrapper[4841]: I0313 09:15:03.995178 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:03 crc kubenswrapper[4841]: I0313 09:15:03.995352 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:03 crc kubenswrapper[4841]: E0313 09:15:03.995508 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:03 crc kubenswrapper[4841]: E0313 09:15:03.995646 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:03 crc kubenswrapper[4841]: E0313 09:15:03.995819 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:05 crc kubenswrapper[4841]: I0313 09:15:05.994431 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:05 crc kubenswrapper[4841]: E0313 09:15:05.994937 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:05 crc kubenswrapper[4841]: I0313 09:15:05.994510 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:05 crc kubenswrapper[4841]: I0313 09:15:05.994592 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:05 crc kubenswrapper[4841]: I0313 09:15:05.994557 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:05 crc kubenswrapper[4841]: E0313 09:15:05.995234 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:05 crc kubenswrapper[4841]: E0313 09:15:05.995299 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:05 crc kubenswrapper[4841]: E0313 09:15:05.995359 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.912525 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.912593 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.912613 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.912647 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.912666 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:15:07Z","lastTransitionTime":"2026-03-13T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 09:15:07 crc kubenswrapper[4841]: E0313 09:15:07.935153 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:07Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.941355 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.941412 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.941431 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.941457 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.941474 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:15:07Z","lastTransitionTime":"2026-03-13T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.946679 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.946841 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.946913 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.947000 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.947070 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:07 crc kubenswrapper[4841]: E0313 09:15:07.947188 4841 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 09:15:07 crc kubenswrapper[4841]: E0313 09:15:07.947319 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 09:16:11.947250403 +0000 UTC m=+254.677150634 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 09:15:07 crc kubenswrapper[4841]: E0313 09:15:07.947653 4841 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 09:15:07 crc kubenswrapper[4841]: E0313 09:15:07.947703 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 09:15:07 crc kubenswrapper[4841]: E0313 09:15:07.947749 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-13 09:16:11.947728378 +0000 UTC m=+254.677628609 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 09:15:07 crc kubenswrapper[4841]: E0313 09:15:07.947755 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 09:15:07 crc kubenswrapper[4841]: E0313 09:15:07.947771 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 09:15:07 crc kubenswrapper[4841]: E0313 09:15:07.947787 4841 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:15:07 crc kubenswrapper[4841]: E0313 09:15:07.947819 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 09:15:07 crc kubenswrapper[4841]: E0313 09:15:07.947848 4841 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:15:07 crc kubenswrapper[4841]: E0313 
09:15:07.947789 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:11.94777184 +0000 UTC m=+254.677672081 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:15:07 crc kubenswrapper[4841]: E0313 09:15:07.947942 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 09:16:11.947892643 +0000 UTC m=+254.677792874 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:15:07 crc kubenswrapper[4841]: E0313 09:15:07.948003 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 09:16:11.947983126 +0000 UTC m=+254.677883447 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 09:15:07 crc kubenswrapper[4841]: E0313 09:15:07.962539 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:07Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.967372 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.967524 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.967547 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.967624 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.967646 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:15:07Z","lastTransitionTime":"2026-03-13T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:15:07 crc kubenswrapper[4841]: E0313 09:15:07.990791 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:07Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.994360 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:07 crc kubenswrapper[4841]: E0313 09:15:07.994570 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.994676 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.994730 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.994859 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:07 crc kubenswrapper[4841]: E0313 09:15:07.994914 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:07 crc kubenswrapper[4841]: E0313 09:15:07.995108 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:07 crc kubenswrapper[4841]: E0313 09:15:07.995334 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.996390 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.996444 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.996461 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.996487 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:15:07 crc kubenswrapper[4841]: I0313 09:15:07.996505 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:15:07Z","lastTransitionTime":"2026-03-13T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 09:15:08 crc kubenswrapper[4841]: E0313 09:15:08.020365 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:15:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.021113 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.022714 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f667de-6ffe-4f8a-b98f-944f055847c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:13:54Z\\\",\\\"message\\\":\\\"tarting file observer\\\\nW0313 09:13:53.739468 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 09:13:53.739633 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 09:13:53.740630 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-58616/tls.crt::/tmp/serving-cert-58616/tls.key\\\\\\\"\\\\nI0313 09:13:54.290916 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 09:13:54.293830 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 09:13:54.293856 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 09:13:54.293883 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 09:13:54.293893 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 09:13:54.299401 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0313 09:13:54.299409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 09:13:54.299465 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 09:13:54.299495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 09:13:54.299505 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 09:13:54.299514 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 09:13:54.299524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 09:13:54.300966 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:13:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.027611 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.027655 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.027677 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.027707 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.027730 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:15:08Z","lastTransitionTime":"2026-03-13T09:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.047426 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c25810600605117de8aff09e594bdc24d3fee29b572b64ef6e1d0e3ba70507f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cf7ec9cd1c186a2f127e5f5e06ddedf7ae19654ccc765fd61fa5bb71a21793b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:08 crc kubenswrapper[4841]: E0313 09:15:08.051481 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:15:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:15:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:15:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:15:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:15:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:15:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T09:15:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T09:15:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"256dcd84-6905-44f2-86c8-027636af7678\\\",\\\"systemUUID\\\":\\\"ea440d89-cc01-4ae5-8bed-355549945eed\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:08 crc kubenswrapper[4841]: E0313 09:15:08.051726 4841 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.063494 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zlzg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2120c73-4dda-4576-bad1-48858477b17c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2043cdf87fac38ef44868d77cfe14eb432a5603a887a167d2a19c49a5c54502b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd1
5c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kj6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zlzg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.083725 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.103243 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qkpgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5978189d-b3a2-408c-b09e-c2b3de0a91b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c399741919b7f6d151285e7dc51f48764336f17ec82afe8c58cb97bcbf9e4d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:51Z\\\",\\\"message\\\":\\\"2026-03-13T09:14:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e56fe8bd-9123-4607-acd6-fac1c21f19b3\\\\n2026-03-13T09:14:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e56fe8bd-9123-4607-acd6-fac1c21f19b3 to /host/opt/cni/bin/\\\\n2026-03-13T09:14:06Z [verbose] multus-daemon started\\\\n2026-03-13T09:14:06Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T09:14:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ct29g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qkpgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:08 crc kubenswrapper[4841]: E0313 09:15:08.112723 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.118411 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"938e7b48-e79d-485d-abe7-d8cf5beeeb4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6c1f09d95eeb9d35faaff36bcacfad0f52f8c68f588a4707476c2246f79d615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5c0affead8494ade11329b370b4ae74f6bdc9e73d119a4796c1c85b7b7841f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjtjn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r2tpw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.133505 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5lxzk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41c5e4ef-f068-4d97-b4c9-b2085cc97422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bfd53d02fa3c0fedc9c04cdad1b74406f04191060791f735be93a619dcb1913\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mghth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5lxzk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.149090 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs\") pod \"network-metrics-daemon-5t7sg\" (UID: \"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\") " pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:08 crc kubenswrapper[4841]: E0313 09:15:08.149423 4841 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 09:15:08 crc kubenswrapper[4841]: E0313 09:15:08.149699 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs podName:ea07a392-2a1f-4bae-bb67-db7cd421c1e1 nodeName:}" failed. No retries permitted until 2026-03-13 09:16:12.1496575 +0000 UTC m=+254.879557731 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs") pod "network-metrics-daemon-5t7sg" (UID: "ea07a392-2a1f-4bae-bb67-db7cd421c1e1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.150767 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53f690f9-80ff-4833-a26b-bf70cd065ddd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:13:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3bb5d2ddd18b198fb4c473b94295b9755e14c4b010c63cb78bd3cec1c51c812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b48a1313ef08ab3065fd803efd4e71b468dc9dc0c887af4c705397d07bc3444f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T09:12:54Z\\\",\\\
"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 09:12:24.486187 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 09:12:24.487491 1 observer_polling.go:159] Starting file observer\\\\nI0313 09:12:24.488994 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 09:12:24.490009 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 09:12:54.089987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 09:12:54.090130 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:12:53Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:12:24Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://835d958e1267f01e98aac57897a03c29b574dc64de5080416c896cd2f52430be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d16119605de1f62b8d63a541d54d65d0b9ece2e285c6012dcf5a56b8aee75482\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7408253e38e2ed73ba8483cc0114f0e210924d045466866931809cfe30deb850\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.182113 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9947e55d-81ff-4f4c-a927-f6c26688d4bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a48eb24b3fa4340905330f28451fbef8e2c41bccfe4cad4f4ec8feeaee855f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68ec919eef616ef7c0b2294d8dee8897e8a317416ea98f51ec6d0cac4b999aea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab9bddc78b00dff9b2fa8e6cd866192e0f4fd50674210a02c5b7c1adf80083a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6f29ab90584cbd7a9f3dd03d31693cefa4b06213817745a3fd8305f08009d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b29112f33e33bfd413b43a730aa951f1e55edaf76bf762dce2342b833382567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590511c57fa67e49a73d49d811f29ae079b6368c4d0654717bcfe4875adc22f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590511c57fa67e49a73d49d811f29ae079b6368c4d0654717bcfe4875adc22f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b38bf48d0a30b1d102b2b12b63674ccdce2d5e2930d055c3963682b91d05d93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b38bf48d0a30b1d102b2b12b63674ccdce2d5e2930d055c3963682b91d05d93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fc09bbd89cb0129ffe1030a0a36d661693c956eea136016aa90acb43a6ae684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fc09bbd89cb0129ffe1030a0a36d661693c956eea136016aa90acb43a6ae684\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.198052 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03e0f5aae191917dc6687d37088dc328eefd1e223f0f9547d46abc0d205ee49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.212736 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c856f36493978942e073ccf53f765f4bff8249e57c8ce5c48d9023b144a7d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T09:15:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.228961 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97dccbbf-4ff7-485f-a83d-26b01089e803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd735570021970d7d0266d4fb83355f1756bc3f8865ccfef1cd9af56e4a27b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7046389acb5adf17631c174269fc8213d96c97b34d7fec4fb806f0694f386a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7046389acb5adf17631c174269fc8213d96c97b34d7fec4fb806f0694f386a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:11:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.252413 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-948g6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5388897d-03d2-4551-a457-515f576d4621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80f14e2678bea6046c0e4b1ce1c43be454e1476b4ea33f8f5a98afd406089958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d127237eff9c3db9664e0070d7c898671b0537ba3bf61a1a830cefbf6a99f062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c5d0e1946b25c9268f9b59c55d589193c44ef978f33419375fc97d0df965b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69dda411452f3f10fe52ade851f621ef11120fc3dad9c66a12994730f35f6f94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a5b7
c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a5b7c4c75217c2f3577b7b3d453e85dbe69510ee9a9fd15511442ffb73771bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8db673e78c038da8c45d32900ffd3d1dcae457a3bc1a68c146f25b61494d8e5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2500e1321bbf150aeff0b216471ad2cee5ca10f9b1061d66d794d7df9871646a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppb67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-948g6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.266072 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5l7xt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5t7sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:08 crc 
kubenswrapper[4841]: I0313 09:15:08.287256 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db500a1d-2be8-49c1-9c9e-af7623d16b15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T09:14:59Z\\\",\\\"message\\\":\\\"ing *v1.Namespace event handler 5 for removal\\\\nI0313 09:14:59.861868 7476 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 09:14:59.861877 7476 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 09:14:59.861892 7476 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 
09:14:59.861906 7476 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 09:14:59.861926 7476 factory.go:656] Stopping watch factory\\\\nI0313 09:14:59.861941 7476 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 09:14:59.861773 7476 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 09:14:59.861960 7476 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 09:14:59.861973 7476 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 09:14:59.861979 7476 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 09:14:59.861799 7476 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 09:14:59.861984 7476 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 09:14:59.862180 7476 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j5szf_openshift-ovn-kubernetes(db500a1d-2be8-49c1-9c9e-af7623d16b15)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cef13d10be853c8ba
6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T09:14:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qvg6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j5szf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.305925 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.318481 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:08 crc kubenswrapper[4841]: I0313 09:15:08.328353 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T09:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4d3ddbffdfcfe4aeddf0b811a3d0b5d6a2827712943a8f3ff453261f5dae0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120
c1c0e6086773ec840d42b653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T09:14:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmc42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T09:14:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-h227v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T09:15:08Z is after 2025-08-24T17:21:41Z" Mar 13 09:15:09 crc kubenswrapper[4841]: I0313 09:15:09.994959 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:09 crc kubenswrapper[4841]: I0313 09:15:09.994993 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:09 crc kubenswrapper[4841]: I0313 09:15:09.995085 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:09 crc kubenswrapper[4841]: E0313 09:15:09.995107 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:09 crc kubenswrapper[4841]: I0313 09:15:09.995163 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:09 crc kubenswrapper[4841]: E0313 09:15:09.995206 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:09 crc kubenswrapper[4841]: E0313 09:15:09.995401 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:09 crc kubenswrapper[4841]: E0313 09:15:09.995580 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:11 crc kubenswrapper[4841]: I0313 09:15:11.994964 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:11 crc kubenswrapper[4841]: E0313 09:15:11.995215 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:11 crc kubenswrapper[4841]: I0313 09:15:11.995577 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:11 crc kubenswrapper[4841]: I0313 09:15:11.995647 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:11 crc kubenswrapper[4841]: I0313 09:15:11.995573 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:11 crc kubenswrapper[4841]: E0313 09:15:11.995665 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:11 crc kubenswrapper[4841]: E0313 09:15:11.995874 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:11 crc kubenswrapper[4841]: E0313 09:15:11.996385 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:11 crc kubenswrapper[4841]: I0313 09:15:11.996966 4841 scope.go:117] "RemoveContainer" containerID="46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf" Mar 13 09:15:11 crc kubenswrapper[4841]: E0313 09:15:11.997206 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j5szf_openshift-ovn-kubernetes(db500a1d-2be8-49c1-9c9e-af7623d16b15)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" Mar 13 09:15:13 crc kubenswrapper[4841]: E0313 09:15:13.114151 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 09:15:13 crc kubenswrapper[4841]: I0313 09:15:13.994547 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:13 crc kubenswrapper[4841]: I0313 09:15:13.994648 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:13 crc kubenswrapper[4841]: I0313 09:15:13.994666 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:13 crc kubenswrapper[4841]: I0313 09:15:13.995446 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:13 crc kubenswrapper[4841]: E0313 09:15:13.995635 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:13 crc kubenswrapper[4841]: E0313 09:15:13.995802 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:13 crc kubenswrapper[4841]: E0313 09:15:13.995974 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:13 crc kubenswrapper[4841]: E0313 09:15:13.996096 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:14 crc kubenswrapper[4841]: I0313 09:15:14.996185 4841 scope.go:117] "RemoveContainer" containerID="b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec" Mar 13 09:15:15 crc kubenswrapper[4841]: I0313 09:15:15.474446 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 13 09:15:15 crc kubenswrapper[4841]: I0313 09:15:15.477976 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"258c37d4d3a4e8eb1c14a049165aca26c6a6883ad7a7b3c83f01439c9035baf4"} Mar 13 09:15:15 crc kubenswrapper[4841]: I0313 09:15:15.478638 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:15:15 crc kubenswrapper[4841]: I0313 09:15:15.543810 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zlzg8" podStartSLOduration=128.543788195 podStartE2EDuration="2m8.543788195s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:15:15.525910424 +0000 UTC m=+198.255810625" watchObservedRunningTime="2026-03-13 09:15:15.543788195 +0000 UTC m=+198.273688396" Mar 13 09:15:15 crc kubenswrapper[4841]: I0313 09:15:15.579662 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=61.579643019 podStartE2EDuration="1m1.579643019s" podCreationTimestamp="2026-03-13 09:14:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-13 09:15:15.545945222 +0000 UTC m=+198.275845423" watchObservedRunningTime="2026-03-13 09:15:15.579643019 +0000 UTC m=+198.309543220" Mar 13 09:15:15 crc kubenswrapper[4841]: I0313 09:15:15.601622 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=18.601597498 podStartE2EDuration="18.601597498s" podCreationTimestamp="2026-03-13 09:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:15:15.580053652 +0000 UTC m=+198.309953883" watchObservedRunningTime="2026-03-13 09:15:15.601597498 +0000 UTC m=+198.331497699" Mar 13 09:15:15 crc kubenswrapper[4841]: I0313 09:15:15.659943 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qkpgl" podStartSLOduration=128.659924777 podStartE2EDuration="2m8.659924777s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:15:15.659885825 +0000 UTC m=+198.389786026" watchObservedRunningTime="2026-03-13 09:15:15.659924777 +0000 UTC m=+198.389824968" Mar 13 09:15:15 crc kubenswrapper[4841]: I0313 09:15:15.689073 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2tpw" podStartSLOduration=128.68905402 podStartE2EDuration="2m8.68905402s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:15:15.676667682 +0000 UTC m=+198.406567893" watchObservedRunningTime="2026-03-13 09:15:15.68905402 +0000 UTC m=+198.418954211" Mar 13 09:15:15 crc kubenswrapper[4841]: I0313 09:15:15.719016 4841 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-dns/node-resolver-5lxzk" podStartSLOduration=128.718998879 podStartE2EDuration="2m8.718998879s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:15:15.689466884 +0000 UTC m=+198.419367075" watchObservedRunningTime="2026-03-13 09:15:15.718998879 +0000 UTC m=+198.448899070" Mar 13 09:15:15 crc kubenswrapper[4841]: I0313 09:15:15.746116 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=44.746089289 podStartE2EDuration="44.746089289s" podCreationTimestamp="2026-03-13 09:14:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:15:15.719180115 +0000 UTC m=+198.449080356" watchObservedRunningTime="2026-03-13 09:15:15.746089289 +0000 UTC m=+198.475989520" Mar 13 09:15:15 crc kubenswrapper[4841]: I0313 09:15:15.746334 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-948g6" podStartSLOduration=128.746323946 podStartE2EDuration="2m8.746323946s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:15:15.744977244 +0000 UTC m=+198.474877445" watchObservedRunningTime="2026-03-13 09:15:15.746323946 +0000 UTC m=+198.476224177" Mar 13 09:15:15 crc kubenswrapper[4841]: I0313 09:15:15.850148 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=40.850131362 podStartE2EDuration="40.850131362s" podCreationTimestamp="2026-03-13 09:14:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:15:15.836756482 +0000 UTC m=+198.566656673" watchObservedRunningTime="2026-03-13 09:15:15.850131362 +0000 UTC m=+198.580031553" Mar 13 09:15:15 crc kubenswrapper[4841]: I0313 09:15:15.874945 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podStartSLOduration=128.874932439 podStartE2EDuration="2m8.874932439s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:15:15.874457525 +0000 UTC m=+198.604357716" watchObservedRunningTime="2026-03-13 09:15:15.874932439 +0000 UTC m=+198.604832630" Mar 13 09:15:15 crc kubenswrapper[4841]: I0313 09:15:15.889018 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=7.889003751 podStartE2EDuration="7.889003751s" podCreationTimestamp="2026-03-13 09:15:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:15:15.888574937 +0000 UTC m=+198.618475128" watchObservedRunningTime="2026-03-13 09:15:15.889003751 +0000 UTC m=+198.618903942" Mar 13 09:15:15 crc kubenswrapper[4841]: I0313 09:15:15.994437 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:15 crc kubenswrapper[4841]: I0313 09:15:15.994534 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:15 crc kubenswrapper[4841]: I0313 09:15:15.994698 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:15 crc kubenswrapper[4841]: E0313 09:15:15.994827 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:15 crc kubenswrapper[4841]: I0313 09:15:15.994728 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:15 crc kubenswrapper[4841]: E0313 09:15:15.994968 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:15 crc kubenswrapper[4841]: E0313 09:15:15.995057 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:15 crc kubenswrapper[4841]: E0313 09:15:15.995222 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:17 crc kubenswrapper[4841]: I0313 09:15:17.994339 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:17 crc kubenswrapper[4841]: I0313 09:15:17.994425 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:17 crc kubenswrapper[4841]: I0313 09:15:17.994438 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:17 crc kubenswrapper[4841]: I0313 09:15:17.994453 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:17 crc kubenswrapper[4841]: E0313 09:15:17.995726 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:17 crc kubenswrapper[4841]: E0313 09:15:17.995850 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:17 crc kubenswrapper[4841]: E0313 09:15:17.995955 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:17 crc kubenswrapper[4841]: E0313 09:15:17.996050 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:18 crc kubenswrapper[4841]: E0313 09:15:18.115183 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.121191 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.121239 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.121256 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.121305 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.121323 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T09:15:18Z","lastTransitionTime":"2026-03-13T09:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.185075 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-m424p"] Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.185775 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m424p" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.189347 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.189373 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.189658 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.190541 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.295969 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/75d62dc5-9f82-4f26-95fb-03be0f449776-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m424p\" (UID: \"75d62dc5-9f82-4f26-95fb-03be0f449776\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m424p" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.296067 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/75d62dc5-9f82-4f26-95fb-03be0f449776-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m424p\" (UID: \"75d62dc5-9f82-4f26-95fb-03be0f449776\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m424p" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.296122 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/75d62dc5-9f82-4f26-95fb-03be0f449776-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m424p\" (UID: \"75d62dc5-9f82-4f26-95fb-03be0f449776\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m424p" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.296337 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/75d62dc5-9f82-4f26-95fb-03be0f449776-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m424p\" (UID: \"75d62dc5-9f82-4f26-95fb-03be0f449776\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m424p" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.296430 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75d62dc5-9f82-4f26-95fb-03be0f449776-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m424p\" (UID: \"75d62dc5-9f82-4f26-95fb-03be0f449776\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m424p" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.397644 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/75d62dc5-9f82-4f26-95fb-03be0f449776-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m424p\" (UID: \"75d62dc5-9f82-4f26-95fb-03be0f449776\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m424p" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.397725 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/75d62dc5-9f82-4f26-95fb-03be0f449776-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m424p\" (UID: \"75d62dc5-9f82-4f26-95fb-03be0f449776\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m424p" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.397760 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75d62dc5-9f82-4f26-95fb-03be0f449776-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m424p\" (UID: \"75d62dc5-9f82-4f26-95fb-03be0f449776\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m424p" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.397819 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/75d62dc5-9f82-4f26-95fb-03be0f449776-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m424p\" (UID: \"75d62dc5-9f82-4f26-95fb-03be0f449776\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m424p" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.397876 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75d62dc5-9f82-4f26-95fb-03be0f449776-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m424p\" (UID: \"75d62dc5-9f82-4f26-95fb-03be0f449776\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m424p" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.397914 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/75d62dc5-9f82-4f26-95fb-03be0f449776-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m424p\" (UID: \"75d62dc5-9f82-4f26-95fb-03be0f449776\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m424p" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.398057 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/75d62dc5-9f82-4f26-95fb-03be0f449776-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m424p\" (UID: \"75d62dc5-9f82-4f26-95fb-03be0f449776\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m424p" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.400981 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/75d62dc5-9f82-4f26-95fb-03be0f449776-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m424p\" (UID: \"75d62dc5-9f82-4f26-95fb-03be0f449776\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m424p" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.407345 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75d62dc5-9f82-4f26-95fb-03be0f449776-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m424p\" (UID: \"75d62dc5-9f82-4f26-95fb-03be0f449776\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m424p" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.430860 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75d62dc5-9f82-4f26-95fb-03be0f449776-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m424p\" (UID: \"75d62dc5-9f82-4f26-95fb-03be0f449776\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m424p" Mar 13 09:15:18 crc kubenswrapper[4841]: I0313 09:15:18.514178 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m424p" Mar 13 09:15:19 crc kubenswrapper[4841]: I0313 09:15:19.063098 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 13 09:15:19 crc kubenswrapper[4841]: I0313 09:15:19.072650 4841 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 13 09:15:19 crc kubenswrapper[4841]: I0313 09:15:19.493921 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m424p" event={"ID":"75d62dc5-9f82-4f26-95fb-03be0f449776","Type":"ContainerStarted","Data":"143584a8750e16f16a614ffdc9348a2fd14bc6fe669e3fa322074afd8a6d14b9"} Mar 13 09:15:19 crc kubenswrapper[4841]: I0313 09:15:19.493980 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m424p" event={"ID":"75d62dc5-9f82-4f26-95fb-03be0f449776","Type":"ContainerStarted","Data":"40ecbb1d258ca13f45f132a4ab5906deb7b6ef43e49a349bd1d8760f7e9297d9"} Mar 13 09:15:19 crc kubenswrapper[4841]: I0313 09:15:19.513825 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m424p" podStartSLOduration=132.513797681 podStartE2EDuration="2m12.513797681s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:15:19.513039317 +0000 UTC m=+202.242939548" watchObservedRunningTime="2026-03-13 09:15:19.513797681 +0000 UTC m=+202.243697912" Mar 13 09:15:19 crc kubenswrapper[4841]: I0313 09:15:19.994820 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:19 crc kubenswrapper[4841]: I0313 09:15:19.994964 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:19 crc kubenswrapper[4841]: E0313 09:15:19.995005 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:19 crc kubenswrapper[4841]: I0313 09:15:19.995048 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:19 crc kubenswrapper[4841]: E0313 09:15:19.995228 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:19 crc kubenswrapper[4841]: E0313 09:15:19.995450 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:19 crc kubenswrapper[4841]: I0313 09:15:19.995618 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:19 crc kubenswrapper[4841]: E0313 09:15:19.995872 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:21 crc kubenswrapper[4841]: I0313 09:15:21.994988 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:21 crc kubenswrapper[4841]: I0313 09:15:21.995049 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:21 crc kubenswrapper[4841]: I0313 09:15:21.995119 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:21 crc kubenswrapper[4841]: I0313 09:15:21.995170 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:21 crc kubenswrapper[4841]: E0313 09:15:21.995184 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:21 crc kubenswrapper[4841]: E0313 09:15:21.995402 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:21 crc kubenswrapper[4841]: E0313 09:15:21.995561 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:21 crc kubenswrapper[4841]: E0313 09:15:21.995751 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:23 crc kubenswrapper[4841]: E0313 09:15:23.116575 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 09:15:23 crc kubenswrapper[4841]: I0313 09:15:23.995093 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:23 crc kubenswrapper[4841]: I0313 09:15:23.995176 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:23 crc kubenswrapper[4841]: I0313 09:15:23.995114 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:23 crc kubenswrapper[4841]: I0313 09:15:23.995099 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:23 crc kubenswrapper[4841]: E0313 09:15:23.995354 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:23 crc kubenswrapper[4841]: E0313 09:15:23.995453 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:23 crc kubenswrapper[4841]: E0313 09:15:23.995574 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:23 crc kubenswrapper[4841]: E0313 09:15:23.995808 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:25 crc kubenswrapper[4841]: I0313 09:15:25.995012 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:25 crc kubenswrapper[4841]: I0313 09:15:25.995059 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:25 crc kubenswrapper[4841]: I0313 09:15:25.995028 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:25 crc kubenswrapper[4841]: I0313 09:15:25.995008 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:25 crc kubenswrapper[4841]: E0313 09:15:25.995220 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:25 crc kubenswrapper[4841]: E0313 09:15:25.995288 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:25 crc kubenswrapper[4841]: E0313 09:15:25.995500 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:25 crc kubenswrapper[4841]: E0313 09:15:25.996489 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:25 crc kubenswrapper[4841]: I0313 09:15:25.997335 4841 scope.go:117] "RemoveContainer" containerID="46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf" Mar 13 09:15:25 crc kubenswrapper[4841]: E0313 09:15:25.997794 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j5szf_openshift-ovn-kubernetes(db500a1d-2be8-49c1-9c9e-af7623d16b15)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" Mar 13 09:15:27 crc kubenswrapper[4841]: I0313 09:15:27.994187 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:27 crc kubenswrapper[4841]: I0313 09:15:27.994218 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:27 crc kubenswrapper[4841]: I0313 09:15:27.994320 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:27 crc kubenswrapper[4841]: I0313 09:15:27.994256 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:27 crc kubenswrapper[4841]: E0313 09:15:27.994501 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:27 crc kubenswrapper[4841]: E0313 09:15:27.997428 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:27 crc kubenswrapper[4841]: E0313 09:15:27.997530 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:27 crc kubenswrapper[4841]: E0313 09:15:27.997639 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:28 crc kubenswrapper[4841]: E0313 09:15:28.117597 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 09:15:29 crc kubenswrapper[4841]: I0313 09:15:29.994064 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:29 crc kubenswrapper[4841]: I0313 09:15:29.994129 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:29 crc kubenswrapper[4841]: I0313 09:15:29.994085 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:29 crc kubenswrapper[4841]: E0313 09:15:29.994188 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:29 crc kubenswrapper[4841]: I0313 09:15:29.994062 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:29 crc kubenswrapper[4841]: E0313 09:15:29.994383 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:29 crc kubenswrapper[4841]: E0313 09:15:29.994470 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:29 crc kubenswrapper[4841]: E0313 09:15:29.994572 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:30 crc kubenswrapper[4841]: I0313 09:15:30.276588 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:15:31 crc kubenswrapper[4841]: I0313 09:15:31.994128 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:31 crc kubenswrapper[4841]: E0313 09:15:31.994317 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:31 crc kubenswrapper[4841]: I0313 09:15:31.994580 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:31 crc kubenswrapper[4841]: I0313 09:15:31.994633 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:31 crc kubenswrapper[4841]: I0313 09:15:31.994684 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:31 crc kubenswrapper[4841]: E0313 09:15:31.994768 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:31 crc kubenswrapper[4841]: E0313 09:15:31.994900 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:31 crc kubenswrapper[4841]: E0313 09:15:31.995010 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:33 crc kubenswrapper[4841]: E0313 09:15:33.119398 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 13 09:15:33 crc kubenswrapper[4841]: I0313 09:15:33.994659 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:33 crc kubenswrapper[4841]: E0313 09:15:33.994827 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:33 crc kubenswrapper[4841]: I0313 09:15:33.995017 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:33 crc kubenswrapper[4841]: E0313 09:15:33.995104 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:33 crc kubenswrapper[4841]: I0313 09:15:33.995149 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:33 crc kubenswrapper[4841]: E0313 09:15:33.995224 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:33 crc kubenswrapper[4841]: I0313 09:15:33.995299 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:33 crc kubenswrapper[4841]: E0313 09:15:33.995387 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:35 crc kubenswrapper[4841]: I0313 09:15:35.994634 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:35 crc kubenswrapper[4841]: I0313 09:15:35.994725 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:35 crc kubenswrapper[4841]: E0313 09:15:35.994879 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:35 crc kubenswrapper[4841]: I0313 09:15:35.994946 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:35 crc kubenswrapper[4841]: E0313 09:15:35.995103 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:35 crc kubenswrapper[4841]: I0313 09:15:35.995180 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:35 crc kubenswrapper[4841]: E0313 09:15:35.995307 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:35 crc kubenswrapper[4841]: E0313 09:15:35.995404 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:37 crc kubenswrapper[4841]: I0313 09:15:37.995614 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:37 crc kubenswrapper[4841]: I0313 09:15:37.995701 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:37 crc kubenswrapper[4841]: I0313 09:15:37.995766 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:37 crc kubenswrapper[4841]: I0313 09:15:37.995817 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:37 crc kubenswrapper[4841]: E0313 09:15:37.996526 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:37 crc kubenswrapper[4841]: E0313 09:15:37.996664 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:37 crc kubenswrapper[4841]: E0313 09:15:37.996827 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:37 crc kubenswrapper[4841]: E0313 09:15:37.997053 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:38 crc kubenswrapper[4841]: E0313 09:15:38.119849 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 09:15:38 crc kubenswrapper[4841]: I0313 09:15:38.568499 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qkpgl_5978189d-b3a2-408c-b09e-c2b3de0a91b0/kube-multus/1.log" Mar 13 09:15:38 crc kubenswrapper[4841]: I0313 09:15:38.569047 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qkpgl_5978189d-b3a2-408c-b09e-c2b3de0a91b0/kube-multus/0.log" Mar 13 09:15:38 crc kubenswrapper[4841]: I0313 09:15:38.569104 4841 generic.go:334] "Generic (PLEG): container finished" podID="5978189d-b3a2-408c-b09e-c2b3de0a91b0" containerID="1c399741919b7f6d151285e7dc51f48764336f17ec82afe8c58cb97bcbf9e4d9" exitCode=1 Mar 13 09:15:38 crc kubenswrapper[4841]: I0313 09:15:38.569140 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qkpgl" event={"ID":"5978189d-b3a2-408c-b09e-c2b3de0a91b0","Type":"ContainerDied","Data":"1c399741919b7f6d151285e7dc51f48764336f17ec82afe8c58cb97bcbf9e4d9"} Mar 13 09:15:38 crc kubenswrapper[4841]: I0313 09:15:38.569390 4841 scope.go:117] 
"RemoveContainer" containerID="efa94f60f2675f2f9c870db61410aa5d2e35a92f9a603aa9aea80fcbd93cb8f7" Mar 13 09:15:38 crc kubenswrapper[4841]: I0313 09:15:38.569889 4841 scope.go:117] "RemoveContainer" containerID="1c399741919b7f6d151285e7dc51f48764336f17ec82afe8c58cb97bcbf9e4d9" Mar 13 09:15:38 crc kubenswrapper[4841]: E0313 09:15:38.570079 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-qkpgl_openshift-multus(5978189d-b3a2-408c-b09e-c2b3de0a91b0)\"" pod="openshift-multus/multus-qkpgl" podUID="5978189d-b3a2-408c-b09e-c2b3de0a91b0" Mar 13 09:15:39 crc kubenswrapper[4841]: I0313 09:15:39.576342 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qkpgl_5978189d-b3a2-408c-b09e-c2b3de0a91b0/kube-multus/1.log" Mar 13 09:15:39 crc kubenswrapper[4841]: I0313 09:15:39.994680 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:39 crc kubenswrapper[4841]: I0313 09:15:39.994727 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:39 crc kubenswrapper[4841]: I0313 09:15:39.995029 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:39 crc kubenswrapper[4841]: E0313 09:15:39.994993 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:39 crc kubenswrapper[4841]: I0313 09:15:39.995143 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:39 crc kubenswrapper[4841]: E0313 09:15:39.995573 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:39 crc kubenswrapper[4841]: E0313 09:15:39.995798 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:39 crc kubenswrapper[4841]: I0313 09:15:39.995833 4841 scope.go:117] "RemoveContainer" containerID="46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf" Mar 13 09:15:39 crc kubenswrapper[4841]: E0313 09:15:39.995902 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:40 crc kubenswrapper[4841]: I0313 09:15:40.582708 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5szf_db500a1d-2be8-49c1-9c9e-af7623d16b15/ovnkube-controller/3.log" Mar 13 09:15:40 crc kubenswrapper[4841]: I0313 09:15:40.586073 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerStarted","Data":"7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a"} Mar 13 09:15:40 crc kubenswrapper[4841]: I0313 09:15:40.586511 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:15:40 crc kubenswrapper[4841]: I0313 09:15:40.610935 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" podStartSLOduration=153.61092033 podStartE2EDuration="2m33.61092033s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:15:40.610358322 +0000 UTC m=+223.340258523" watchObservedRunningTime="2026-03-13 09:15:40.61092033 +0000 UTC m=+223.340820521" Mar 13 09:15:40 crc kubenswrapper[4841]: I0313 09:15:40.995558 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5t7sg"] Mar 13 09:15:40 crc kubenswrapper[4841]: I0313 09:15:40.995663 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:40 crc kubenswrapper[4841]: E0313 09:15:40.995741 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:41 crc kubenswrapper[4841]: I0313 09:15:41.995117 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:41 crc kubenswrapper[4841]: I0313 09:15:41.995151 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:41 crc kubenswrapper[4841]: I0313 09:15:41.995330 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:41 crc kubenswrapper[4841]: E0313 09:15:41.995453 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:41 crc kubenswrapper[4841]: E0313 09:15:41.995967 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:41 crc kubenswrapper[4841]: E0313 09:15:41.996208 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:42 crc kubenswrapper[4841]: I0313 09:15:42.995017 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:42 crc kubenswrapper[4841]: E0313 09:15:42.995758 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:43 crc kubenswrapper[4841]: E0313 09:15:43.121311 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 09:15:43 crc kubenswrapper[4841]: I0313 09:15:43.994736 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:43 crc kubenswrapper[4841]: I0313 09:15:43.994822 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:43 crc kubenswrapper[4841]: E0313 09:15:43.994923 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:43 crc kubenswrapper[4841]: I0313 09:15:43.995031 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:43 crc kubenswrapper[4841]: E0313 09:15:43.995224 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:43 crc kubenswrapper[4841]: E0313 09:15:43.995367 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:44 crc kubenswrapper[4841]: I0313 09:15:44.994880 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:44 crc kubenswrapper[4841]: E0313 09:15:44.995069 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:45 crc kubenswrapper[4841]: I0313 09:15:45.994791 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:45 crc kubenswrapper[4841]: I0313 09:15:45.994796 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:45 crc kubenswrapper[4841]: E0313 09:15:45.995012 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:45 crc kubenswrapper[4841]: E0313 09:15:45.995184 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:45 crc kubenswrapper[4841]: I0313 09:15:45.994830 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:45 crc kubenswrapper[4841]: E0313 09:15:45.995441 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:46 crc kubenswrapper[4841]: I0313 09:15:46.994645 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:46 crc kubenswrapper[4841]: E0313 09:15:46.994791 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:47 crc kubenswrapper[4841]: I0313 09:15:47.994790 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:47 crc kubenswrapper[4841]: E0313 09:15:47.997443 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:47 crc kubenswrapper[4841]: I0313 09:15:47.997478 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:47 crc kubenswrapper[4841]: I0313 09:15:47.997565 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:15:47 crc kubenswrapper[4841]: E0313 09:15:47.997630 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 09:15:47 crc kubenswrapper[4841]: E0313 09:15:47.997814 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 09:15:48 crc kubenswrapper[4841]: E0313 09:15:48.122018 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 09:15:48 crc kubenswrapper[4841]: I0313 09:15:48.994022 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:15:48 crc kubenswrapper[4841]: E0313 09:15:48.994424 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1" Mar 13 09:15:49 crc kubenswrapper[4841]: I0313 09:15:49.994384 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:15:49 crc kubenswrapper[4841]: I0313 09:15:49.994792 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:15:49 crc kubenswrapper[4841]: E0313 09:15:49.994951 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 09:15:49 crc kubenswrapper[4841]: E0313 09:15:49.995092 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 09:15:49 crc kubenswrapper[4841]: I0313 09:15:49.995578 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 09:15:49 crc kubenswrapper[4841]: E0313 09:15:49.995742 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 09:15:49 crc kubenswrapper[4841]: I0313 09:15:49.995880 4841 scope.go:117] "RemoveContainer" containerID="1c399741919b7f6d151285e7dc51f48764336f17ec82afe8c58cb97bcbf9e4d9"
Mar 13 09:15:50 crc kubenswrapper[4841]: I0313 09:15:50.624744 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qkpgl_5978189d-b3a2-408c-b09e-c2b3de0a91b0/kube-multus/1.log"
Mar 13 09:15:50 crc kubenswrapper[4841]: I0313 09:15:50.625214 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qkpgl" event={"ID":"5978189d-b3a2-408c-b09e-c2b3de0a91b0","Type":"ContainerStarted","Data":"ad9a321f2ded40f220538e5e218457fdd4cf438c48ad250ba23f7419284d7970"}
Mar 13 09:15:50 crc kubenswrapper[4841]: I0313 09:15:50.994900 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg"
Mar 13 09:15:50 crc kubenswrapper[4841]: E0313 09:15:50.995042 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1"
Mar 13 09:15:51 crc kubenswrapper[4841]: I0313 09:15:51.994395 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 09:15:51 crc kubenswrapper[4841]: I0313 09:15:51.994462 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 09:15:51 crc kubenswrapper[4841]: E0313 09:15:51.994592 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 09:15:51 crc kubenswrapper[4841]: I0313 09:15:51.994633 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 09:15:51 crc kubenswrapper[4841]: E0313 09:15:51.994724 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 09:15:51 crc kubenswrapper[4841]: E0313 09:15:51.994895 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 09:15:52 crc kubenswrapper[4841]: I0313 09:15:52.994899 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg"
Mar 13 09:15:52 crc kubenswrapper[4841]: E0313 09:15:52.995086 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5t7sg" podUID="ea07a392-2a1f-4bae-bb67-db7cd421c1e1"
Mar 13 09:15:53 crc kubenswrapper[4841]: I0313 09:15:53.994539 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 09:15:53 crc kubenswrapper[4841]: I0313 09:15:53.994552 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 09:15:53 crc kubenswrapper[4841]: I0313 09:15:53.994737 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 09:15:53 crc kubenswrapper[4841]: I0313 09:15:53.997092 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 13 09:15:53 crc kubenswrapper[4841]: I0313 09:15:53.997388 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 13 09:15:53 crc kubenswrapper[4841]: I0313 09:15:53.998293 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 13 09:15:53 crc kubenswrapper[4841]: I0313 09:15:53.998525 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 13 09:15:54 crc kubenswrapper[4841]: I0313 09:15:54.993973 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg"
Mar 13 09:15:54 crc kubenswrapper[4841]: I0313 09:15:54.997366 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 13 09:15:54 crc kubenswrapper[4841]: I0313 09:15:54.997928 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 13 09:15:58 crc kubenswrapper[4841]: I0313 09:15:58.996206 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.045786 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.046224 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.050726 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.050983 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2sr9r"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.051489 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2sr9r"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.052696 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nl9wf"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.053705 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-nl9wf"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.054010 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.054771 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.055170 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.055637 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.057607 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.057969 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.061354 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-86p7z"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.061968 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.062458 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.062975 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.064583 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qtpx7"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.064899 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qtpx7"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.066554 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wxvhj"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.067536 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wxvhj"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.069714 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zsbmr"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.070625 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.071600 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.071994 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.075297 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-6bfpp"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.076058 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bfpp"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.080430 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.081121 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.081699 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.082776 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.083682 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.083932 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.084164 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.084531 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.084749 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.085021 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.085245 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.085558 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-69l88"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.086041 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.086094 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.086131 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.086392 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-69l88"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.086600 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.086973 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.087571 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.091576 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.091882 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.092344 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.092590 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.092819 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.092957 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.093054 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.093088 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.093416 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.093457 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.093484 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.093568 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.093615 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.093719 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.093863 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.093998 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.094114 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.094376 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.094783 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-v6vhc"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.097109 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-v6vhc"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.104550 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vjcp6"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.106430 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qlg79"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.106552 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.107154 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qlg79"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.107370 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vjcp6"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.135973 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.136207 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.136281 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.136371 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.136506 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.136641 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.136824 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.136891 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.136960 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.137039 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.137078 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.137175 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.137200 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.137325 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.137393 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.137328 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.149673 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.151467 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.151782 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.151952 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.155200 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hslbs"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.159458 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162483 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-audit-policies\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162516 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162534 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-etcd-client\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162551 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa351e74-995f-4273-b0c6-ed8f57932d7b-config\") pod \"machine-approver-56656f9798-6bfpp\" (UID: \"fa351e74-995f-4273-b0c6-ed8f57932d7b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bfpp"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162566 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53f88379-f3fd-4035-889f-2ff40f5bd9cc-service-ca-bundle\") pod \"authentication-operator-69f744f599-wxvhj\" (UID: \"53f88379-f3fd-4035-889f-2ff40f5bd9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxvhj"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162581 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-serving-cert\") pod \"route-controller-manager-6576b87f9c-zw4tg\" (UID: \"2faa4f36-0851-4da7-bd9a-dd0e50daf0de\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162598 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d40b190-4e17-46d9-85f7-f4062ea2fc47-client-ca\") pod \"controller-manager-879f6c89f-86p7z\" (UID: \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\") " pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162612 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162627 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qzkh\" (UniqueName: \"kubernetes.io/projected/3afaa466-07d9-4168-a4bc-4af2c328f8fe-kube-api-access-6qzkh\") pod \"console-operator-58897d9998-2sr9r\" (UID: \"3afaa466-07d9-4168-a4bc-4af2c328f8fe\") " pod="openshift-console-operator/console-operator-58897d9998-2sr9r"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162650 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162667 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv7jv\" (UniqueName: \"kubernetes.io/projected/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-kube-api-access-zv7jv\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162682 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162698 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-encryption-config\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162713 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d40b190-4e17-46d9-85f7-f4062ea2fc47-config\") pod \"controller-manager-879f6c89f-86p7z\" (UID: \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\") " pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162728 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-node-pullsecrets\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162744 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3afaa466-07d9-4168-a4bc-4af2c328f8fe-trusted-ca\") pod \"console-operator-58897d9998-2sr9r\" (UID: \"3afaa466-07d9-4168-a4bc-4af2c328f8fe\") " pod="openshift-console-operator/console-operator-58897d9998-2sr9r"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162767 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162782 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162796 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-audit-policies\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162811 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvj6z\" (UniqueName: \"kubernetes.io/projected/63754623-7341-47a8-8e37-e069a35cacd4-kube-api-access-mvj6z\") pod \"openshift-controller-manager-operator-756b6f6bc6-qtpx7\" (UID: \"63754623-7341-47a8-8e37-e069a35cacd4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qtpx7"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162825 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-config\") pod \"route-controller-manager-6576b87f9c-zw4tg\" (UID: \"2faa4f36-0851-4da7-bd9a-dd0e50daf0de\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162839 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5knd\" (UniqueName: \"kubernetes.io/projected/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-kube-api-access-n5knd\") pod \"route-controller-manager-6576b87f9c-zw4tg\" (UID: \"2faa4f36-0851-4da7-bd9a-dd0e50daf0de\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162853 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3afaa466-07d9-4168-a4bc-4af2c328f8fe-serving-cert\") pod \"console-operator-58897d9998-2sr9r\" (UID: \"3afaa466-07d9-4168-a4bc-4af2c328f8fe\") " pod="openshift-console-operator/console-operator-58897d9998-2sr9r"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162868 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-etcd-client\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162882 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fa351e74-995f-4273-b0c6-ed8f57932d7b-auth-proxy-config\") pod \"machine-approver-56656f9798-6bfpp\" (UID: \"fa351e74-995f-4273-b0c6-ed8f57932d7b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bfpp"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162896 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24dm2\" (UniqueName: \"kubernetes.io/projected/4d40b190-4e17-46d9-85f7-f4062ea2fc47-kube-api-access-24dm2\") pod \"controller-manager-879f6c89f-86p7z\" (UID: \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\") " pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162911 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-etcd-serving-ca\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162924 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162940 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-encryption-config\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162954 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nm6w\" (UniqueName: \"kubernetes.io/projected/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-kube-api-access-5nm6w\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162969 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.162985 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4d40b190-4e17-46d9-85f7-f4062ea2fc47-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-86p7z\" (UID: \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\") " pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163000 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63754623-7341-47a8-8e37-e069a35cacd4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qtpx7\" (UID: \"63754623-7341-47a8-8e37-e069a35cacd4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qtpx7"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163014 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53f88379-f3fd-4035-889f-2ff40f5bd9cc-serving-cert\") pod \"authentication-operator-69f744f599-wxvhj\" (UID: \"53f88379-f3fd-4035-889f-2ff40f5bd9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxvhj"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163027 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-serving-cert\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163041 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-audit-dir\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163068 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/293f3a0e-401d-440f-8321-0aac18b90219-audit-dir\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163083 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\"
(UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163125 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5wnj\" (UniqueName: \"kubernetes.io/projected/53f88379-f3fd-4035-889f-2ff40f5bd9cc-kube-api-access-j5wnj\") pod \"authentication-operator-69f744f599-wxvhj\" (UID: \"53f88379-f3fd-4035-889f-2ff40f5bd9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxvhj" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163142 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-serving-cert\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163155 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-audit\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163169 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3afaa466-07d9-4168-a4bc-4af2c328f8fe-config\") pod \"console-operator-58897d9998-2sr9r\" (UID: \"3afaa466-07d9-4168-a4bc-4af2c328f8fe\") " pod="openshift-console-operator/console-operator-58897d9998-2sr9r" Mar 13 09:15:59 crc 
kubenswrapper[4841]: I0313 09:15:59.163183 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163197 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163211 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fa351e74-995f-4273-b0c6-ed8f57932d7b-machine-approver-tls\") pod \"machine-approver-56656f9798-6bfpp\" (UID: \"fa351e74-995f-4273-b0c6-ed8f57932d7b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bfpp" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163226 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63754623-7341-47a8-8e37-e069a35cacd4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qtpx7\" (UID: \"63754623-7341-47a8-8e37-e069a35cacd4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qtpx7" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163245 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-audit-dir\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163276 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163292 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-client-ca\") pod \"route-controller-manager-6576b87f9c-zw4tg\" (UID: \"2faa4f36-0851-4da7-bd9a-dd0e50daf0de\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163306 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53f88379-f3fd-4035-889f-2ff40f5bd9cc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wxvhj\" (UID: \"53f88379-f3fd-4035-889f-2ff40f5bd9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxvhj" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163322 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-config\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " 
pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163335 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53f88379-f3fd-4035-889f-2ff40f5bd9cc-config\") pod \"authentication-operator-69f744f599-wxvhj\" (UID: \"53f88379-f3fd-4035-889f-2ff40f5bd9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxvhj" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163349 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d40b190-4e17-46d9-85f7-f4062ea2fc47-serving-cert\") pod \"controller-manager-879f6c89f-86p7z\" (UID: \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\") " pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163363 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163379 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkgdc\" (UniqueName: \"kubernetes.io/projected/293f3a0e-401d-440f-8321-0aac18b90219-kube-api-access-fkgdc\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163394 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-image-import-ca\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163410 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163432 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5w4j\" (UniqueName: \"kubernetes.io/projected/fa351e74-995f-4273-b0c6-ed8f57932d7b-kube-api-access-p5w4j\") pod \"machine-approver-56656f9798-6bfpp\" (UID: \"fa351e74-995f-4273-b0c6-ed8f57932d7b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bfpp" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163544 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-lrsts"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163837 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.163949 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mhstq"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.164058 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.164304 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rqcwd"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.164562 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-lrsts" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.164659 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vgm4n"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.165065 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mhstq" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.165071 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vgm4n" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.165427 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.165441 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rqcwd" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.165586 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.165655 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.165738 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.166023 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.166592 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.167584 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.169123 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.169521 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.169680 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.169833 4841 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"oauth-serving-cert" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.173193 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4rfp"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.174901 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.174938 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zdgw8"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.175081 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.175544 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.175716 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.175927 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-zdgw8" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.176392 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4rfp" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.176419 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.176503 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.176624 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.176782 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.177199 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.177973 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.178150 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.179241 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.179662 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.183664 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.184018 4841 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.184544 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.186159 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.186687 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.187148 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.187355 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.187526 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.187637 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.187760 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.187876 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.189125 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qcfvl"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.189908 4841 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.189976 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wh6kv"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.190431 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wh6kv" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.190594 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qcfvl" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.216348 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6t6zf"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.217216 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khk8q"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.217771 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khk8q" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.218043 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.218052 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.220686 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.226118 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ckw4c"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.227508 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6cdpl"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.228647 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ckw4c" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.228867 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cdpl" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.232709 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xxwmv"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.244191 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xxwmv" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.244953 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-8kvzk"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.245379 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-8kvzk" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.246527 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556554-2dgrd"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.246997 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556554-2dgrd" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.247830 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7t6x"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.248640 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7t6x" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.248919 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mpnlt"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.249301 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.250518 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.251534 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.251951 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bvfhj"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.252252 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.252381 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.252506 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bvfhj" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.252715 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6lggf"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.254282 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6lggf" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.254406 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rm4kk"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.255213 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rm4kk" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.256938 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.257383 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.260257 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sbpds"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.262593 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qtpx7"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.262667 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5rn52"] Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.262709 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sbpds" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.263255 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rn52" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264160 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d40b190-4e17-46d9-85f7-f4062ea2fc47-client-ca\") pod \"controller-manager-879f6c89f-86p7z\" (UID: \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\") " pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264213 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264247 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d322779-6593-48b0-b504-792c5ae3b7a6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rqcwd\" (UID: \"8d322779-6593-48b0-b504-792c5ae3b7a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rqcwd" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264294 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qzkh\" (UniqueName: \"kubernetes.io/projected/3afaa466-07d9-4168-a4bc-4af2c328f8fe-kube-api-access-6qzkh\") pod \"console-operator-58897d9998-2sr9r\" (UID: \"3afaa466-07d9-4168-a4bc-4af2c328f8fe\") " pod="openshift-console-operator/console-operator-58897d9998-2sr9r" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264322 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-5lstx\" (UniqueName: \"kubernetes.io/projected/2151dc45-23c0-44b3-b896-0915da9f9d59-kube-api-access-5lstx\") pod \"multus-admission-controller-857f4d67dd-zdgw8\" (UID: \"2151dc45-23c0-44b3-b896-0915da9f9d59\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zdgw8" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264350 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxgqk\" (UniqueName: \"kubernetes.io/projected/747a24f9-e654-421a-8da1-0be0aa6ccd9b-kube-api-access-gxgqk\") pod \"machine-api-operator-5694c8668f-qlg79\" (UID: \"747a24f9-e654-421a-8da1-0be0aa6ccd9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlg79" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264378 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264404 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhx8x\" (UniqueName: \"kubernetes.io/projected/f8c15ccf-dcff-4cb5-9bc3-02a20faafdfa-kube-api-access-jhx8x\") pod \"kube-storage-version-migrator-operator-b67b599dd-vgm4n\" (UID: \"f8c15ccf-dcff-4cb5-9bc3-02a20faafdfa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vgm4n" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264429 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/8d322779-6593-48b0-b504-792c5ae3b7a6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rqcwd\" (UID: \"8d322779-6593-48b0-b504-792c5ae3b7a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rqcwd" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264457 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv7jv\" (UniqueName: \"kubernetes.io/projected/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-kube-api-access-zv7jv\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264482 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e98c46b0-e56f-4a29-a194-a39fc2401cfa-etcd-ca\") pod \"etcd-operator-b45778765-hslbs\" (UID: \"e98c46b0-e56f-4a29-a194-a39fc2401cfa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264506 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f83ab80b-b5bd-4274-99b9-b364923326bf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-q4rfp\" (UID: \"f83ab80b-b5bd-4274-99b9-b364923326bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4rfp" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264531 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264556 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d40b190-4e17-46d9-85f7-f4062ea2fc47-config\") pod \"controller-manager-879f6c89f-86p7z\" (UID: \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\") " pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264580 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-node-pullsecrets\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264606 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-encryption-config\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264629 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f83ab80b-b5bd-4274-99b9-b364923326bf-config\") pod \"kube-apiserver-operator-766d6c64bb-q4rfp\" (UID: \"f83ab80b-b5bd-4274-99b9-b364923326bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4rfp" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264661 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3afaa466-07d9-4168-a4bc-4af2c328f8fe-trusted-ca\") pod 
\"console-operator-58897d9998-2sr9r\" (UID: \"3afaa466-07d9-4168-a4bc-4af2c328f8fe\") " pod="openshift-console-operator/console-operator-58897d9998-2sr9r" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264699 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e98c46b0-e56f-4a29-a194-a39fc2401cfa-config\") pod \"etcd-operator-b45778765-hslbs\" (UID: \"e98c46b0-e56f-4a29-a194-a39fc2401cfa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264727 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a38e6005-63a6-4733-a269-f355c648fed4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-69l88\" (UID: \"a38e6005-63a6-4733-a269-f355c648fed4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-69l88" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264754 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264783 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvj6z\" (UniqueName: \"kubernetes.io/projected/63754623-7341-47a8-8e37-e069a35cacd4-kube-api-access-mvj6z\") pod \"openshift-controller-manager-operator-756b6f6bc6-qtpx7\" (UID: \"63754623-7341-47a8-8e37-e069a35cacd4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qtpx7" Mar 13 09:15:59 crc kubenswrapper[4841]: 
I0313 09:15:59.264808 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264831 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-audit-policies\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264859 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-console-config\") pod \"console-f9d7485db-v6vhc\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264886 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-config\") pod \"route-controller-manager-6576b87f9c-zw4tg\" (UID: \"2faa4f36-0851-4da7-bd9a-dd0e50daf0de\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264912 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a38e6005-63a6-4733-a269-f355c648fed4-serving-cert\") pod \"openshift-config-operator-7777fb866f-69l88\" (UID: \"a38e6005-63a6-4733-a269-f355c648fed4\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-69l88" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264938 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3afaa466-07d9-4168-a4bc-4af2c328f8fe-serving-cert\") pod \"console-operator-58897d9998-2sr9r\" (UID: \"3afaa466-07d9-4168-a4bc-4af2c328f8fe\") " pod="openshift-console-operator/console-operator-58897d9998-2sr9r" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264960 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-service-ca\") pod \"console-f9d7485db-v6vhc\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.264984 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m29zc\" (UniqueName: \"kubernetes.io/projected/60193856-8a3f-4ce0-b79d-44e58de19b06-kube-api-access-m29zc\") pod \"console-f9d7485db-v6vhc\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265011 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5knd\" (UniqueName: \"kubernetes.io/projected/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-kube-api-access-n5knd\") pod \"route-controller-manager-6576b87f9c-zw4tg\" (UID: \"2faa4f36-0851-4da7-bd9a-dd0e50daf0de\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265034 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wxvhj"] Mar 13 09:15:59 crc kubenswrapper[4841]: 
I0313 09:15:59.265036 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-etcd-client\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265104 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fa351e74-995f-4273-b0c6-ed8f57932d7b-auth-proxy-config\") pod \"machine-approver-56656f9798-6bfpp\" (UID: \"fa351e74-995f-4273-b0c6-ed8f57932d7b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bfpp" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265128 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2151dc45-23c0-44b3-b896-0915da9f9d59-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zdgw8\" (UID: \"2151dc45-23c0-44b3-b896-0915da9f9d59\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zdgw8" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265148 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e98c46b0-e56f-4a29-a194-a39fc2401cfa-serving-cert\") pod \"etcd-operator-b45778765-hslbs\" (UID: \"e98c46b0-e56f-4a29-a194-a39fc2401cfa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265164 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d27990e2-0de9-4cb5-a162-5949047f5d93-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mhstq\" (UID: 
\"d27990e2-0de9-4cb5-a162-5949047f5d93\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mhstq" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265183 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d322779-6593-48b0-b504-792c5ae3b7a6-config\") pod \"kube-controller-manager-operator-78b949d7b-rqcwd\" (UID: \"8d322779-6593-48b0-b504-792c5ae3b7a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rqcwd" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265202 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24dm2\" (UniqueName: \"kubernetes.io/projected/4d40b190-4e17-46d9-85f7-f4062ea2fc47-kube-api-access-24dm2\") pod \"controller-manager-879f6c89f-86p7z\" (UID: \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\") " pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265256 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-etcd-serving-ca\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265342 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265360 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/f8c15ccf-dcff-4cb5-9bc3-02a20faafdfa-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vgm4n\" (UID: \"f8c15ccf-dcff-4cb5-9bc3-02a20faafdfa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vgm4n" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265376 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d27990e2-0de9-4cb5-a162-5949047f5d93-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mhstq\" (UID: \"d27990e2-0de9-4cb5-a162-5949047f5d93\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mhstq" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265400 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-encryption-config\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265422 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265441 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nm6w\" (UniqueName: \"kubernetes.io/projected/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-kube-api-access-5nm6w\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" 
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265465 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4d40b190-4e17-46d9-85f7-f4062ea2fc47-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-86p7z\" (UID: \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\") " pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265485 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53f88379-f3fd-4035-889f-2ff40f5bd9cc-serving-cert\") pod \"authentication-operator-69f744f599-wxvhj\" (UID: \"53f88379-f3fd-4035-889f-2ff40f5bd9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxvhj" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265502 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e98c46b0-e56f-4a29-a194-a39fc2401cfa-etcd-client\") pod \"etcd-operator-b45778765-hslbs\" (UID: \"e98c46b0-e56f-4a29-a194-a39fc2401cfa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265519 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63754623-7341-47a8-8e37-e069a35cacd4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qtpx7\" (UID: \"63754623-7341-47a8-8e37-e069a35cacd4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qtpx7" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265539 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-serving-cert\") pod 
\"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265556 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-audit-dir\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265575 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9shz\" (UniqueName: \"kubernetes.io/projected/a38e6005-63a6-4733-a269-f355c648fed4-kube-api-access-q9shz\") pod \"openshift-config-operator-7777fb866f-69l88\" (UID: \"a38e6005-63a6-4733-a269-f355c648fed4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-69l88" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265617 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/293f3a0e-401d-440f-8321-0aac18b90219-audit-dir\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265632 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265651 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-j5wnj\" (UniqueName: \"kubernetes.io/projected/53f88379-f3fd-4035-889f-2ff40f5bd9cc-kube-api-access-j5wnj\") pod \"authentication-operator-69f744f599-wxvhj\" (UID: \"53f88379-f3fd-4035-889f-2ff40f5bd9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxvhj" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265668 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-serving-cert\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265684 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-audit\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265702 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3afaa466-07d9-4168-a4bc-4af2c328f8fe-config\") pod \"console-operator-58897d9998-2sr9r\" (UID: \"3afaa466-07d9-4168-a4bc-4af2c328f8fe\") " pod="openshift-console-operator/console-operator-58897d9998-2sr9r" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265719 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63754623-7341-47a8-8e37-e069a35cacd4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qtpx7\" (UID: \"63754623-7341-47a8-8e37-e069a35cacd4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qtpx7" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 
09:15:59.265737 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265754 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265768 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fa351e74-995f-4273-b0c6-ed8f57932d7b-machine-approver-tls\") pod \"machine-approver-56656f9798-6bfpp\" (UID: \"fa351e74-995f-4273-b0c6-ed8f57932d7b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bfpp" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265788 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-trusted-ca-bundle\") pod \"console-f9d7485db-v6vhc\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265803 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/747a24f9-e654-421a-8da1-0be0aa6ccd9b-images\") pod \"machine-api-operator-5694c8668f-qlg79\" 
(UID: \"747a24f9-e654-421a-8da1-0be0aa6ccd9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlg79" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265831 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-audit-dir\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265849 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7a0b7635-6423-4716-93b4-c8e9b3012e55-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vjcp6\" (UID: \"7a0b7635-6423-4716-93b4-c8e9b3012e55\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vjcp6" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265864 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66d67\" (UniqueName: \"kubernetes.io/projected/d27990e2-0de9-4cb5-a162-5949047f5d93-kube-api-access-66d67\") pod \"openshift-apiserver-operator-796bbdcf4f-mhstq\" (UID: \"d27990e2-0de9-4cb5-a162-5949047f5d93\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mhstq" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265888 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265905 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/60193856-8a3f-4ce0-b79d-44e58de19b06-console-oauth-config\") pod \"console-f9d7485db-v6vhc\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265924 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gt7n\" (UniqueName: \"kubernetes.io/projected/e98c46b0-e56f-4a29-a194-a39fc2401cfa-kube-api-access-9gt7n\") pod \"etcd-operator-b45778765-hslbs\" (UID: \"e98c46b0-e56f-4a29-a194-a39fc2401cfa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265942 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-client-ca\") pod \"route-controller-manager-6576b87f9c-zw4tg\" (UID: \"2faa4f36-0851-4da7-bd9a-dd0e50daf0de\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265958 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53f88379-f3fd-4035-889f-2ff40f5bd9cc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wxvhj\" (UID: \"53f88379-f3fd-4035-889f-2ff40f5bd9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxvhj" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265973 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/60193856-8a3f-4ce0-b79d-44e58de19b06-console-serving-cert\") pod \"console-f9d7485db-v6vhc\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " 
pod="openshift-console/console-f9d7485db-v6vhc"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.265990 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/747a24f9-e654-421a-8da1-0be0aa6ccd9b-config\") pod \"machine-api-operator-5694c8668f-qlg79\" (UID: \"747a24f9-e654-421a-8da1-0be0aa6ccd9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlg79"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266011 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-config\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266028 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53f88379-f3fd-4035-889f-2ff40f5bd9cc-config\") pod \"authentication-operator-69f744f599-wxvhj\" (UID: \"53f88379-f3fd-4035-889f-2ff40f5bd9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxvhj"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266044 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8c15ccf-dcff-4cb5-9bc3-02a20faafdfa-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vgm4n\" (UID: \"f8c15ccf-dcff-4cb5-9bc3-02a20faafdfa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vgm4n"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266063 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d40b190-4e17-46d9-85f7-f4062ea2fc47-serving-cert\") pod \"controller-manager-879f6c89f-86p7z\" (UID: \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\") " pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266078 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266093 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkgdc\" (UniqueName: \"kubernetes.io/projected/293f3a0e-401d-440f-8321-0aac18b90219-kube-api-access-fkgdc\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266109 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-image-import-ca\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266124 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5w4j\" (UniqueName: \"kubernetes.io/projected/fa351e74-995f-4273-b0c6-ed8f57932d7b-kube-api-access-p5w4j\") pod \"machine-approver-56656f9798-6bfpp\" (UID: \"fa351e74-995f-4273-b0c6-ed8f57932d7b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bfpp"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266139 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266155 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e98c46b0-e56f-4a29-a194-a39fc2401cfa-etcd-service-ca\") pod \"etcd-operator-b45778765-hslbs\" (UID: \"e98c46b0-e56f-4a29-a194-a39fc2401cfa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266170 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5dkc\" (UniqueName: \"kubernetes.io/projected/ede1a06d-6caf-41ae-acfa-2335821a2e0e-kube-api-access-z5dkc\") pod \"downloads-7954f5f757-lrsts\" (UID: \"ede1a06d-6caf-41ae-acfa-2335821a2e0e\") " pod="openshift-console/downloads-7954f5f757-lrsts"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266167 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fa351e74-995f-4273-b0c6-ed8f57932d7b-auth-proxy-config\") pod \"machine-approver-56656f9798-6bfpp\" (UID: \"fa351e74-995f-4273-b0c6-ed8f57932d7b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bfpp"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266185 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f83ab80b-b5bd-4274-99b9-b364923326bf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-q4rfp\" (UID: \"f83ab80b-b5bd-4274-99b9-b364923326bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4rfp"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266250 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-audit-policies\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266286 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266309 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa351e74-995f-4273-b0c6-ed8f57932d7b-config\") pod \"machine-approver-56656f9798-6bfpp\" (UID: \"fa351e74-995f-4273-b0c6-ed8f57932d7b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bfpp"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266326 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53f88379-f3fd-4035-889f-2ff40f5bd9cc-service-ca-bundle\") pod \"authentication-operator-69f744f599-wxvhj\" (UID: \"53f88379-f3fd-4035-889f-2ff40f5bd9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxvhj"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266346 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-oauth-serving-cert\") pod \"console-f9d7485db-v6vhc\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " pod="openshift-console/console-f9d7485db-v6vhc"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266365 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/747a24f9-e654-421a-8da1-0be0aa6ccd9b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qlg79\" (UID: \"747a24f9-e654-421a-8da1-0be0aa6ccd9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlg79"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266383 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4tl2\" (UniqueName: \"kubernetes.io/projected/7a0b7635-6423-4716-93b4-c8e9b3012e55-kube-api-access-s4tl2\") pod \"cluster-samples-operator-665b6dd947-vjcp6\" (UID: \"7a0b7635-6423-4716-93b4-c8e9b3012e55\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vjcp6"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266401 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-etcd-client\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266421 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-serving-cert\") pod \"route-controller-manager-6576b87f9c-zw4tg\" (UID: \"2faa4f36-0851-4da7-bd9a-dd0e50daf0de\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.266782 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-etcd-serving-ca\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.267211 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.267729 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-audit-policies\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.268064 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-config\") pod \"route-controller-manager-6576b87f9c-zw4tg\" (UID: \"2faa4f36-0851-4da7-bd9a-dd0e50daf0de\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.269528 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d40b190-4e17-46d9-85f7-f4062ea2fc47-client-ca\") pod \"controller-manager-879f6c89f-86p7z\" (UID: \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\") " pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.270065 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.270446 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3afaa466-07d9-4168-a4bc-4af2c328f8fe-trusted-ca\") pod \"console-operator-58897d9998-2sr9r\" (UID: \"3afaa466-07d9-4168-a4bc-4af2c328f8fe\") " pod="openshift-console-operator/console-operator-58897d9998-2sr9r"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.271224 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.271371 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-audit-policies\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.271743 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.271994 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.272037 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4d5n4"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.272310 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4d40b190-4e17-46d9-85f7-f4062ea2fc47-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-86p7z\" (UID: \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\") " pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.272451 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-encryption-config\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.272780 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3afaa466-07d9-4168-a4bc-4af2c328f8fe-serving-cert\") pod \"console-operator-58897d9998-2sr9r\" (UID: \"3afaa466-07d9-4168-a4bc-4af2c328f8fe\") " pod="openshift-console-operator/console-operator-58897d9998-2sr9r"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.272974 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.273172 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.273574 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa351e74-995f-4273-b0c6-ed8f57932d7b-config\") pod \"machine-approver-56656f9798-6bfpp\" (UID: \"fa351e74-995f-4273-b0c6-ed8f57932d7b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bfpp"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.273818 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-serving-cert\") pod \"route-controller-manager-6576b87f9c-zw4tg\" (UID: \"2faa4f36-0851-4da7-bd9a-dd0e50daf0de\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.274179 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53f88379-f3fd-4035-889f-2ff40f5bd9cc-service-ca-bundle\") pod \"authentication-operator-69f744f599-wxvhj\" (UID: \"53f88379-f3fd-4035-889f-2ff40f5bd9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxvhj"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.274487 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.274534 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-audit-dir\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.274965 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.275069 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d5n4"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.275096 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-client-ca\") pod \"route-controller-manager-6576b87f9c-zw4tg\" (UID: \"2faa4f36-0851-4da7-bd9a-dd0e50daf0de\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.275239 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63754623-7341-47a8-8e37-e069a35cacd4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qtpx7\" (UID: \"63754623-7341-47a8-8e37-e069a35cacd4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qtpx7"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.274966 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.275543 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2sr9r"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.275618 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-etcd-client\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.275760 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53f88379-f3fd-4035-889f-2ff40f5bd9cc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wxvhj\" (UID: \"53f88379-f3fd-4035-889f-2ff40f5bd9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxvhj"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.275850 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-node-pullsecrets\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.275910 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/293f3a0e-401d-440f-8321-0aac18b90219-audit-dir\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.276244 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-config\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.276730 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53f88379-f3fd-4035-889f-2ff40f5bd9cc-config\") pod \"authentication-operator-69f744f599-wxvhj\" (UID: \"53f88379-f3fd-4035-889f-2ff40f5bd9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxvhj"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.276986 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-m4fpx"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.277139 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-audit-dir\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.277418 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-audit\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.277455 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.278813 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3afaa466-07d9-4168-a4bc-4af2c328f8fe-config\") pod \"console-operator-58897d9998-2sr9r\" (UID: \"3afaa466-07d9-4168-a4bc-4af2c328f8fe\") " pod="openshift-console-operator/console-operator-58897d9998-2sr9r"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.278881 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m4fpx"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.279697 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nl9wf"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.280421 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d40b190-4e17-46d9-85f7-f4062ea2fc47-config\") pod \"controller-manager-879f6c89f-86p7z\" (UID: \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\") " pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.280979 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vjcp6"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.281187 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-image-import-ca\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.282228 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-86p7z"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.283381 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zsbmr"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.286244 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khk8q"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.286255 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-etcd-client\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.286374 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53f88379-f3fd-4035-889f-2ff40f5bd9cc-serving-cert\") pod \"authentication-operator-69f744f599-wxvhj\" (UID: \"53f88379-f3fd-4035-889f-2ff40f5bd9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxvhj"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.286454 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-serving-cert\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.286544 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-encryption-config\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.287034 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63754623-7341-47a8-8e37-e069a35cacd4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qtpx7\" (UID: \"63754623-7341-47a8-8e37-e069a35cacd4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qtpx7"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.287094 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d40b190-4e17-46d9-85f7-f4062ea2fc47-serving-cert\") pod \"controller-manager-879f6c89f-86p7z\" (UID: \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\") " pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.287493 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fa351e74-995f-4273-b0c6-ed8f57932d7b-machine-approver-tls\") pod \"machine-approver-56656f9798-6bfpp\" (UID: \"fa351e74-995f-4273-b0c6-ed8f57932d7b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bfpp"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.287721 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-serving-cert\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.287745 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.289440 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6cdpl"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.292336 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mhstq"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.292381 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-lrsts"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.293730 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qlg79"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.295918 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zdgw8"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.296399 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.297611 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-69l88"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.299230 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qcfvl"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.300728 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vgm4n"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.302237 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556554-2dgrd"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.303042 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.303432 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.303439 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.303638 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.305599 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.305880 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6t6zf"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.307882 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wh6kv"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.308667 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.310044 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-v6vhc"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.311658 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m4fpx"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.313284 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.314783 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4rfp"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.317235 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rqcwd"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.320354 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4d5n4"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.321981 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7t6x"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.326672 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ckw4c"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.329050 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.329299 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mpnlt"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.332030 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hslbs"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.334406 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xxwmv"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.335910 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-bbjqp"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.337696 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-md725"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.338388 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bbjqp"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.339546 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.339613 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-md725"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.341298 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rm4kk"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.343018 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bbjqp"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.344622 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sbpds"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.346557 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-md725"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.348206 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.348454 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5rn52"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.349927 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6lggf"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.351488 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bvfhj"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.352973 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-dtk79"]
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.353757 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dtk79"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367087 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8c15ccf-dcff-4cb5-9bc3-02a20faafdfa-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vgm4n\" (UID: \"f8c15ccf-dcff-4cb5-9bc3-02a20faafdfa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vgm4n"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367208 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e98c46b0-e56f-4a29-a194-a39fc2401cfa-etcd-service-ca\") pod \"etcd-operator-b45778765-hslbs\" (UID: \"e98c46b0-e56f-4a29-a194-a39fc2401cfa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367334 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5dkc\" (UniqueName: \"kubernetes.io/projected/ede1a06d-6caf-41ae-acfa-2335821a2e0e-kube-api-access-z5dkc\") pod \"downloads-7954f5f757-lrsts\" (UID: \"ede1a06d-6caf-41ae-acfa-2335821a2e0e\") " pod="openshift-console/downloads-7954f5f757-lrsts"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367364 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f83ab80b-b5bd-4274-99b9-b364923326bf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-q4rfp\" (UID: \"f83ab80b-b5bd-4274-99b9-b364923326bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4rfp"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367406 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl7s7\" 
(UniqueName: \"kubernetes.io/projected/adeab6d7-21b6-4ef2-afdb-75854f0914c5-kube-api-access-wl7s7\") pod \"control-plane-machine-set-operator-78cbb6b69f-wh6kv\" (UID: \"adeab6d7-21b6-4ef2-afdb-75854f0914c5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wh6kv" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367433 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-oauth-serving-cert\") pod \"console-f9d7485db-v6vhc\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367459 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/747a24f9-e654-421a-8da1-0be0aa6ccd9b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qlg79\" (UID: \"747a24f9-e654-421a-8da1-0be0aa6ccd9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlg79" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367497 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4tl2\" (UniqueName: \"kubernetes.io/projected/7a0b7635-6423-4716-93b4-c8e9b3012e55-kube-api-access-s4tl2\") pod \"cluster-samples-operator-665b6dd947-vjcp6\" (UID: \"7a0b7635-6423-4716-93b4-c8e9b3012e55\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vjcp6" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367528 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/adeab6d7-21b6-4ef2-afdb-75854f0914c5-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wh6kv\" (UID: 
\"adeab6d7-21b6-4ef2-afdb-75854f0914c5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wh6kv" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367568 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d322779-6593-48b0-b504-792c5ae3b7a6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rqcwd\" (UID: \"8d322779-6593-48b0-b504-792c5ae3b7a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rqcwd" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367592 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxgqk\" (UniqueName: \"kubernetes.io/projected/747a24f9-e654-421a-8da1-0be0aa6ccd9b-kube-api-access-gxgqk\") pod \"machine-api-operator-5694c8668f-qlg79\" (UID: \"747a24f9-e654-421a-8da1-0be0aa6ccd9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlg79" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367629 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lstx\" (UniqueName: \"kubernetes.io/projected/2151dc45-23c0-44b3-b896-0915da9f9d59-kube-api-access-5lstx\") pod \"multus-admission-controller-857f4d67dd-zdgw8\" (UID: \"2151dc45-23c0-44b3-b896-0915da9f9d59\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zdgw8" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367653 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhx8x\" (UniqueName: \"kubernetes.io/projected/f8c15ccf-dcff-4cb5-9bc3-02a20faafdfa-kube-api-access-jhx8x\") pod \"kube-storage-version-migrator-operator-b67b599dd-vgm4n\" (UID: \"f8c15ccf-dcff-4cb5-9bc3-02a20faafdfa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vgm4n" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 
09:15:59.367677 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d322779-6593-48b0-b504-792c5ae3b7a6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rqcwd\" (UID: \"8d322779-6593-48b0-b504-792c5ae3b7a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rqcwd" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367708 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e98c46b0-e56f-4a29-a194-a39fc2401cfa-etcd-ca\") pod \"etcd-operator-b45778765-hslbs\" (UID: \"e98c46b0-e56f-4a29-a194-a39fc2401cfa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367732 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f83ab80b-b5bd-4274-99b9-b364923326bf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-q4rfp\" (UID: \"f83ab80b-b5bd-4274-99b9-b364923326bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4rfp" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367768 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f83ab80b-b5bd-4274-99b9-b364923326bf-config\") pod \"kube-apiserver-operator-766d6c64bb-q4rfp\" (UID: \"f83ab80b-b5bd-4274-99b9-b364923326bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4rfp" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367813 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a38e6005-63a6-4733-a269-f355c648fed4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-69l88\" (UID: 
\"a38e6005-63a6-4733-a269-f355c648fed4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-69l88" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367834 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e98c46b0-e56f-4a29-a194-a39fc2401cfa-config\") pod \"etcd-operator-b45778765-hslbs\" (UID: \"e98c46b0-e56f-4a29-a194-a39fc2401cfa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367867 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-console-config\") pod \"console-f9d7485db-v6vhc\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367895 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a38e6005-63a6-4733-a269-f355c648fed4-serving-cert\") pod \"openshift-config-operator-7777fb866f-69l88\" (UID: \"a38e6005-63a6-4733-a269-f355c648fed4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-69l88" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367925 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-service-ca\") pod \"console-f9d7485db-v6vhc\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367952 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m29zc\" (UniqueName: \"kubernetes.io/projected/60193856-8a3f-4ce0-b79d-44e58de19b06-kube-api-access-m29zc\") pod 
\"console-f9d7485db-v6vhc\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367977 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2151dc45-23c0-44b3-b896-0915da9f9d59-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zdgw8\" (UID: \"2151dc45-23c0-44b3-b896-0915da9f9d59\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zdgw8" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.367998 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e98c46b0-e56f-4a29-a194-a39fc2401cfa-serving-cert\") pod \"etcd-operator-b45778765-hslbs\" (UID: \"e98c46b0-e56f-4a29-a194-a39fc2401cfa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.368021 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d27990e2-0de9-4cb5-a162-5949047f5d93-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mhstq\" (UID: \"d27990e2-0de9-4cb5-a162-5949047f5d93\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mhstq" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.368051 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d322779-6593-48b0-b504-792c5ae3b7a6-config\") pod \"kube-controller-manager-operator-78b949d7b-rqcwd\" (UID: \"8d322779-6593-48b0-b504-792c5ae3b7a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rqcwd" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.368074 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d27990e2-0de9-4cb5-a162-5949047f5d93-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mhstq\" (UID: \"d27990e2-0de9-4cb5-a162-5949047f5d93\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mhstq" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.368097 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8c15ccf-dcff-4cb5-9bc3-02a20faafdfa-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vgm4n\" (UID: \"f8c15ccf-dcff-4cb5-9bc3-02a20faafdfa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vgm4n" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.368147 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e98c46b0-e56f-4a29-a194-a39fc2401cfa-etcd-client\") pod \"etcd-operator-b45778765-hslbs\" (UID: \"e98c46b0-e56f-4a29-a194-a39fc2401cfa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.368171 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9shz\" (UniqueName: \"kubernetes.io/projected/a38e6005-63a6-4733-a269-f355c648fed4-kube-api-access-q9shz\") pod \"openshift-config-operator-7777fb866f-69l88\" (UID: \"a38e6005-63a6-4733-a269-f355c648fed4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-69l88" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.368226 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-trusted-ca-bundle\") pod \"console-f9d7485db-v6vhc\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:15:59 crc kubenswrapper[4841]: 
I0313 09:15:59.368249 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/747a24f9-e654-421a-8da1-0be0aa6ccd9b-images\") pod \"machine-api-operator-5694c8668f-qlg79\" (UID: \"747a24f9-e654-421a-8da1-0be0aa6ccd9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlg79" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.368293 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66d67\" (UniqueName: \"kubernetes.io/projected/d27990e2-0de9-4cb5-a162-5949047f5d93-kube-api-access-66d67\") pod \"openshift-apiserver-operator-796bbdcf4f-mhstq\" (UID: \"d27990e2-0de9-4cb5-a162-5949047f5d93\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mhstq" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.368329 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7a0b7635-6423-4716-93b4-c8e9b3012e55-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vjcp6\" (UID: \"7a0b7635-6423-4716-93b4-c8e9b3012e55\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vjcp6" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.368353 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/60193856-8a3f-4ce0-b79d-44e58de19b06-console-oauth-config\") pod \"console-f9d7485db-v6vhc\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.368378 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gt7n\" (UniqueName: \"kubernetes.io/projected/e98c46b0-e56f-4a29-a194-a39fc2401cfa-kube-api-access-9gt7n\") pod \"etcd-operator-b45778765-hslbs\" (UID: 
\"e98c46b0-e56f-4a29-a194-a39fc2401cfa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.368402 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/747a24f9-e654-421a-8da1-0be0aa6ccd9b-config\") pod \"machine-api-operator-5694c8668f-qlg79\" (UID: \"747a24f9-e654-421a-8da1-0be0aa6ccd9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlg79" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.368425 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/60193856-8a3f-4ce0-b79d-44e58de19b06-console-serving-cert\") pod \"console-f9d7485db-v6vhc\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.368500 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e98c46b0-e56f-4a29-a194-a39fc2401cfa-etcd-service-ca\") pod \"etcd-operator-b45778765-hslbs\" (UID: \"e98c46b0-e56f-4a29-a194-a39fc2401cfa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.368721 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-oauth-serving-cert\") pod \"console-f9d7485db-v6vhc\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.369631 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-service-ca\") pod \"console-f9d7485db-v6vhc\" (UID: 
\"60193856-8a3f-4ce0-b79d-44e58de19b06\") " pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.370063 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a38e6005-63a6-4733-a269-f355c648fed4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-69l88\" (UID: \"a38e6005-63a6-4733-a269-f355c648fed4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-69l88" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.370163 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e98c46b0-e56f-4a29-a194-a39fc2401cfa-etcd-ca\") pod \"etcd-operator-b45778765-hslbs\" (UID: \"e98c46b0-e56f-4a29-a194-a39fc2401cfa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.370305 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.370472 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/747a24f9-e654-421a-8da1-0be0aa6ccd9b-images\") pod \"machine-api-operator-5694c8668f-qlg79\" (UID: \"747a24f9-e654-421a-8da1-0be0aa6ccd9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlg79" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.370700 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e98c46b0-e56f-4a29-a194-a39fc2401cfa-config\") pod \"etcd-operator-b45778765-hslbs\" (UID: \"e98c46b0-e56f-4a29-a194-a39fc2401cfa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.370924 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-trusted-ca-bundle\") pod \"console-f9d7485db-v6vhc\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.371194 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-console-config\") pod \"console-f9d7485db-v6vhc\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.371740 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e98c46b0-e56f-4a29-a194-a39fc2401cfa-etcd-client\") pod \"etcd-operator-b45778765-hslbs\" (UID: \"e98c46b0-e56f-4a29-a194-a39fc2401cfa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.372513 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d27990e2-0de9-4cb5-a162-5949047f5d93-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mhstq\" (UID: \"d27990e2-0de9-4cb5-a162-5949047f5d93\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mhstq" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.372671 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/747a24f9-e654-421a-8da1-0be0aa6ccd9b-config\") pod \"machine-api-operator-5694c8668f-qlg79\" (UID: \"747a24f9-e654-421a-8da1-0be0aa6ccd9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlg79" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.372672 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8c15ccf-dcff-4cb5-9bc3-02a20faafdfa-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vgm4n\" (UID: \"f8c15ccf-dcff-4cb5-9bc3-02a20faafdfa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vgm4n" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.373531 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/60193856-8a3f-4ce0-b79d-44e58de19b06-console-serving-cert\") pod \"console-f9d7485db-v6vhc\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.373980 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/747a24f9-e654-421a-8da1-0be0aa6ccd9b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qlg79\" (UID: \"747a24f9-e654-421a-8da1-0be0aa6ccd9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlg79" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.374215 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/60193856-8a3f-4ce0-b79d-44e58de19b06-console-oauth-config\") pod \"console-f9d7485db-v6vhc\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.374725 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7a0b7635-6423-4716-93b4-c8e9b3012e55-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vjcp6\" (UID: \"7a0b7635-6423-4716-93b4-c8e9b3012e55\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vjcp6" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.375808 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e98c46b0-e56f-4a29-a194-a39fc2401cfa-serving-cert\") pod \"etcd-operator-b45778765-hslbs\" (UID: \"e98c46b0-e56f-4a29-a194-a39fc2401cfa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.376154 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d27990e2-0de9-4cb5-a162-5949047f5d93-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mhstq\" (UID: \"d27990e2-0de9-4cb5-a162-5949047f5d93\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mhstq" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.377047 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a38e6005-63a6-4733-a269-f355c648fed4-serving-cert\") pod \"openshift-config-operator-7777fb866f-69l88\" (UID: \"a38e6005-63a6-4733-a269-f355c648fed4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-69l88" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.388727 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.399391 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d322779-6593-48b0-b504-792c5ae3b7a6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rqcwd\" (UID: \"8d322779-6593-48b0-b504-792c5ae3b7a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rqcwd" Mar 13 
09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.408427 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.428657 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.448730 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.452588 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d322779-6593-48b0-b504-792c5ae3b7a6-config\") pod \"kube-controller-manager-operator-78b949d7b-rqcwd\" (UID: \"8d322779-6593-48b0-b504-792c5ae3b7a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rqcwd" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.468850 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.470634 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl7s7\" (UniqueName: \"kubernetes.io/projected/adeab6d7-21b6-4ef2-afdb-75854f0914c5-kube-api-access-wl7s7\") pod \"control-plane-machine-set-operator-78cbb6b69f-wh6kv\" (UID: \"adeab6d7-21b6-4ef2-afdb-75854f0914c5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wh6kv" Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.470710 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/adeab6d7-21b6-4ef2-afdb-75854f0914c5-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wh6kv\" (UID: \"adeab6d7-21b6-4ef2-afdb-75854f0914c5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wh6kv"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.482325 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8c15ccf-dcff-4cb5-9bc3-02a20faafdfa-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vgm4n\" (UID: \"f8c15ccf-dcff-4cb5-9bc3-02a20faafdfa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vgm4n"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.488603 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.494827 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2151dc45-23c0-44b3-b896-0915da9f9d59-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zdgw8\" (UID: \"2151dc45-23c0-44b3-b896-0915da9f9d59\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zdgw8"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.509150 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.528977 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.548308 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.562017 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f83ab80b-b5bd-4274-99b9-b364923326bf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-q4rfp\" (UID: \"f83ab80b-b5bd-4274-99b9-b364923326bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4rfp"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.569092 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.570852 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f83ab80b-b5bd-4274-99b9-b364923326bf-config\") pod \"kube-apiserver-operator-766d6c64bb-q4rfp\" (UID: \"f83ab80b-b5bd-4274-99b9-b364923326bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4rfp"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.609684 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.615524 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/adeab6d7-21b6-4ef2-afdb-75854f0914c5-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wh6kv\" (UID: \"adeab6d7-21b6-4ef2-afdb-75854f0914c5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wh6kv"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.628508 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.648866 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.668385 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.688644 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.708463 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.728863 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.748083 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.768962 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.788671 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.808893 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.828914 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.849768 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.868375 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.889322 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.921041 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.928971 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.949470 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.969309 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 13 09:15:59 crc kubenswrapper[4841]: I0313 09:15:59.989629 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.009309 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.029875 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.049052 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.070315 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.089207 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.109455 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.129104 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.141920 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556556-c9nft"]
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.143007 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556556-c9nft"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.151586 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.154058 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556556-c9nft"]
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.169223 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.188667 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.209140 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.241850 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.249193 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.267445 4841 request.go:700] Waited for 1.018471961s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/secrets?fieldSelector=metadata.name%3Dimage-registry-operator-tls&limit=500&resourceVersion=0
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.269528 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.289625 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.309673 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.329486 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.349809 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.370130 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.389598 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.408838 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.430251 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.449468 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.469161 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.489654 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.508994 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.529888 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.548980 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.569639 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.589314 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.608710 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.629121 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.649735 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.669412 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.717368 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24dm2\" (UniqueName: \"kubernetes.io/projected/4d40b190-4e17-46d9-85f7-f4062ea2fc47-kube-api-access-24dm2\") pod \"controller-manager-879f6c89f-86p7z\" (UID: \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\") " pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.727842 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvj6z\" (UniqueName: \"kubernetes.io/projected/63754623-7341-47a8-8e37-e069a35cacd4-kube-api-access-mvj6z\") pod \"openshift-controller-manager-operator-756b6f6bc6-qtpx7\" (UID: \"63754623-7341-47a8-8e37-e069a35cacd4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qtpx7"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.730388 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qtpx7"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.731464 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.744966 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qzkh\" (UniqueName: \"kubernetes.io/projected/3afaa466-07d9-4168-a4bc-4af2c328f8fe-kube-api-access-6qzkh\") pod \"console-operator-58897d9998-2sr9r\" (UID: \"3afaa466-07d9-4168-a4bc-4af2c328f8fe\") " pod="openshift-console-operator/console-operator-58897d9998-2sr9r"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.769289 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.770222 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nm6w\" (UniqueName: \"kubernetes.io/projected/c205e4f8-77f0-43a5-8b2a-cccc1381ecd5-kube-api-access-5nm6w\") pod \"apiserver-76f77b778f-nl9wf\" (UID: \"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5\") " pod="openshift-apiserver/apiserver-76f77b778f-nl9wf"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.808506 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.811492 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5knd\" (UniqueName: \"kubernetes.io/projected/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-kube-api-access-n5knd\") pod \"route-controller-manager-6576b87f9c-zw4tg\" (UID: \"2faa4f36-0851-4da7-bd9a-dd0e50daf0de\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.830225 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.861101 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.868134 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.888718 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.907662 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2sr9r"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.909593 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.939133 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.941786 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-nl9wf"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.948210 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.969255 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 13 09:16:00 crc kubenswrapper[4841]: I0313 09:16:00.973730 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.004123 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv7jv\" (UniqueName: \"kubernetes.io/projected/b15787fb-bbd9-459b-bd8d-9f54eb62d8a3-kube-api-access-zv7jv\") pod \"apiserver-7bbb656c7d-ld4cm\" (UID: \"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.025016 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5wnj\" (UniqueName: \"kubernetes.io/projected/53f88379-f3fd-4035-889f-2ff40f5bd9cc-kube-api-access-j5wnj\") pod \"authentication-operator-69f744f599-wxvhj\" (UID: \"53f88379-f3fd-4035-889f-2ff40f5bd9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wxvhj"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.042785 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5w4j\" (UniqueName: \"kubernetes.io/projected/fa351e74-995f-4273-b0c6-ed8f57932d7b-kube-api-access-p5w4j\") pod \"machine-approver-56656f9798-6bfpp\" (UID: \"fa351e74-995f-4273-b0c6-ed8f57932d7b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bfpp"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.052537 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wxvhj"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.065514 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkgdc\" (UniqueName: \"kubernetes.io/projected/293f3a0e-401d-440f-8321-0aac18b90219-kube-api-access-fkgdc\") pod \"oauth-openshift-558db77b4-zsbmr\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.068309 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.088872 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.108891 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.119538 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nl9wf"]
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.130365 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.133479 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2sr9r"]
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.133685 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bfpp"
Mar 13 09:16:01 crc kubenswrapper[4841]: W0313 09:16:01.153492 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3afaa466_07d9_4168_a4bc_4af2c328f8fe.slice/crio-fdde6ebb7b1a6687e65fbeb3ee409acc3d9d41b2000bfa30607cd4c1bfe76502 WatchSource:0}: Error finding container fdde6ebb7b1a6687e65fbeb3ee409acc3d9d41b2000bfa30607cd4c1bfe76502: Status 404 returned error can't find the container with id fdde6ebb7b1a6687e65fbeb3ee409acc3d9d41b2000bfa30607cd4c1bfe76502
Mar 13 09:16:01 crc kubenswrapper[4841]: W0313 09:16:01.154384 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa351e74_995f_4273_b0c6_ed8f57932d7b.slice/crio-1ec7645a2776b7d7867751754d3af3cc0a76a281e6acd74c91b76b57aaafcf17 WatchSource:0}: Error finding container 1ec7645a2776b7d7867751754d3af3cc0a76a281e6acd74c91b76b57aaafcf17: Status 404 returned error can't find the container with id 1ec7645a2776b7d7867751754d3af3cc0a76a281e6acd74c91b76b57aaafcf17
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.169746 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.172366 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-86p7z"]
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.174609 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg"]
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.176590 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qtpx7"]
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.187527 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.189156 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.208145 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.224663 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wxvhj"]
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.228239 4841 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.248295 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 13 09:16:01 crc kubenswrapper[4841]: W0313 09:16:01.253643 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53f88379_f3fd_4035_889f_2ff40f5bd9cc.slice/crio-36a106557bb059126916e81b58c4fe67c042a02c14c1c2e4abcf3096bcc772b6 WatchSource:0}: Error finding container 36a106557bb059126916e81b58c4fe67c042a02c14c1c2e4abcf3096bcc772b6: Status 404 returned error can't find the container with id 36a106557bb059126916e81b58c4fe67c042a02c14c1c2e4abcf3096bcc772b6
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.268483 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.287548 4841 request.go:700] Waited for 1.933548588s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dnode-bootstrapper-token&limit=500&resourceVersion=0
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.290594 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.319073 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.328949 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.362673 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.368627 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5dkc\" (UniqueName: \"kubernetes.io/projected/ede1a06d-6caf-41ae-acfa-2335821a2e0e-kube-api-access-z5dkc\") pod \"downloads-7954f5f757-lrsts\" (UID: \"ede1a06d-6caf-41ae-acfa-2335821a2e0e\") " pod="openshift-console/downloads-7954f5f757-lrsts"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.382511 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm"]
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.387717 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66d67\" (UniqueName: \"kubernetes.io/projected/d27990e2-0de9-4cb5-a162-5949047f5d93-kube-api-access-66d67\") pod \"openshift-apiserver-operator-796bbdcf4f-mhstq\" (UID: \"d27990e2-0de9-4cb5-a162-5949047f5d93\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mhstq"
Mar 13 09:16:01 crc kubenswrapper[4841]: W0313 09:16:01.391564 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb15787fb_bbd9_459b_bd8d_9f54eb62d8a3.slice/crio-b4ffe5f57aed15207830665f018a2d051eaba6cd744b723ecae4a5cfeffc8f46 WatchSource:0}: Error finding container b4ffe5f57aed15207830665f018a2d051eaba6cd744b723ecae4a5cfeffc8f46: Status 404 returned error can't find the container with id b4ffe5f57aed15207830665f018a2d051eaba6cd744b723ecae4a5cfeffc8f46
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.407660 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9shz\" (UniqueName: \"kubernetes.io/projected/a38e6005-63a6-4733-a269-f355c648fed4-kube-api-access-q9shz\") pod \"openshift-config-operator-7777fb866f-69l88\" (UID: \"a38e6005-63a6-4733-a269-f355c648fed4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-69l88"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.421698 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d322779-6593-48b0-b504-792c5ae3b7a6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rqcwd\" (UID: \"8d322779-6593-48b0-b504-792c5ae3b7a6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rqcwd"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.441632 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-69l88"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.444293 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gt7n\" (UniqueName: \"kubernetes.io/projected/e98c46b0-e56f-4a29-a194-a39fc2401cfa-kube-api-access-9gt7n\") pod \"etcd-operator-b45778765-hslbs\" (UID: \"e98c46b0-e56f-4a29-a194-a39fc2401cfa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.465876 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4tl2\" (UniqueName: \"kubernetes.io/projected/7a0b7635-6423-4716-93b4-c8e9b3012e55-kube-api-access-s4tl2\") pod \"cluster-samples-operator-665b6dd947-vjcp6\" (UID: \"7a0b7635-6423-4716-93b4-c8e9b3012e55\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vjcp6"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.472298 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.489128 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f83ab80b-b5bd-4274-99b9-b364923326bf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-q4rfp\" (UID: \"f83ab80b-b5bd-4274-99b9-b364923326bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4rfp"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.504234 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxgqk\" (UniqueName: \"kubernetes.io/projected/747a24f9-e654-421a-8da1-0be0aa6ccd9b-kube-api-access-gxgqk\") pod \"machine-api-operator-5694c8668f-qlg79\" (UID: \"747a24f9-e654-421a-8da1-0be0aa6ccd9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlg79"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.507179 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-lrsts"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.518715 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mhstq"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.521625 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lstx\" (UniqueName: \"kubernetes.io/projected/2151dc45-23c0-44b3-b896-0915da9f9d59-kube-api-access-5lstx\") pod \"multus-admission-controller-857f4d67dd-zdgw8\" (UID: \"2151dc45-23c0-44b3-b896-0915da9f9d59\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zdgw8"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.527508 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rqcwd"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.539569 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-zdgw8"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.543839 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m29zc\" (UniqueName: \"kubernetes.io/projected/60193856-8a3f-4ce0-b79d-44e58de19b06-kube-api-access-m29zc\") pod \"console-f9d7485db-v6vhc\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " pod="openshift-console/console-f9d7485db-v6vhc"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.544342 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4rfp"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.583680 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zsbmr"]
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.584900 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhx8x\" (UniqueName: \"kubernetes.io/projected/f8c15ccf-dcff-4cb5-9bc3-02a20faafdfa-kube-api-access-jhx8x\") pod \"kube-storage-version-migrator-operator-b67b599dd-vgm4n\" (UID: \"f8c15ccf-dcff-4cb5-9bc3-02a20faafdfa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vgm4n"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.590024 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl7s7\" (UniqueName: \"kubernetes.io/projected/adeab6d7-21b6-4ef2-afdb-75854f0914c5-kube-api-access-wl7s7\") pod \"control-plane-machine-set-operator-78cbb6b69f-wh6kv\" (UID: \"adeab6d7-21b6-4ef2-afdb-75854f0914c5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wh6kv"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.620422 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2dnx\" (UniqueName: \"kubernetes.io/projected/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-kube-api-access-s2dnx\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.620797 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/249068ea-529c-4fe4-b357-0b35a9cfd5d7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bvfhj\" (UID: \"249068ea-529c-4fe4-b357-0b35a9cfd5d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bvfhj"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.620883 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-registry-tls\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.620917 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-bound-sa-token\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.620952 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1745fdf3-fbb9-4736-a4da-b534d8c208bd-config-volume\") pod \"collect-profiles-29556555-zbvz4\" (UID: \"1745fdf3-fbb9-4736-a4da-b534d8c208bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.620972 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249068ea-529c-4fe4-b357-0b35a9cfd5d7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bvfhj\" (UID: \"249068ea-529c-4fe4-b357-0b35a9cfd5d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bvfhj"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.621010 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzp5s\" (UniqueName: \"kubernetes.io/projected/ebf5c5f8-1870-470e-b63d-f84c99bdb936-kube-api-access-wzp5s\") pod \"dns-operator-744455d44c-qcfvl\" (UID: \"ebf5c5f8-1870-470e-b63d-f84c99bdb936\") " pod="openshift-dns-operator/dns-operator-744455d44c-qcfvl"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.621030 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptb5f\" (UniqueName: \"kubernetes.io/projected/4a2c849d-df3a-45e3-b868-859cec0f55a0-kube-api-access-ptb5f\") pod \"packageserver-d55dfcdfc-7kz2p\" (UID: \"4a2c849d-df3a-45e3-b868-859cec0f55a0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.621048 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1745fdf3-fbb9-4736-a4da-b534d8c208bd-secret-volume\") pod \"collect-profiles-29556555-zbvz4\" (UID: \"1745fdf3-fbb9-4736-a4da-b534d8c208bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.621070 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c9dbe40-1d26-4eec-b870-5ac342b005be-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-q7t6x\" (UID: \"4c9dbe40-1d26-4eec-b870-5ac342b005be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7t6x"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.621131 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s72k9\" (UniqueName: \"kubernetes.io/projected/8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f-kube-api-access-s72k9\") pod \"router-default-5444994796-8kvzk\" (UID: \"8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f\") " pod="openshift-ingress/router-default-5444994796-8kvzk"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.621155 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fad88931-0cb1-40fd-b256-f9cd1c93a7e6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6t6zf\" (UID: \"fad88931-0cb1-40fd-b256-f9cd1c93a7e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.621250 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe784c39-4b98-4b26-af66-d47a500ce697-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xxwmv\" (UID: \"fe784c39-4b98-4b26-af66-d47a500ce697\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xxwmv"
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.624545 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b8c5df2-d1ee-429f-a269-72122ee12baf-proxy-tls\") pod \"machine-config-controller-84d6567774-6cdpl\" (UID: \"9b8c5df2-d1ee-429f-a269-72122ee12baf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cdpl" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.624974 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mkgp\" (UniqueName: \"kubernetes.io/projected/fe784c39-4b98-4b26-af66-d47a500ce697-kube-api-access-4mkgp\") pod \"package-server-manager-789f6589d5-xxwmv\" (UID: \"fe784c39-4b98-4b26-af66-d47a500ce697\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xxwmv" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.625142 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7676cbd6-d5e1-4ee8-a6ef-7ccf646d5205-profile-collector-cert\") pod \"catalog-operator-68c6474976-khk8q\" (UID: \"7676cbd6-d5e1-4ee8-a6ef-7ccf646d5205\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khk8q" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.625204 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4a2c849d-df3a-45e3-b868-859cec0f55a0-tmpfs\") pod \"packageserver-d55dfcdfc-7kz2p\" (UID: \"4a2c849d-df3a-45e3-b868-859cec0f55a0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.625385 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1fcd85b1-ea32-49f2-a4a4-ed5b9995249f-metrics-tls\") pod 
\"ingress-operator-5b745b69d9-4d5n4\" (UID: \"1fcd85b1-ea32-49f2-a4a4-ed5b9995249f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d5n4" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.625663 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-registry-certificates\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.625697 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g26wp\" (UniqueName: \"kubernetes.io/projected/fad88931-0cb1-40fd-b256-f9cd1c93a7e6-kube-api-access-g26wp\") pod \"marketplace-operator-79b997595-6t6zf\" (UID: \"fad88931-0cb1-40fd-b256-f9cd1c93a7e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.625723 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6rvk\" (UniqueName: \"kubernetes.io/projected/f6103ec2-c668-4a2b-adfe-d31fb06bb5e3-kube-api-access-r6rvk\") pod \"machine-config-operator-74547568cd-ckw4c\" (UID: \"f6103ec2-c668-4a2b-adfe-d31fb06bb5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ckw4c" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.625746 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:01 crc 
kubenswrapper[4841]: I0313 09:16:01.625765 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fff23c02-970f-41d9-9f60-ec0664b17186-signing-key\") pod \"service-ca-9c57cc56f-rm4kk\" (UID: \"fff23c02-970f-41d9-9f60-ec0664b17186\") " pod="openshift-service-ca/service-ca-9c57cc56f-rm4kk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.625786 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrg7b\" (UniqueName: \"kubernetes.io/projected/fff23c02-970f-41d9-9f60-ec0664b17186-kube-api-access-vrg7b\") pod \"service-ca-9c57cc56f-rm4kk\" (UID: \"fff23c02-970f-41d9-9f60-ec0664b17186\") " pod="openshift-service-ca/service-ca-9c57cc56f-rm4kk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.625829 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f752e6f-1e5a-4694-80de-379a9c00ff96-serving-cert\") pod \"service-ca-operator-777779d784-5rn52\" (UID: \"3f752e6f-1e5a-4694-80de-379a9c00ff96\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rn52" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.625853 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4rt5\" (UniqueName: \"kubernetes.io/projected/4c9dbe40-1d26-4eec-b870-5ac342b005be-kube-api-access-h4rt5\") pod \"cluster-image-registry-operator-dc59b4c8b-q7t6x\" (UID: \"4c9dbe40-1d26-4eec-b870-5ac342b005be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7t6x" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.625873 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtzvx\" (UniqueName: 
\"kubernetes.io/projected/1fcd85b1-ea32-49f2-a4a4-ed5b9995249f-kube-api-access-rtzvx\") pod \"ingress-operator-5b745b69d9-4d5n4\" (UID: \"1fcd85b1-ea32-49f2-a4a4-ed5b9995249f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d5n4" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.625895 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fff23c02-970f-41d9-9f60-ec0664b17186-signing-cabundle\") pod \"service-ca-9c57cc56f-rm4kk\" (UID: \"fff23c02-970f-41d9-9f60-ec0664b17186\") " pod="openshift-service-ca/service-ca-9c57cc56f-rm4kk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.625944 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4a2c849d-df3a-45e3-b868-859cec0f55a0-webhook-cert\") pod \"packageserver-d55dfcdfc-7kz2p\" (UID: \"4a2c849d-df3a-45e3-b868-859cec0f55a0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.625966 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f-stats-auth\") pod \"router-default-5444994796-8kvzk\" (UID: \"8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f\") " pod="openshift-ingress/router-default-5444994796-8kvzk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.626013 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9b8c5df2-d1ee-429f-a269-72122ee12baf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6cdpl\" (UID: \"9b8c5df2-d1ee-429f-a269-72122ee12baf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cdpl" Mar 13 09:16:01 
crc kubenswrapper[4841]: I0313 09:16:01.626037 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgzwx\" (UniqueName: \"kubernetes.io/projected/ba0466cd-0dd4-47b7-92cf-f445820086e4-kube-api-access-dgzwx\") pod \"ingress-canary-m4fpx\" (UID: \"ba0466cd-0dd4-47b7-92cf-f445820086e4\") " pod="openshift-ingress-canary/ingress-canary-m4fpx" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.626072 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.626095 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4a2c849d-df3a-45e3-b868-859cec0f55a0-apiservice-cert\") pod \"packageserver-d55dfcdfc-7kz2p\" (UID: \"4a2c849d-df3a-45e3-b868-859cec0f55a0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.626117 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74mht\" (UniqueName: \"kubernetes.io/projected/fe9244af-7a05-4be3-bc6a-b8430139ce47-kube-api-access-74mht\") pod \"migrator-59844c95c7-6lggf\" (UID: \"fe9244af-7a05-4be3-bc6a-b8430139ce47\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6lggf" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.626152 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:01 crc kubenswrapper[4841]: E0313 09:16:01.626688 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:02.126669625 +0000 UTC m=+244.856569906 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.626872 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c9dbe40-1d26-4eec-b870-5ac342b005be-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-q7t6x\" (UID: \"4c9dbe40-1d26-4eec-b870-5ac342b005be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7t6x" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.626960 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f752e6f-1e5a-4694-80de-379a9c00ff96-config\") pod \"service-ca-operator-777779d784-5rn52\" (UID: \"3f752e6f-1e5a-4694-80de-379a9c00ff96\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rn52" Mar 13 09:16:01 crc 
kubenswrapper[4841]: I0313 09:16:01.626996 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ebf5c5f8-1870-470e-b63d-f84c99bdb936-metrics-tls\") pod \"dns-operator-744455d44c-qcfvl\" (UID: \"ebf5c5f8-1870-470e-b63d-f84c99bdb936\") " pod="openshift-dns-operator/dns-operator-744455d44c-qcfvl" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.627045 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7czgg\" (UniqueName: \"kubernetes.io/projected/7676cbd6-d5e1-4ee8-a6ef-7ccf646d5205-kube-api-access-7czgg\") pod \"catalog-operator-68c6474976-khk8q\" (UID: \"7676cbd6-d5e1-4ee8-a6ef-7ccf646d5205\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khk8q" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.627126 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tzt8\" (UniqueName: \"kubernetes.io/projected/dd714e2b-d16f-4895-a438-b2f1b8235008-kube-api-access-7tzt8\") pod \"olm-operator-6b444d44fb-sbpds\" (UID: \"dd714e2b-d16f-4895-a438-b2f1b8235008\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sbpds" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.627247 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba0466cd-0dd4-47b7-92cf-f445820086e4-cert\") pod \"ingress-canary-m4fpx\" (UID: \"ba0466cd-0dd4-47b7-92cf-f445820086e4\") " pod="openshift-ingress-canary/ingress-canary-m4fpx" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.627282 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dd714e2b-d16f-4895-a438-b2f1b8235008-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-sbpds\" (UID: \"dd714e2b-d16f-4895-a438-b2f1b8235008\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sbpds" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.627303 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fad88931-0cb1-40fd-b256-f9cd1c93a7e6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6t6zf\" (UID: \"fad88931-0cb1-40fd-b256-f9cd1c93a7e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.627409 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvvb9\" (UniqueName: \"kubernetes.io/projected/1745fdf3-fbb9-4736-a4da-b534d8c208bd-kube-api-access-fvvb9\") pod \"collect-profiles-29556555-zbvz4\" (UID: \"1745fdf3-fbb9-4736-a4da-b534d8c208bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.627428 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dd714e2b-d16f-4895-a438-b2f1b8235008-srv-cert\") pod \"olm-operator-6b444d44fb-sbpds\" (UID: \"dd714e2b-d16f-4895-a438-b2f1b8235008\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sbpds" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.627443 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7676cbd6-d5e1-4ee8-a6ef-7ccf646d5205-srv-cert\") pod \"catalog-operator-68c6474976-khk8q\" (UID: \"7676cbd6-d5e1-4ee8-a6ef-7ccf646d5205\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khk8q" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.628581 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f6103ec2-c668-4a2b-adfe-d31fb06bb5e3-proxy-tls\") pod \"machine-config-operator-74547568cd-ckw4c\" (UID: \"f6103ec2-c668-4a2b-adfe-d31fb06bb5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ckw4c" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.628624 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1fcd85b1-ea32-49f2-a4a4-ed5b9995249f-trusted-ca\") pod \"ingress-operator-5b745b69d9-4d5n4\" (UID: \"1fcd85b1-ea32-49f2-a4a4-ed5b9995249f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d5n4" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.629047 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gt25\" (UniqueName: \"kubernetes.io/projected/9b8c5df2-d1ee-429f-a269-72122ee12baf-kube-api-access-9gt25\") pod \"machine-config-controller-84d6567774-6cdpl\" (UID: \"9b8c5df2-d1ee-429f-a269-72122ee12baf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cdpl" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.629098 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f6103ec2-c668-4a2b-adfe-d31fb06bb5e3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ckw4c\" (UID: \"f6103ec2-c668-4a2b-adfe-d31fb06bb5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ckw4c" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.629520 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f-default-certificate\") pod \"router-default-5444994796-8kvzk\" (UID: \"8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f\") " pod="openshift-ingress/router-default-5444994796-8kvzk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.629563 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f-service-ca-bundle\") pod \"router-default-5444994796-8kvzk\" (UID: \"8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f\") " pod="openshift-ingress/router-default-5444994796-8kvzk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.629632 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f6103ec2-c668-4a2b-adfe-d31fb06bb5e3-images\") pod \"machine-config-operator-74547568cd-ckw4c\" (UID: \"f6103ec2-c668-4a2b-adfe-d31fb06bb5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ckw4c" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.630011 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/249068ea-529c-4fe4-b357-0b35a9cfd5d7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bvfhj\" (UID: \"249068ea-529c-4fe4-b357-0b35a9cfd5d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bvfhj" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.630091 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9wrv\" (UniqueName: \"kubernetes.io/projected/344d8ef8-1693-4dcc-b09d-dc9fc7c041cc-kube-api-access-q9wrv\") pod \"auto-csr-approver-29556554-2dgrd\" (UID: \"344d8ef8-1693-4dcc-b09d-dc9fc7c041cc\") " 
pod="openshift-infra/auto-csr-approver-29556554-2dgrd" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.630288 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f-metrics-certs\") pod \"router-default-5444994796-8kvzk\" (UID: \"8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f\") " pod="openshift-ingress/router-default-5444994796-8kvzk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.630326 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1fcd85b1-ea32-49f2-a4a4-ed5b9995249f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4d5n4\" (UID: \"1fcd85b1-ea32-49f2-a4a4-ed5b9995249f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d5n4" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.630405 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c9dbe40-1d26-4eec-b870-5ac342b005be-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-q7t6x\" (UID: \"4c9dbe40-1d26-4eec-b870-5ac342b005be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7t6x" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.630640 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-trusted-ca\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.630672 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqbdf\" (UniqueName: 
\"kubernetes.io/projected/3f752e6f-1e5a-4694-80de-379a9c00ff96-kube-api-access-vqbdf\") pod \"service-ca-operator-777779d784-5rn52\" (UID: \"3f752e6f-1e5a-4694-80de-379a9c00ff96\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rn52" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.710394 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" event={"ID":"293f3a0e-401d-440f-8321-0aac18b90219","Type":"ContainerStarted","Data":"acecc330912e15e4e243341fce1c2d6655c36e411cf8d0c20bc6294a07ce6518"} Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.720475 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm" event={"ID":"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3","Type":"ContainerStarted","Data":"b4ffe5f57aed15207830665f018a2d051eaba6cd744b723ecae4a5cfeffc8f46"} Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.731152 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-69l88"] Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.731760 4841 generic.go:334] "Generic (PLEG): container finished" podID="c205e4f8-77f0-43a5-8b2a-cccc1381ecd5" containerID="7e76e7973f4a940b07372cbd3382d14fcc35b33f879d9d858c81be38176c6601" exitCode=0 Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.731912 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" event={"ID":"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5","Type":"ContainerDied","Data":"7e76e7973f4a940b07372cbd3382d14fcc35b33f879d9d858c81be38176c6601"} Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.731985 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" 
event={"ID":"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5","Type":"ContainerStarted","Data":"361050d45e9271fb37a9a8a2a29b5ae01cdf24e6c7d59a3423ca2b06eb0b7768"} Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.735747 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:01 crc kubenswrapper[4841]: E0313 09:16:01.735882 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:02.23585784 +0000 UTC m=+244.965758031 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736011 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86c4fd3a-5374-4b50-8f43-301205a71684-metrics-tls\") pod \"dns-default-bbjqp\" (UID: \"86c4fd3a-5374-4b50-8f43-301205a71684\") " pod="openshift-dns/dns-default-bbjqp" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736038 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fe784c39-4b98-4b26-af66-d47a500ce697-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xxwmv\" (UID: \"fe784c39-4b98-4b26-af66-d47a500ce697\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xxwmv" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736059 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7676cbd6-d5e1-4ee8-a6ef-7ccf646d5205-profile-collector-cert\") pod \"catalog-operator-68c6474976-khk8q\" (UID: \"7676cbd6-d5e1-4ee8-a6ef-7ccf646d5205\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khk8q" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736076 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b8c5df2-d1ee-429f-a269-72122ee12baf-proxy-tls\") pod \"machine-config-controller-84d6567774-6cdpl\" (UID: \"9b8c5df2-d1ee-429f-a269-72122ee12baf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cdpl" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736094 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mkgp\" (UniqueName: \"kubernetes.io/projected/fe784c39-4b98-4b26-af66-d47a500ce697-kube-api-access-4mkgp\") pod \"package-server-manager-789f6589d5-xxwmv\" (UID: \"fe784c39-4b98-4b26-af66-d47a500ce697\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xxwmv" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736112 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4a2c849d-df3a-45e3-b868-859cec0f55a0-tmpfs\") pod \"packageserver-d55dfcdfc-7kz2p\" (UID: \"4a2c849d-df3a-45e3-b868-859cec0f55a0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p" 
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736133 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1fcd85b1-ea32-49f2-a4a4-ed5b9995249f-metrics-tls\") pod \"ingress-operator-5b745b69d9-4d5n4\" (UID: \"1fcd85b1-ea32-49f2-a4a4-ed5b9995249f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d5n4" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736172 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-registry-certificates\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736166 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2sr9r" event={"ID":"3afaa466-07d9-4168-a4bc-4af2c328f8fe","Type":"ContainerStarted","Data":"2fba92778b383afc57eff1f0559e01a17369eede8f876461a5a547147cbd0902"} Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736206 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2sr9r" event={"ID":"3afaa466-07d9-4168-a4bc-4af2c328f8fe","Type":"ContainerStarted","Data":"fdde6ebb7b1a6687e65fbeb3ee409acc3d9d41b2000bfa30607cd4c1bfe76502"} Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736187 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g26wp\" (UniqueName: \"kubernetes.io/projected/fad88931-0cb1-40fd-b256-f9cd1c93a7e6-kube-api-access-g26wp\") pod \"marketplace-operator-79b997595-6t6zf\" (UID: \"fad88931-0cb1-40fd-b256-f9cd1c93a7e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736247 
4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrg7b\" (UniqueName: \"kubernetes.io/projected/fff23c02-970f-41d9-9f60-ec0664b17186-kube-api-access-vrg7b\") pod \"service-ca-9c57cc56f-rm4kk\" (UID: \"fff23c02-970f-41d9-9f60-ec0664b17186\") " pod="openshift-service-ca/service-ca-9c57cc56f-rm4kk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736311 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6rvk\" (UniqueName: \"kubernetes.io/projected/f6103ec2-c668-4a2b-adfe-d31fb06bb5e3-kube-api-access-r6rvk\") pod \"machine-config-operator-74547568cd-ckw4c\" (UID: \"f6103ec2-c668-4a2b-adfe-d31fb06bb5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ckw4c" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736331 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736348 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fff23c02-970f-41d9-9f60-ec0664b17186-signing-key\") pod \"service-ca-9c57cc56f-rm4kk\" (UID: \"fff23c02-970f-41d9-9f60-ec0664b17186\") " pod="openshift-service-ca/service-ca-9c57cc56f-rm4kk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736365 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtzvx\" (UniqueName: \"kubernetes.io/projected/1fcd85b1-ea32-49f2-a4a4-ed5b9995249f-kube-api-access-rtzvx\") pod \"ingress-operator-5b745b69d9-4d5n4\" (UID: \"1fcd85b1-ea32-49f2-a4a4-ed5b9995249f\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d5n4" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736387 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/363c526f-c53a-49f8-88bc-823b1ccde350-plugins-dir\") pod \"csi-hostpathplugin-md725\" (UID: \"363c526f-c53a-49f8-88bc-823b1ccde350\") " pod="hostpath-provisioner/csi-hostpathplugin-md725" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736403 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f752e6f-1e5a-4694-80de-379a9c00ff96-serving-cert\") pod \"service-ca-operator-777779d784-5rn52\" (UID: \"3f752e6f-1e5a-4694-80de-379a9c00ff96\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rn52" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736424 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4rt5\" (UniqueName: \"kubernetes.io/projected/4c9dbe40-1d26-4eec-b870-5ac342b005be-kube-api-access-h4rt5\") pod \"cluster-image-registry-operator-dc59b4c8b-q7t6x\" (UID: \"4c9dbe40-1d26-4eec-b870-5ac342b005be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7t6x" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736440 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fff23c02-970f-41d9-9f60-ec0664b17186-signing-cabundle\") pod \"service-ca-9c57cc56f-rm4kk\" (UID: \"fff23c02-970f-41d9-9f60-ec0664b17186\") " pod="openshift-service-ca/service-ca-9c57cc56f-rm4kk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736458 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f48vc\" (UniqueName: 
\"kubernetes.io/projected/86c4fd3a-5374-4b50-8f43-301205a71684-kube-api-access-f48vc\") pod \"dns-default-bbjqp\" (UID: \"86c4fd3a-5374-4b50-8f43-301205a71684\") " pod="openshift-dns/dns-default-bbjqp" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736475 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4a2c849d-df3a-45e3-b868-859cec0f55a0-webhook-cert\") pod \"packageserver-d55dfcdfc-7kz2p\" (UID: \"4a2c849d-df3a-45e3-b868-859cec0f55a0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736491 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f-stats-auth\") pod \"router-default-5444994796-8kvzk\" (UID: \"8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f\") " pod="openshift-ingress/router-default-5444994796-8kvzk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736508 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgzwx\" (UniqueName: \"kubernetes.io/projected/ba0466cd-0dd4-47b7-92cf-f445820086e4-kube-api-access-dgzwx\") pod \"ingress-canary-m4fpx\" (UID: \"ba0466cd-0dd4-47b7-92cf-f445820086e4\") " pod="openshift-ingress-canary/ingress-canary-m4fpx" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736525 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9b8c5df2-d1ee-429f-a269-72122ee12baf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6cdpl\" (UID: \"9b8c5df2-d1ee-429f-a269-72122ee12baf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cdpl" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736541 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4a2c849d-df3a-45e3-b868-859cec0f55a0-apiservice-cert\") pod \"packageserver-d55dfcdfc-7kz2p\" (UID: \"4a2c849d-df3a-45e3-b868-859cec0f55a0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736562 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736579 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74mht\" (UniqueName: \"kubernetes.io/projected/fe9244af-7a05-4be3-bc6a-b8430139ce47-kube-api-access-74mht\") pod \"migrator-59844c95c7-6lggf\" (UID: \"fe9244af-7a05-4be3-bc6a-b8430139ce47\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6lggf" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736600 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736646 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c9dbe40-1d26-4eec-b870-5ac342b005be-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-q7t6x\" (UID: \"4c9dbe40-1d26-4eec-b870-5ac342b005be\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7t6x" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736672 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f752e6f-1e5a-4694-80de-379a9c00ff96-config\") pod \"service-ca-operator-777779d784-5rn52\" (UID: \"3f752e6f-1e5a-4694-80de-379a9c00ff96\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rn52" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736689 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ebf5c5f8-1870-470e-b63d-f84c99bdb936-metrics-tls\") pod \"dns-operator-744455d44c-qcfvl\" (UID: \"ebf5c5f8-1870-470e-b63d-f84c99bdb936\") " pod="openshift-dns-operator/dns-operator-744455d44c-qcfvl" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736709 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7czgg\" (UniqueName: \"kubernetes.io/projected/7676cbd6-d5e1-4ee8-a6ef-7ccf646d5205-kube-api-access-7czgg\") pod \"catalog-operator-68c6474976-khk8q\" (UID: \"7676cbd6-d5e1-4ee8-a6ef-7ccf646d5205\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khk8q" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736727 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpr2t\" (UniqueName: \"kubernetes.io/projected/f5961bba-4ec3-4b4e-b5a2-73aa1024326e-kube-api-access-wpr2t\") pod \"auto-csr-approver-29556556-c9nft\" (UID: \"f5961bba-4ec3-4b4e-b5a2-73aa1024326e\") " pod="openshift-infra/auto-csr-approver-29556556-c9nft" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736755 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tzt8\" (UniqueName: 
\"kubernetes.io/projected/dd714e2b-d16f-4895-a438-b2f1b8235008-kube-api-access-7tzt8\") pod \"olm-operator-6b444d44fb-sbpds\" (UID: \"dd714e2b-d16f-4895-a438-b2f1b8235008\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sbpds" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736772 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba0466cd-0dd4-47b7-92cf-f445820086e4-cert\") pod \"ingress-canary-m4fpx\" (UID: \"ba0466cd-0dd4-47b7-92cf-f445820086e4\") " pod="openshift-ingress-canary/ingress-canary-m4fpx" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736790 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dd714e2b-d16f-4895-a438-b2f1b8235008-profile-collector-cert\") pod \"olm-operator-6b444d44fb-sbpds\" (UID: \"dd714e2b-d16f-4895-a438-b2f1b8235008\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sbpds" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736811 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fad88931-0cb1-40fd-b256-f9cd1c93a7e6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6t6zf\" (UID: \"fad88931-0cb1-40fd-b256-f9cd1c93a7e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736844 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvvb9\" (UniqueName: \"kubernetes.io/projected/1745fdf3-fbb9-4736-a4da-b534d8c208bd-kube-api-access-fvvb9\") pod \"collect-profiles-29556555-zbvz4\" (UID: \"1745fdf3-fbb9-4736-a4da-b534d8c208bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.736939 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.738443 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/363c526f-c53a-49f8-88bc-823b1ccde350-csi-data-dir\") pod \"csi-hostpathplugin-md725\" (UID: \"363c526f-c53a-49f8-88bc-823b1ccde350\") " pod="hostpath-provisioner/csi-hostpathplugin-md725" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.738494 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dd714e2b-d16f-4895-a438-b2f1b8235008-srv-cert\") pod \"olm-operator-6b444d44fb-sbpds\" (UID: \"dd714e2b-d16f-4895-a438-b2f1b8235008\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sbpds" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.738521 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7676cbd6-d5e1-4ee8-a6ef-7ccf646d5205-srv-cert\") pod \"catalog-operator-68c6474976-khk8q\" (UID: \"7676cbd6-d5e1-4ee8-a6ef-7ccf646d5205\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khk8q" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.738551 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f6103ec2-c668-4a2b-adfe-d31fb06bb5e3-proxy-tls\") pod \"machine-config-operator-74547568cd-ckw4c\" (UID: \"f6103ec2-c668-4a2b-adfe-d31fb06bb5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ckw4c" Mar 13 
09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.738592 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1fcd85b1-ea32-49f2-a4a4-ed5b9995249f-trusted-ca\") pod \"ingress-operator-5b745b69d9-4d5n4\" (UID: \"1fcd85b1-ea32-49f2-a4a4-ed5b9995249f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d5n4" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.738641 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gt25\" (UniqueName: \"kubernetes.io/projected/9b8c5df2-d1ee-429f-a269-72122ee12baf-kube-api-access-9gt25\") pod \"machine-config-controller-84d6567774-6cdpl\" (UID: \"9b8c5df2-d1ee-429f-a269-72122ee12baf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cdpl" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.738683 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f6103ec2-c668-4a2b-adfe-d31fb06bb5e3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ckw4c\" (UID: \"f6103ec2-c668-4a2b-adfe-d31fb06bb5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ckw4c" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.738735 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f-default-certificate\") pod \"router-default-5444994796-8kvzk\" (UID: \"8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f\") " pod="openshift-ingress/router-default-5444994796-8kvzk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.738757 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f-service-ca-bundle\") pod 
\"router-default-5444994796-8kvzk\" (UID: \"8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f\") " pod="openshift-ingress/router-default-5444994796-8kvzk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.738780 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f6103ec2-c668-4a2b-adfe-d31fb06bb5e3-images\") pod \"machine-config-operator-74547568cd-ckw4c\" (UID: \"f6103ec2-c668-4a2b-adfe-d31fb06bb5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ckw4c" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.738803 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86c4fd3a-5374-4b50-8f43-301205a71684-config-volume\") pod \"dns-default-bbjqp\" (UID: \"86c4fd3a-5374-4b50-8f43-301205a71684\") " pod="openshift-dns/dns-default-bbjqp" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.738839 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/249068ea-529c-4fe4-b357-0b35a9cfd5d7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bvfhj\" (UID: \"249068ea-529c-4fe4-b357-0b35a9cfd5d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bvfhj" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.738864 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9wrv\" (UniqueName: \"kubernetes.io/projected/344d8ef8-1693-4dcc-b09d-dc9fc7c041cc-kube-api-access-q9wrv\") pod \"auto-csr-approver-29556554-2dgrd\" (UID: \"344d8ef8-1693-4dcc-b09d-dc9fc7c041cc\") " pod="openshift-infra/auto-csr-approver-29556554-2dgrd" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.738886 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/363c526f-c53a-49f8-88bc-823b1ccde350-mountpoint-dir\") pod \"csi-hostpathplugin-md725\" (UID: \"363c526f-c53a-49f8-88bc-823b1ccde350\") " pod="hostpath-provisioner/csi-hostpathplugin-md725" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.738911 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1fcd85b1-ea32-49f2-a4a4-ed5b9995249f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4d5n4\" (UID: \"1fcd85b1-ea32-49f2-a4a4-ed5b9995249f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d5n4" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.738932 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a655e298-2051-4412-bbd7-b527adca30ff-node-bootstrap-token\") pod \"machine-config-server-dtk79\" (UID: \"a655e298-2051-4412-bbd7-b527adca30ff\") " pod="openshift-machine-config-operator/machine-config-server-dtk79" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.738956 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f-metrics-certs\") pod \"router-default-5444994796-8kvzk\" (UID: \"8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f\") " pod="openshift-ingress/router-default-5444994796-8kvzk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.738977 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c9dbe40-1d26-4eec-b870-5ac342b005be-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-q7t6x\" (UID: \"4c9dbe40-1d26-4eec-b870-5ac342b005be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7t6x" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.738998 
4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/363c526f-c53a-49f8-88bc-823b1ccde350-socket-dir\") pod \"csi-hostpathplugin-md725\" (UID: \"363c526f-c53a-49f8-88bc-823b1ccde350\") " pod="hostpath-provisioner/csi-hostpathplugin-md725" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.739025 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-trusted-ca\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.739050 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqbdf\" (UniqueName: \"kubernetes.io/projected/3f752e6f-1e5a-4694-80de-379a9c00ff96-kube-api-access-vqbdf\") pod \"service-ca-operator-777779d784-5rn52\" (UID: \"3f752e6f-1e5a-4694-80de-379a9c00ff96\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rn52" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.739072 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dccqh\" (UniqueName: \"kubernetes.io/projected/363c526f-c53a-49f8-88bc-823b1ccde350-kube-api-access-dccqh\") pod \"csi-hostpathplugin-md725\" (UID: \"363c526f-c53a-49f8-88bc-823b1ccde350\") " pod="hostpath-provisioner/csi-hostpathplugin-md725" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.739095 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a655e298-2051-4412-bbd7-b527adca30ff-certs\") pod \"machine-config-server-dtk79\" (UID: \"a655e298-2051-4412-bbd7-b527adca30ff\") " 
pod="openshift-machine-config-operator/machine-config-server-dtk79" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.739119 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dnx\" (UniqueName: \"kubernetes.io/projected/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-kube-api-access-s2dnx\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.739135 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/249068ea-529c-4fe4-b357-0b35a9cfd5d7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bvfhj\" (UID: \"249068ea-529c-4fe4-b357-0b35a9cfd5d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bvfhj" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.739167 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-registry-tls\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.739182 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mz6c\" (UniqueName: \"kubernetes.io/projected/a655e298-2051-4412-bbd7-b527adca30ff-kube-api-access-5mz6c\") pod \"machine-config-server-dtk79\" (UID: \"a655e298-2051-4412-bbd7-b527adca30ff\") " pod="openshift-machine-config-operator/machine-config-server-dtk79" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.739204 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-bound-sa-token\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.739226 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1745fdf3-fbb9-4736-a4da-b534d8c208bd-config-volume\") pod \"collect-profiles-29556555-zbvz4\" (UID: \"1745fdf3-fbb9-4736-a4da-b534d8c208bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.739244 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249068ea-529c-4fe4-b357-0b35a9cfd5d7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bvfhj\" (UID: \"249068ea-529c-4fe4-b357-0b35a9cfd5d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bvfhj" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.739289 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/363c526f-c53a-49f8-88bc-823b1ccde350-registration-dir\") pod \"csi-hostpathplugin-md725\" (UID: \"363c526f-c53a-49f8-88bc-823b1ccde350\") " pod="hostpath-provisioner/csi-hostpathplugin-md725" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.739315 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzp5s\" (UniqueName: \"kubernetes.io/projected/ebf5c5f8-1870-470e-b63d-f84c99bdb936-kube-api-access-wzp5s\") pod \"dns-operator-744455d44c-qcfvl\" (UID: \"ebf5c5f8-1870-470e-b63d-f84c99bdb936\") " pod="openshift-dns-operator/dns-operator-744455d44c-qcfvl" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.739339 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptb5f\" (UniqueName: \"kubernetes.io/projected/4a2c849d-df3a-45e3-b868-859cec0f55a0-kube-api-access-ptb5f\") pod \"packageserver-d55dfcdfc-7kz2p\" (UID: \"4a2c849d-df3a-45e3-b868-859cec0f55a0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.739360 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1745fdf3-fbb9-4736-a4da-b534d8c208bd-secret-volume\") pod \"collect-profiles-29556555-zbvz4\" (UID: \"1745fdf3-fbb9-4736-a4da-b534d8c208bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.739393 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c9dbe40-1d26-4eec-b870-5ac342b005be-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-q7t6x\" (UID: \"4c9dbe40-1d26-4eec-b870-5ac342b005be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7t6x" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.739422 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s72k9\" (UniqueName: \"kubernetes.io/projected/8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f-kube-api-access-s72k9\") pod \"router-default-5444994796-8kvzk\" (UID: \"8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f\") " pod="openshift-ingress/router-default-5444994796-8kvzk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.739458 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fad88931-0cb1-40fd-b256-f9cd1c93a7e6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6t6zf\" (UID: 
\"fad88931-0cb1-40fd-b256-f9cd1c93a7e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.743919 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f-service-ca-bundle\") pod \"router-default-5444994796-8kvzk\" (UID: \"8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f\") " pod="openshift-ingress/router-default-5444994796-8kvzk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.744546 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fad88931-0cb1-40fd-b256-f9cd1c93a7e6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6t6zf\" (UID: \"fad88931-0cb1-40fd-b256-f9cd1c93a7e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.744557 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fff23c02-970f-41d9-9f60-ec0664b17186-signing-cabundle\") pod \"service-ca-9c57cc56f-rm4kk\" (UID: \"fff23c02-970f-41d9-9f60-ec0664b17186\") " pod="openshift-service-ca/service-ca-9c57cc56f-rm4kk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.745012 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-2sr9r" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.745723 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f6103ec2-c668-4a2b-adfe-d31fb06bb5e3-images\") pod \"machine-config-operator-74547568cd-ckw4c\" (UID: \"f6103ec2-c668-4a2b-adfe-d31fb06bb5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ckw4c" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 
09:16:01.747191 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.747477 4841 patch_prober.go:28] interesting pod/console-operator-58897d9998-2sr9r container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/readyz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.747618 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-2sr9r" podUID="3afaa466-07d9-4168-a4bc-4af2c328f8fe" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.6:8443/readyz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.750188 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hslbs"] Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.750979 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9b8c5df2-d1ee-429f-a269-72122ee12baf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6cdpl\" (UID: \"9b8c5df2-d1ee-429f-a269-72122ee12baf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cdpl" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.751710 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-registry-certificates\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.753107 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f6103ec2-c668-4a2b-adfe-d31fb06bb5e3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ckw4c\" (UID: \"f6103ec2-c668-4a2b-adfe-d31fb06bb5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ckw4c" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.753529 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg" event={"ID":"2faa4f36-0851-4da7-bd9a-dd0e50daf0de","Type":"ContainerStarted","Data":"f826a54b397d39eb84d511dc376f98ae2fb7bfbcb32a70ac3600a26d2f0ff8f7"} Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.753661 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg" event={"ID":"2faa4f36-0851-4da7-bd9a-dd0e50daf0de","Type":"ContainerStarted","Data":"b755895c7d38a660945ac664b5c678f6c59f28f8df7a0cad6ced2a31ccdf6f23"} Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.753982 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.754613 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1fcd85b1-ea32-49f2-a4a4-ed5b9995249f-trusted-ca\") pod \"ingress-operator-5b745b69d9-4d5n4\" (UID: \"1fcd85b1-ea32-49f2-a4a4-ed5b9995249f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d5n4" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.755012 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4a2c849d-df3a-45e3-b868-859cec0f55a0-tmpfs\") pod \"packageserver-d55dfcdfc-7kz2p\" (UID: 
\"4a2c849d-df3a-45e3-b868-859cec0f55a0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.755856 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c9dbe40-1d26-4eec-b870-5ac342b005be-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-q7t6x\" (UID: \"4c9dbe40-1d26-4eec-b870-5ac342b005be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7t6x" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.756499 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1745fdf3-fbb9-4736-a4da-b534d8c208bd-config-volume\") pod \"collect-profiles-29556555-zbvz4\" (UID: \"1745fdf3-fbb9-4736-a4da-b534d8c208bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.756933 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qlg79" Mar 13 09:16:01 crc kubenswrapper[4841]: E0313 09:16:01.757527 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:02.257505568 +0000 UTC m=+244.987405759 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.759650 4841 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-zw4tg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.759769 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg" podUID="2faa4f36-0851-4da7-bd9a-dd0e50daf0de" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.761123 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-trusted-ca\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.761161 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vjcp6" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.761778 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249068ea-529c-4fe4-b357-0b35a9cfd5d7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bvfhj\" (UID: \"249068ea-529c-4fe4-b357-0b35a9cfd5d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bvfhj" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.762091 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wxvhj" event={"ID":"53f88379-f3fd-4035-889f-2ff40f5bd9cc","Type":"ContainerStarted","Data":"52640a19fa47fb433ecc620e9ff21e5ebe6f0185b810b143ab5b049f2ed3a4b2"} Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.762139 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wxvhj" event={"ID":"53f88379-f3fd-4035-889f-2ff40f5bd9cc","Type":"ContainerStarted","Data":"36a106557bb059126916e81b58c4fe67c042a02c14c1c2e4abcf3096bcc772b6"} Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.763360 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f752e6f-1e5a-4694-80de-379a9c00ff96-config\") pod \"service-ca-operator-777779d784-5rn52\" (UID: \"3f752e6f-1e5a-4694-80de-379a9c00ff96\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rn52" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.763672 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bfpp" event={"ID":"fa351e74-995f-4273-b0c6-ed8f57932d7b","Type":"ContainerStarted","Data":"4884c4a155ecbad078e905d748b900dcc5a253ac8a67805a53d8b34373422836"} Mar 13 
09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.763847 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bfpp" event={"ID":"fa351e74-995f-4273-b0c6-ed8f57932d7b","Type":"ContainerStarted","Data":"1ec7645a2776b7d7867751754d3af3cc0a76a281e6acd74c91b76b57aaafcf17"} Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.764176 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f-stats-auth\") pod \"router-default-5444994796-8kvzk\" (UID: \"8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f\") " pod="openshift-ingress/router-default-5444994796-8kvzk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.764950 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z" event={"ID":"4d40b190-4e17-46d9-85f7-f4062ea2fc47","Type":"ContainerStarted","Data":"9f45a5c900e6d258fa313fcb6c32460d268e1a72cf2206bfd94a5930bb6d0a78"} Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.764990 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z" event={"ID":"4d40b190-4e17-46d9-85f7-f4062ea2fc47","Type":"ContainerStarted","Data":"44bfdbfe697bc6dc67f0783b9470e52d36a6c5b8fe169cc700f83d05639e7700"} Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.765507 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.767225 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qtpx7" event={"ID":"63754623-7341-47a8-8e37-e069a35cacd4","Type":"ContainerStarted","Data":"91b678374ac45940cd0966de1221296ec5b061f80870e955bc7a5189db6c78b7"} Mar 13 09:16:01 crc 
kubenswrapper[4841]: I0313 09:16:01.767314 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qtpx7" event={"ID":"63754623-7341-47a8-8e37-e069a35cacd4","Type":"ContainerStarted","Data":"9a377b56d531d9909b941228d508393d1223ee1a901573114cdd2bf9db5a806e"} Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.767497 4841 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-86p7z container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.767619 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z" podUID="4d40b190-4e17-46d9-85f7-f4062ea2fc47" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.771658 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1fcd85b1-ea32-49f2-a4a4-ed5b9995249f-metrics-tls\") pod \"ingress-operator-5b745b69d9-4d5n4\" (UID: \"1fcd85b1-ea32-49f2-a4a4-ed5b9995249f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d5n4" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.771857 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f-metrics-certs\") pod \"router-default-5444994796-8kvzk\" (UID: \"8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f\") " pod="openshift-ingress/router-default-5444994796-8kvzk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.772399 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe784c39-4b98-4b26-af66-d47a500ce697-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xxwmv\" (UID: \"fe784c39-4b98-4b26-af66-d47a500ce697\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xxwmv" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.774389 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-registry-tls\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.775450 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f6103ec2-c668-4a2b-adfe-d31fb06bb5e3-proxy-tls\") pod \"machine-config-operator-74547568cd-ckw4c\" (UID: \"f6103ec2-c668-4a2b-adfe-d31fb06bb5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ckw4c" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.775927 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7676cbd6-d5e1-4ee8-a6ef-7ccf646d5205-profile-collector-cert\") pod \"catalog-operator-68c6474976-khk8q\" (UID: \"7676cbd6-d5e1-4ee8-a6ef-7ccf646d5205\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khk8q" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.775942 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba0466cd-0dd4-47b7-92cf-f445820086e4-cert\") pod \"ingress-canary-m4fpx\" (UID: \"ba0466cd-0dd4-47b7-92cf-f445820086e4\") " pod="openshift-ingress-canary/ingress-canary-m4fpx" Mar 13 09:16:01 crc 
kubenswrapper[4841]: I0313 09:16:01.776757 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.777192 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fff23c02-970f-41d9-9f60-ec0664b17186-signing-key\") pod \"service-ca-9c57cc56f-rm4kk\" (UID: \"fff23c02-970f-41d9-9f60-ec0664b17186\") " pod="openshift-service-ca/service-ca-9c57cc56f-rm4kk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.777126 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dd714e2b-d16f-4895-a438-b2f1b8235008-srv-cert\") pod \"olm-operator-6b444d44fb-sbpds\" (UID: \"dd714e2b-d16f-4895-a438-b2f1b8235008\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sbpds" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.778732 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ebf5c5f8-1870-470e-b63d-f84c99bdb936-metrics-tls\") pod \"dns-operator-744455d44c-qcfvl\" (UID: \"ebf5c5f8-1870-470e-b63d-f84c99bdb936\") " pod="openshift-dns-operator/dns-operator-744455d44c-qcfvl" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.779160 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fad88931-0cb1-40fd-b256-f9cd1c93a7e6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6t6zf\" (UID: \"fad88931-0cb1-40fd-b256-f9cd1c93a7e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" Mar 
13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.780653 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b8c5df2-d1ee-429f-a269-72122ee12baf-proxy-tls\") pod \"machine-config-controller-84d6567774-6cdpl\" (UID: \"9b8c5df2-d1ee-429f-a269-72122ee12baf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cdpl" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.780706 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4a2c849d-df3a-45e3-b868-859cec0f55a0-webhook-cert\") pod \"packageserver-d55dfcdfc-7kz2p\" (UID: \"4a2c849d-df3a-45e3-b868-859cec0f55a0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.780734 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7676cbd6-d5e1-4ee8-a6ef-7ccf646d5205-srv-cert\") pod \"catalog-operator-68c6474976-khk8q\" (UID: \"7676cbd6-d5e1-4ee8-a6ef-7ccf646d5205\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khk8q" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.780871 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c9dbe40-1d26-4eec-b870-5ac342b005be-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-q7t6x\" (UID: \"4c9dbe40-1d26-4eec-b870-5ac342b005be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7t6x" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.783325 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f-default-certificate\") pod \"router-default-5444994796-8kvzk\" (UID: 
\"8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f\") " pod="openshift-ingress/router-default-5444994796-8kvzk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.783717 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1745fdf3-fbb9-4736-a4da-b534d8c208bd-secret-volume\") pod \"collect-profiles-29556555-zbvz4\" (UID: \"1745fdf3-fbb9-4736-a4da-b534d8c208bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.789009 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dd714e2b-d16f-4895-a438-b2f1b8235008-profile-collector-cert\") pod \"olm-operator-6b444d44fb-sbpds\" (UID: \"dd714e2b-d16f-4895-a438-b2f1b8235008\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sbpds" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.795045 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/249068ea-529c-4fe4-b357-0b35a9cfd5d7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bvfhj\" (UID: \"249068ea-529c-4fe4-b357-0b35a9cfd5d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bvfhj" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.795318 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f752e6f-1e5a-4694-80de-379a9c00ff96-serving-cert\") pod \"service-ca-operator-777779d784-5rn52\" (UID: \"3f752e6f-1e5a-4694-80de-379a9c00ff96\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rn52" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.795309 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/4a2c849d-df3a-45e3-b868-859cec0f55a0-apiservice-cert\") pod \"packageserver-d55dfcdfc-7kz2p\" (UID: \"4a2c849d-df3a-45e3-b868-859cec0f55a0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.797184 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g26wp\" (UniqueName: \"kubernetes.io/projected/fad88931-0cb1-40fd-b256-f9cd1c93a7e6-kube-api-access-g26wp\") pod \"marketplace-operator-79b997595-6t6zf\" (UID: \"fad88931-0cb1-40fd-b256-f9cd1c93a7e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.805960 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrg7b\" (UniqueName: \"kubernetes.io/projected/fff23c02-970f-41d9-9f60-ec0664b17186-kube-api-access-vrg7b\") pod \"service-ca-9c57cc56f-rm4kk\" (UID: \"fff23c02-970f-41d9-9f60-ec0664b17186\") " pod="openshift-service-ca/service-ca-9c57cc56f-rm4kk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.820991 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vgm4n" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.833227 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6rvk\" (UniqueName: \"kubernetes.io/projected/f6103ec2-c668-4a2b-adfe-d31fb06bb5e3-kube-api-access-r6rvk\") pod \"machine-config-operator-74547568cd-ckw4c\" (UID: \"f6103ec2-c668-4a2b-adfe-d31fb06bb5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ckw4c" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.840395 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:01 crc kubenswrapper[4841]: E0313 09:16:01.840585 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:02.340559414 +0000 UTC m=+245.070459605 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.840657 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f48vc\" (UniqueName: \"kubernetes.io/projected/86c4fd3a-5374-4b50-8f43-301205a71684-kube-api-access-f48vc\") pod \"dns-default-bbjqp\" (UID: \"86c4fd3a-5374-4b50-8f43-301205a71684\") " pod="openshift-dns/dns-default-bbjqp" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.840717 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.840856 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpr2t\" (UniqueName: \"kubernetes.io/projected/f5961bba-4ec3-4b4e-b5a2-73aa1024326e-kube-api-access-wpr2t\") pod \"auto-csr-approver-29556556-c9nft\" (UID: \"f5961bba-4ec3-4b4e-b5a2-73aa1024326e\") " pod="openshift-infra/auto-csr-approver-29556556-c9nft" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.840959 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/363c526f-c53a-49f8-88bc-823b1ccde350-csi-data-dir\") pod \"csi-hostpathplugin-md725\" (UID: 
\"363c526f-c53a-49f8-88bc-823b1ccde350\") " pod="hostpath-provisioner/csi-hostpathplugin-md725" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.841059 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86c4fd3a-5374-4b50-8f43-301205a71684-config-volume\") pod \"dns-default-bbjqp\" (UID: \"86c4fd3a-5374-4b50-8f43-301205a71684\") " pod="openshift-dns/dns-default-bbjqp" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.841105 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/363c526f-c53a-49f8-88bc-823b1ccde350-mountpoint-dir\") pod \"csi-hostpathplugin-md725\" (UID: \"363c526f-c53a-49f8-88bc-823b1ccde350\") " pod="hostpath-provisioner/csi-hostpathplugin-md725" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.841144 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a655e298-2051-4412-bbd7-b527adca30ff-node-bootstrap-token\") pod \"machine-config-server-dtk79\" (UID: \"a655e298-2051-4412-bbd7-b527adca30ff\") " pod="openshift-machine-config-operator/machine-config-server-dtk79" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.841165 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/363c526f-c53a-49f8-88bc-823b1ccde350-socket-dir\") pod \"csi-hostpathplugin-md725\" (UID: \"363c526f-c53a-49f8-88bc-823b1ccde350\") " pod="hostpath-provisioner/csi-hostpathplugin-md725" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.841199 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dccqh\" (UniqueName: \"kubernetes.io/projected/363c526f-c53a-49f8-88bc-823b1ccde350-kube-api-access-dccqh\") pod \"csi-hostpathplugin-md725\" (UID: 
\"363c526f-c53a-49f8-88bc-823b1ccde350\") " pod="hostpath-provisioner/csi-hostpathplugin-md725" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.841218 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a655e298-2051-4412-bbd7-b527adca30ff-certs\") pod \"machine-config-server-dtk79\" (UID: \"a655e298-2051-4412-bbd7-b527adca30ff\") " pod="openshift-machine-config-operator/machine-config-server-dtk79" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.841546 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mz6c\" (UniqueName: \"kubernetes.io/projected/a655e298-2051-4412-bbd7-b527adca30ff-kube-api-access-5mz6c\") pod \"machine-config-server-dtk79\" (UID: \"a655e298-2051-4412-bbd7-b527adca30ff\") " pod="openshift-machine-config-operator/machine-config-server-dtk79" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.841592 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/363c526f-c53a-49f8-88bc-823b1ccde350-registration-dir\") pod \"csi-hostpathplugin-md725\" (UID: \"363c526f-c53a-49f8-88bc-823b1ccde350\") " pod="hostpath-provisioner/csi-hostpathplugin-md725" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.841697 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86c4fd3a-5374-4b50-8f43-301205a71684-metrics-tls\") pod \"dns-default-bbjqp\" (UID: \"86c4fd3a-5374-4b50-8f43-301205a71684\") " pod="openshift-dns/dns-default-bbjqp" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.841890 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/363c526f-c53a-49f8-88bc-823b1ccde350-plugins-dir\") pod \"csi-hostpathplugin-md725\" (UID: 
\"363c526f-c53a-49f8-88bc-823b1ccde350\") " pod="hostpath-provisioner/csi-hostpathplugin-md725" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.842099 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/363c526f-c53a-49f8-88bc-823b1ccde350-csi-data-dir\") pod \"csi-hostpathplugin-md725\" (UID: \"363c526f-c53a-49f8-88bc-823b1ccde350\") " pod="hostpath-provisioner/csi-hostpathplugin-md725" Mar 13 09:16:01 crc kubenswrapper[4841]: E0313 09:16:01.842229 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:02.342217365 +0000 UTC m=+245.072117566 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.842805 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/363c526f-c53a-49f8-88bc-823b1ccde350-mountpoint-dir\") pod \"csi-hostpathplugin-md725\" (UID: \"363c526f-c53a-49f8-88bc-823b1ccde350\") " pod="hostpath-provisioner/csi-hostpathplugin-md725" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.843250 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/363c526f-c53a-49f8-88bc-823b1ccde350-plugins-dir\") pod \"csi-hostpathplugin-md725\" (UID: 
\"363c526f-c53a-49f8-88bc-823b1ccde350\") " pod="hostpath-provisioner/csi-hostpathplugin-md725" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.843534 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/363c526f-c53a-49f8-88bc-823b1ccde350-registration-dir\") pod \"csi-hostpathplugin-md725\" (UID: \"363c526f-c53a-49f8-88bc-823b1ccde350\") " pod="hostpath-provisioner/csi-hostpathplugin-md725" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.843582 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/363c526f-c53a-49f8-88bc-823b1ccde350-socket-dir\") pod \"csi-hostpathplugin-md725\" (UID: \"363c526f-c53a-49f8-88bc-823b1ccde350\") " pod="hostpath-provisioner/csi-hostpathplugin-md725" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.843707 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86c4fd3a-5374-4b50-8f43-301205a71684-config-volume\") pod \"dns-default-bbjqp\" (UID: \"86c4fd3a-5374-4b50-8f43-301205a71684\") " pod="openshift-dns/dns-default-bbjqp" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.848490 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a655e298-2051-4412-bbd7-b527adca30ff-certs\") pod \"machine-config-server-dtk79\" (UID: \"a655e298-2051-4412-bbd7-b527adca30ff\") " pod="openshift-machine-config-operator/machine-config-server-dtk79" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.849449 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgzwx\" (UniqueName: \"kubernetes.io/projected/ba0466cd-0dd4-47b7-92cf-f445820086e4-kube-api-access-dgzwx\") pod \"ingress-canary-m4fpx\" (UID: \"ba0466cd-0dd4-47b7-92cf-f445820086e4\") " pod="openshift-ingress-canary/ingress-canary-m4fpx" 
Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.851632 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wh6kv" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.855192 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a655e298-2051-4412-bbd7-b527adca30ff-node-bootstrap-token\") pod \"machine-config-server-dtk79\" (UID: \"a655e298-2051-4412-bbd7-b527adca30ff\") " pod="openshift-machine-config-operator/machine-config-server-dtk79" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.855457 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86c4fd3a-5374-4b50-8f43-301205a71684-metrics-tls\") pod \"dns-default-bbjqp\" (UID: \"86c4fd3a-5374-4b50-8f43-301205a71684\") " pod="openshift-dns/dns-default-bbjqp" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.865695 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4rt5\" (UniqueName: \"kubernetes.io/projected/4c9dbe40-1d26-4eec-b870-5ac342b005be-kube-api-access-h4rt5\") pod \"cluster-image-registry-operator-dc59b4c8b-q7t6x\" (UID: \"4c9dbe40-1d26-4eec-b870-5ac342b005be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7t6x" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.872338 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.888444 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtzvx\" (UniqueName: \"kubernetes.io/projected/1fcd85b1-ea32-49f2-a4a4-ed5b9995249f-kube-api-access-rtzvx\") pod \"ingress-operator-5b745b69d9-4d5n4\" (UID: \"1fcd85b1-ea32-49f2-a4a4-ed5b9995249f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d5n4" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.890424 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ckw4c" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.914098 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/249068ea-529c-4fe4-b357-0b35a9cfd5d7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bvfhj\" (UID: \"249068ea-529c-4fe4-b357-0b35a9cfd5d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bvfhj" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.930366 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-bound-sa-token\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.942936 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:01 
crc kubenswrapper[4841]: E0313 09:16:01.943711 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:02.443695268 +0000 UTC m=+245.173595459 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.956106 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tzt8\" (UniqueName: \"kubernetes.io/projected/dd714e2b-d16f-4895-a438-b2f1b8235008-kube-api-access-7tzt8\") pod \"olm-operator-6b444d44fb-sbpds\" (UID: \"dd714e2b-d16f-4895-a438-b2f1b8235008\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sbpds" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.960175 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bvfhj" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.965665 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rm4kk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.981856 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s72k9\" (UniqueName: \"kubernetes.io/projected/8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f-kube-api-access-s72k9\") pod \"router-default-5444994796-8kvzk\" (UID: \"8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f\") " pod="openshift-ingress/router-default-5444994796-8kvzk" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.981949 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sbpds" Mar 13 09:16:01 crc kubenswrapper[4841]: I0313 09:16:01.987789 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9wrv\" (UniqueName: \"kubernetes.io/projected/344d8ef8-1693-4dcc-b09d-dc9fc7c041cc-kube-api-access-q9wrv\") pod \"auto-csr-approver-29556554-2dgrd\" (UID: \"344d8ef8-1693-4dcc-b09d-dc9fc7c041cc\") " pod="openshift-infra/auto-csr-approver-29556554-2dgrd" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.007610 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m4fpx" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.009483 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1fcd85b1-ea32-49f2-a4a4-ed5b9995249f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4d5n4\" (UID: \"1fcd85b1-ea32-49f2-a4a4-ed5b9995249f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d5n4" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.037361 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gt25\" (UniqueName: \"kubernetes.io/projected/9b8c5df2-d1ee-429f-a269-72122ee12baf-kube-api-access-9gt25\") pod \"machine-config-controller-84d6567774-6cdpl\" (UID: \"9b8c5df2-d1ee-429f-a269-72122ee12baf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cdpl" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.046773 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:02 crc kubenswrapper[4841]: E0313 09:16:02.047382 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:02.54736941 +0000 UTC m=+245.277269601 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.076688 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7czgg\" (UniqueName: \"kubernetes.io/projected/7676cbd6-d5e1-4ee8-a6ef-7ccf646d5205-kube-api-access-7czgg\") pod \"catalog-operator-68c6474976-khk8q\" (UID: \"7676cbd6-d5e1-4ee8-a6ef-7ccf646d5205\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khk8q" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.079394 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvvb9\" (UniqueName: \"kubernetes.io/projected/1745fdf3-fbb9-4736-a4da-b534d8c208bd-kube-api-access-fvvb9\") pod \"collect-profiles-29556555-zbvz4\" (UID: \"1745fdf3-fbb9-4736-a4da-b534d8c208bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.096385 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c9dbe40-1d26-4eec-b870-5ac342b005be-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-q7t6x\" (UID: \"4c9dbe40-1d26-4eec-b870-5ac342b005be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7t6x" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.107606 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzp5s\" (UniqueName: 
\"kubernetes.io/projected/ebf5c5f8-1870-470e-b63d-f84c99bdb936-kube-api-access-wzp5s\") pod \"dns-operator-744455d44c-qcfvl\" (UID: \"ebf5c5f8-1870-470e-b63d-f84c99bdb936\") " pod="openshift-dns-operator/dns-operator-744455d44c-qcfvl" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.107620 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4rfp"] Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.125142 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mkgp\" (UniqueName: \"kubernetes.io/projected/fe784c39-4b98-4b26-af66-d47a500ce697-kube-api-access-4mkgp\") pod \"package-server-manager-789f6589d5-xxwmv\" (UID: \"fe784c39-4b98-4b26-af66-d47a500ce697\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xxwmv" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.147357 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqbdf\" (UniqueName: \"kubernetes.io/projected/3f752e6f-1e5a-4694-80de-379a9c00ff96-kube-api-access-vqbdf\") pod \"service-ca-operator-777779d784-5rn52\" (UID: \"3f752e6f-1e5a-4694-80de-379a9c00ff96\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rn52" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.148060 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:02 crc kubenswrapper[4841]: E0313 09:16:02.149054 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 09:16:02.649035537 +0000 UTC m=+245.378935738 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.161665 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qcfvl" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.163604 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khk8q" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.187396 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cdpl" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.191305 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-lrsts"] Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.197745 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptb5f\" (UniqueName: \"kubernetes.io/projected/4a2c849d-df3a-45e3-b868-859cec0f55a0-kube-api-access-ptb5f\") pod \"packageserver-d55dfcdfc-7kz2p\" (UID: \"4a2c849d-df3a-45e3-b868-859cec0f55a0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.198026 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74mht\" (UniqueName: \"kubernetes.io/projected/fe9244af-7a05-4be3-bc6a-b8430139ce47-kube-api-access-74mht\") pod \"migrator-59844c95c7-6lggf\" (UID: \"fe9244af-7a05-4be3-bc6a-b8430139ce47\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6lggf" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.198506 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xxwmv" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.206040 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-8kvzk" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.215580 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rqcwd"] Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.216561 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dnx\" (UniqueName: \"kubernetes.io/projected/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-kube-api-access-s2dnx\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.218426 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mhstq"] Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.221536 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zdgw8"] Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.225425 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556554-2dgrd" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.235151 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7t6x" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.249410 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.253690 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:02 crc kubenswrapper[4841]: E0313 09:16:02.254329 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:02.75431207 +0000 UTC m=+245.484212261 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.262963 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f48vc\" (UniqueName: \"kubernetes.io/projected/86c4fd3a-5374-4b50-8f43-301205a71684-kube-api-access-f48vc\") pod \"dns-default-bbjqp\" (UID: \"86c4fd3a-5374-4b50-8f43-301205a71684\") " pod="openshift-dns/dns-default-bbjqp" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.268397 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mz6c\" (UniqueName: 
\"kubernetes.io/projected/a655e298-2051-4412-bbd7-b527adca30ff-kube-api-access-5mz6c\") pod \"machine-config-server-dtk79\" (UID: \"a655e298-2051-4412-bbd7-b527adca30ff\") " pod="openshift-machine-config-operator/machine-config-server-dtk79" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.272899 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.273431 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-v6vhc"] Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.283783 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dccqh\" (UniqueName: \"kubernetes.io/projected/363c526f-c53a-49f8-88bc-823b1ccde350-kube-api-access-dccqh\") pod \"csi-hostpathplugin-md725\" (UID: \"363c526f-c53a-49f8-88bc-823b1ccde350\") " pod="hostpath-provisioner/csi-hostpathplugin-md725" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.288207 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rn52" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.296558 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d5n4" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.303908 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpr2t\" (UniqueName: \"kubernetes.io/projected/f5961bba-4ec3-4b4e-b5a2-73aa1024326e-kube-api-access-wpr2t\") pod \"auto-csr-approver-29556556-c9nft\" (UID: \"f5961bba-4ec3-4b4e-b5a2-73aa1024326e\") " pod="openshift-infra/auto-csr-approver-29556556-c9nft" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.314643 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-bbjqp" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.321124 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-md725" Mar 13 09:16:02 crc kubenswrapper[4841]: W0313 09:16:02.324560 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podede1a06d_6caf_41ae_acfa_2335821a2e0e.slice/crio-00ee6097072f7aba9421071eab0544d4a7b8d105314d914e234b60cf803440c7 WatchSource:0}: Error finding container 00ee6097072f7aba9421071eab0544d4a7b8d105314d914e234b60cf803440c7: Status 404 returned error can't find the container with id 00ee6097072f7aba9421071eab0544d4a7b8d105314d914e234b60cf803440c7 Mar 13 09:16:02 crc kubenswrapper[4841]: W0313 09:16:02.326257 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d322779_6593_48b0_b504_792c5ae3b7a6.slice/crio-d0cce1099a2c48c9192877322785f4295c6c527d56648837b2cb2d26fd4469aa WatchSource:0}: Error finding container d0cce1099a2c48c9192877322785f4295c6c527d56648837b2cb2d26fd4469aa: Status 404 returned error can't find the container with id d0cce1099a2c48c9192877322785f4295c6c527d56648837b2cb2d26fd4469aa Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.326375 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6lggf" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.341631 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dtk79" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.362848 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:02 crc kubenswrapper[4841]: E0313 09:16:02.363013 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:02.862987388 +0000 UTC m=+245.592887579 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.363191 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:02 crc kubenswrapper[4841]: E0313 09:16:02.363752 4841 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:02.863735461 +0000 UTC m=+245.593635652 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:02 crc kubenswrapper[4841]: W0313 09:16:02.366656 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ec2bfef_c455_40b0_92a3_43f9fcd2bd8f.slice/crio-c48f3d4a744c94722a7b5d10115609e31adbc9baaa2cb621714fe1d26985da4b WatchSource:0}: Error finding container c48f3d4a744c94722a7b5d10115609e31adbc9baaa2cb621714fe1d26985da4b: Status 404 returned error can't find the container with id c48f3d4a744c94722a7b5d10115609e31adbc9baaa2cb621714fe1d26985da4b Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.463790 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:02 crc kubenswrapper[4841]: E0313 09:16:02.463933 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 09:16:02.963905693 +0000 UTC m=+245.693805884 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.464322 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:02 crc kubenswrapper[4841]: E0313 09:16:02.464746 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:02.964735149 +0000 UTC m=+245.694635370 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.568330 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:02 crc kubenswrapper[4841]: E0313 09:16:02.568662 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:03.068647237 +0000 UTC m=+245.798547428 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.576670 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556556-c9nft" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.671799 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:02 crc kubenswrapper[4841]: E0313 09:16:02.672304 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:03.172291107 +0000 UTC m=+245.902191298 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.687018 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wh6kv"] Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.697508 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vgm4n"] Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.774069 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:02 crc kubenswrapper[4841]: E0313 09:16:02.774569 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:03.274551315 +0000 UTC m=+246.004451496 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.790650 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8kvzk" event={"ID":"8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f","Type":"ContainerStarted","Data":"c48f3d4a744c94722a7b5d10115609e31adbc9baaa2cb621714fe1d26985da4b"} Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.792660 4841 generic.go:334] "Generic (PLEG): container finished" podID="b15787fb-bbd9-459b-bd8d-9f54eb62d8a3" containerID="f2d90fa0bbde0ad35aacc8a7f73feccd4935c1a6ebd5fd3831815096cf87ee6b" exitCode=0 Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.792703 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm" event={"ID":"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3","Type":"ContainerDied","Data":"f2d90fa0bbde0ad35aacc8a7f73feccd4935c1a6ebd5fd3831815096cf87ee6b"} Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 
09:16:02.793952 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zdgw8" event={"ID":"2151dc45-23c0-44b3-b896-0915da9f9d59","Type":"ContainerStarted","Data":"bb0f7699037ee302c18f8e3bc87e4d33e7b137f475d74f74b0c0e6cf11a4d8e7"} Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.798803 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dtk79" event={"ID":"a655e298-2051-4412-bbd7-b527adca30ff","Type":"ContainerStarted","Data":"ca0f3fd75470eafe6ea21475dc451a9a7e6c2ff88f65c158f75c5fe49938902e"} Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.801168 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-lrsts" event={"ID":"ede1a06d-6caf-41ae-acfa-2335821a2e0e","Type":"ContainerStarted","Data":"00ee6097072f7aba9421071eab0544d4a7b8d105314d914e234b60cf803440c7"} Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.805155 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rqcwd" event={"ID":"8d322779-6593-48b0-b504-792c5ae3b7a6","Type":"ContainerStarted","Data":"d0cce1099a2c48c9192877322785f4295c6c527d56648837b2cb2d26fd4469aa"} Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.807517 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" event={"ID":"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5","Type":"ContainerStarted","Data":"aafb05d18a8b640517f336e7355d6246eb720b94e1dccb1088d33d4b60ecd5da"} Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.808931 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mhstq" event={"ID":"d27990e2-0de9-4cb5-a162-5949047f5d93","Type":"ContainerStarted","Data":"9096b9592437be70c1254a1e00452cb9a537d3b177c6d6b31e6b81ec40da2ff9"} Mar 13 
09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.810592 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bfpp" event={"ID":"fa351e74-995f-4273-b0c6-ed8f57932d7b","Type":"ContainerStarted","Data":"d792d2ec0b853bfb7065159f58b063577dd9c8056e9d87caf655beead4b35897"} Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.813475 4841 generic.go:334] "Generic (PLEG): container finished" podID="a38e6005-63a6-4733-a269-f355c648fed4" containerID="e65e265ec8906e61945806c9d7b2d982723db36d55a0addddc420c93edb88e28" exitCode=0 Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.813520 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-69l88" event={"ID":"a38e6005-63a6-4733-a269-f355c648fed4","Type":"ContainerDied","Data":"e65e265ec8906e61945806c9d7b2d982723db36d55a0addddc420c93edb88e28"} Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.813535 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-69l88" event={"ID":"a38e6005-63a6-4733-a269-f355c648fed4","Type":"ContainerStarted","Data":"9838337eb0f8cda83be31bd766b218823c757eeb8999695650130f9fd3cce7b0"} Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.823162 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" event={"ID":"293f3a0e-401d-440f-8321-0aac18b90219","Type":"ContainerStarted","Data":"fdb2201f290d04899f9d0a955db359bc157600c136bb62af54c243cc1c5c322a"} Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.824483 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.837707 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v6vhc" 
event={"ID":"60193856-8a3f-4ce0-b79d-44e58de19b06","Type":"ContainerStarted","Data":"00fc187cb3ba96d7fcaa5d61f31452043338a7de0e9d2814285f512a8d79f9c3"} Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.840600 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4rfp" event={"ID":"f83ab80b-b5bd-4274-99b9-b364923326bf","Type":"ContainerStarted","Data":"dd402f8cb3906d185b04ced8d79e4d0f984880a2edfd5ff77de3c1488abc412f"} Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.840634 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4rfp" event={"ID":"f83ab80b-b5bd-4274-99b9-b364923326bf","Type":"ContainerStarted","Data":"930724bbb9cdf877e987dfc86e709c7a9c742e6e747e79d8b996792ae080915d"} Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.841450 4841 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-zsbmr container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" start-of-body= Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.841480 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" podUID="293f3a0e-401d-440f-8321-0aac18b90219" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.844211 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs" event={"ID":"e98c46b0-e56f-4a29-a194-a39fc2401cfa","Type":"ContainerStarted","Data":"efb231d6fce1bc16e4c0d58245d721007a0eaeef91d0a131f658951e177853ad"} Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.844276 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs" event={"ID":"e98c46b0-e56f-4a29-a194-a39fc2401cfa","Type":"ContainerStarted","Data":"2b83358b44b9fb5121c80e1f87383b45850d664e9f0affed82272dcaf660d33f"} Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.844668 4841 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-86p7z container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.844705 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z" podUID="4d40b190-4e17-46d9-85f7-f4062ea2fc47" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.879511 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:02 crc kubenswrapper[4841]: E0313 09:16:02.891237 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:03.391220313 +0000 UTC m=+246.121120504 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.944354 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bvfhj"] Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.946619 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ckw4c"] Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.982130 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:02 crc kubenswrapper[4841]: E0313 09:16:02.982550 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:03.482524837 +0000 UTC m=+246.212425028 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:02 crc kubenswrapper[4841]: I0313 09:16:02.982926 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:03 crc kubenswrapper[4841]: E0313 09:16:03.008689 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:03.508668297 +0000 UTC m=+246.238568488 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.095170 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:03 crc kubenswrapper[4841]: E0313 09:16:03.095656 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:03.595639234 +0000 UTC m=+246.325539425 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.106407 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-2sr9r" podStartSLOduration=176.106388621 podStartE2EDuration="2m56.106388621s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:03.054248956 +0000 UTC m=+245.784149217" watchObservedRunningTime="2026-03-13 09:16:03.106388621 +0000 UTC m=+245.836288812" Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.108667 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vjcp6"] Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.123572 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg" Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.164693 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z" podStartSLOduration=176.16467507 podStartE2EDuration="2m56.16467507s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:03.129822147 +0000 UTC m=+245.859722338" 
watchObservedRunningTime="2026-03-13 09:16:03.16467507 +0000 UTC m=+245.894575251" Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.193977 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qlg79"] Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.197075 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:03 crc kubenswrapper[4841]: E0313 09:16:03.197455 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:03.697442617 +0000 UTC m=+246.427342808 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.234509 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sbpds"] Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.250376 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6t6zf"] Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.277622 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rm4kk"] Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.298075 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:03 crc kubenswrapper[4841]: E0313 09:16:03.298225 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:03.798204487 +0000 UTC m=+246.528104678 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.298562 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:03 crc kubenswrapper[4841]: E0313 09:16:03.298966 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:03.798951321 +0000 UTC m=+246.528851512 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.404512 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:03 crc kubenswrapper[4841]: E0313 09:16:03.405728 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:03.905710498 +0000 UTC m=+246.635610689 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.441052 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg" podStartSLOduration=176.441035357 podStartE2EDuration="2m56.441035357s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:03.404866443 +0000 UTC m=+246.134766634" watchObservedRunningTime="2026-03-13 09:16:03.441035357 +0000 UTC m=+246.170935548" Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.506676 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:03 crc kubenswrapper[4841]: E0313 09:16:03.506938 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:04.006927103 +0000 UTC m=+246.736827284 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.541554 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-wxvhj" podStartSLOduration=176.541539599 podStartE2EDuration="2m56.541539599s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:03.541469867 +0000 UTC m=+246.271370048" watchObservedRunningTime="2026-03-13 09:16:03.541539599 +0000 UTC m=+246.271439790" Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.615794 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:03 crc kubenswrapper[4841]: E0313 09:16:03.616178 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:04.116161339 +0000 UTC m=+246.846061530 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.673064 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556554-2dgrd"] Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.681320 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m4fpx"] Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.702774 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qcfvl"] Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.722772 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:03 crc kubenswrapper[4841]: E0313 09:16:03.723189 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:04.223178045 +0000 UTC m=+246.953078236 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.732866 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4"] Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.743659 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.823538 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:03 crc kubenswrapper[4841]: E0313 09:16:03.823746 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:04.323722888 +0000 UTC m=+247.053623069 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.824247 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:03 crc kubenswrapper[4841]: E0313 09:16:03.824589 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:04.324576215 +0000 UTC m=+247.054476406 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.834932 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6cdpl"] Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.845823 4841 patch_prober.go:28] interesting pod/console-operator-58897d9998-2sr9r container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.845887 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-2sr9r" podUID="3afaa466-07d9-4168-a4bc-4af2c328f8fe" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.6:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.862569 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7t6x"] Mar 13 09:16:03 crc kubenswrapper[4841]: W0313 09:16:03.916908 4841 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c9dbe40_1d26_4eec_b870_5ac342b005be.slice/crio-2042459af499f6a1e3efe7da520c46b6cfcecf862c4bbd8d8f688285411bf104 WatchSource:0}: Error finding container 2042459af499f6a1e3efe7da520c46b6cfcecf862c4bbd8d8f688285411bf104: Status 404 returned error can't find the container with id 2042459af499f6a1e3efe7da520c46b6cfcecf862c4bbd8d8f688285411bf104 Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.917108 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5rn52"] Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.917135 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rm4kk" event={"ID":"fff23c02-970f-41d9-9f60-ec0664b17186","Type":"ContainerStarted","Data":"134f78fab930d22af7829f8bf18011ceb6fa061f9c73be3b7e8701be3c5aba1f"} Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.926611 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:03 crc kubenswrapper[4841]: E0313 09:16:03.927419 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:04.42738785 +0000 UTC m=+247.157288031 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:03 crc kubenswrapper[4841]: W0313 09:16:03.941370 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b8c5df2_d1ee_429f_a269_72122ee12baf.slice/crio-20ad4643bb079876f6264eab5bfdda458bc0ab1d784bd06216aeb384bd51e8da WatchSource:0}: Error finding container 20ad4643bb079876f6264eab5bfdda458bc0ab1d784bd06216aeb384bd51e8da: Status 404 returned error can't find the container with id 20ad4643bb079876f6264eab5bfdda458bc0ab1d784bd06216aeb384bd51e8da Mar 13 09:16:03 crc kubenswrapper[4841]: W0313 09:16:03.949406 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f752e6f_1e5a_4694_80de_379a9c00ff96.slice/crio-f9518eaa13d2d3894b468e5061982dc4b0a815787d4fa4c46b47bb88b1100f2b WatchSource:0}: Error finding container f9518eaa13d2d3894b468e5061982dc4b0a815787d4fa4c46b47bb88b1100f2b: Status 404 returned error can't find the container with id f9518eaa13d2d3894b468e5061982dc4b0a815787d4fa4c46b47bb88b1100f2b Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.965598 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-69l88" event={"ID":"a38e6005-63a6-4733-a269-f355c648fed4","Type":"ContainerStarted","Data":"a6688b4676b7b5f3fc88797c7e2669559948a4539a74551def57b999ecc32633"} Mar 13 09:16:03 crc kubenswrapper[4841]: I0313 09:16:03.966348 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-69l88" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.007908 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qtpx7" podStartSLOduration=177.007890414 podStartE2EDuration="2m57.007890414s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:03.99561152 +0000 UTC m=+246.725511711" watchObservedRunningTime="2026-03-13 09:16:04.007890414 +0000 UTC m=+246.737790605" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.015898 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qcfvl" event={"ID":"ebf5c5f8-1870-470e-b63d-f84c99bdb936","Type":"ContainerStarted","Data":"28b00bdf4dd437a0a9e700156410d27500e02d25da2d42514084163b151f10a2"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.027985 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:04 crc kubenswrapper[4841]: E0313 09:16:04.029069 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:04.529057388 +0000 UTC m=+247.258957579 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.039897 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6lggf"] Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.039950 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khk8q"] Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.058035 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm" event={"ID":"b15787fb-bbd9-459b-bd8d-9f54eb62d8a3","Type":"ContainerStarted","Data":"fa6ce094ed5b3e2237ba4c7bb217d9b13bd981488d9c2c2c9a706526b33e551e"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.072754 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-md725"] Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.095804 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zdgw8" event={"ID":"2151dc45-23c0-44b3-b896-0915da9f9d59","Type":"ContainerStarted","Data":"f785a63ee42b1ae7953b3c50e354d2c0ad4f01163ed63575f069ceee3ba431cb"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.095854 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zdgw8" 
event={"ID":"2151dc45-23c0-44b3-b896-0915da9f9d59","Type":"ContainerStarted","Data":"1894154d7a29af337bf357e8a2a44085be2f89e84a15e390d8a0f01bdd59582e"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.114473 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qlg79" event={"ID":"747a24f9-e654-421a-8da1-0be0aa6ccd9b","Type":"ContainerStarted","Data":"5612b287267740500e9df870f37c19135d457b5544b4fc2e3ff74421929f0fb5"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.129795 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:04 crc kubenswrapper[4841]: E0313 09:16:04.130915 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:04.630891552 +0000 UTC m=+247.360791813 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.135865 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xxwmv"] Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.177842 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" podStartSLOduration=177.177823144 podStartE2EDuration="2m57.177823144s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:04.148232426 +0000 UTC m=+246.878132617" watchObservedRunningTime="2026-03-13 09:16:04.177823144 +0000 UTC m=+246.907723335" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.180022 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p"] Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.183232 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556556-c9nft"] Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.195303 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm" podStartSLOduration=177.195282771 podStartE2EDuration="2m57.195282771s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:04.187956482 +0000 UTC m=+246.917856673" watchObservedRunningTime="2026-03-13 09:16:04.195282771 +0000 UTC m=+246.925182962" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.197466 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8kvzk" event={"ID":"8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f","Type":"ContainerStarted","Data":"cca61c6b696eccbfd32ef6dfeafc0f8f6bf2742c270c67b8f3bf3da3582cc6b6"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.208496 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-8kvzk" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.215574 4841 patch_prober.go:28] interesting pod/router-default-5444994796-8kvzk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 09:16:04 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Mar 13 09:16:04 crc kubenswrapper[4841]: [+]process-running ok Mar 13 09:16:04 crc kubenswrapper[4841]: healthz check failed Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.215627 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8kvzk" podUID="8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.216341 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mhstq" event={"ID":"d27990e2-0de9-4cb5-a162-5949047f5d93","Type":"ContainerStarted","Data":"617b3e31451194be4fffc722fd8cd16edca4be4c71e31c0d662d857a476ff3ce"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.234834 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:04 crc kubenswrapper[4841]: E0313 09:16:04.235886 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:04.735871555 +0000 UTC m=+247.465771746 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.265114 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4d5n4"] Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.266357 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4" event={"ID":"1745fdf3-fbb9-4736-a4da-b534d8c208bd","Type":"ContainerStarted","Data":"722d1bd2bd758e29410698d60bad89f016b0e2587f1aaf15b9d0b500655e1d5b"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.284768 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bbjqp"] Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.285798 4841 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vgm4n" event={"ID":"f8c15ccf-dcff-4cb5-9bc3-02a20faafdfa","Type":"ContainerStarted","Data":"9f4b1a3e398fcdee0cccdc1f64bcacae2d75090cb0d7cbb3b49d8ddaf8074941"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.285845 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vgm4n" event={"ID":"f8c15ccf-dcff-4cb5-9bc3-02a20faafdfa","Type":"ContainerStarted","Data":"a77a00d53a2822b3ecb5f1e24d3e1c122d03d289236115d8674907dc4f54459a"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.299459 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bfpp" podStartSLOduration=177.299442808 podStartE2EDuration="2m57.299442808s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:04.297213108 +0000 UTC m=+247.027113289" watchObservedRunningTime="2026-03-13 09:16:04.299442808 +0000 UTC m=+247.029342999" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.322568 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rqcwd" event={"ID":"8d322779-6593-48b0-b504-792c5ae3b7a6","Type":"ContainerStarted","Data":"08b31510726a2d29da6462b88839ab929a0c9e87380cc6f51410c9786df6f0fe"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.341745 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:04 crc kubenswrapper[4841]: E0313 09:16:04.343066 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:04.843050646 +0000 UTC m=+247.572950837 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.344837 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-hslbs" podStartSLOduration=177.344821821 podStartE2EDuration="2m57.344821821s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:04.325813995 +0000 UTC m=+247.055714186" watchObservedRunningTime="2026-03-13 09:16:04.344821821 +0000 UTC m=+247.074722012" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.365022 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" event={"ID":"fad88931-0cb1-40fd-b256-f9cd1c93a7e6","Type":"ContainerStarted","Data":"31b074416c805d3a73651dad8947d6445d7cfff8604bc59e7f1ada047e6df69e"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.365479 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.366156 4841 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6t6zf container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.366186 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" podUID="fad88931-0cb1-40fd-b256-f9cd1c93a7e6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.376472 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v6vhc" event={"ID":"60193856-8a3f-4ce0-b79d-44e58de19b06","Type":"ContainerStarted","Data":"7ebd0bf34b0eb12d099c0e9b051e9dcccfc5a04f9e683dae8b7ee332d28829f7"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.426743 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.426817 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.444961 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.446897 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-69l88" podStartSLOduration=177.446887713 podStartE2EDuration="2m57.446887713s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:04.435672311 +0000 UTC m=+247.165572502" watchObservedRunningTime="2026-03-13 09:16:04.446887713 +0000 UTC m=+247.176787904" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.447237 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q4rfp" podStartSLOduration=177.447232043 podStartE2EDuration="2m57.447232043s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:04.364741766 +0000 UTC m=+247.094641967" watchObservedRunningTime="2026-03-13 09:16:04.447232043 +0000 UTC m=+247.177132234" Mar 13 09:16:04 crc kubenswrapper[4841]: E0313 09:16:04.447510 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:04.947495161 +0000 UTC m=+247.677395352 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.480627 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rqcwd" podStartSLOduration=177.48060748 podStartE2EDuration="2m57.48060748s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:04.479227836 +0000 UTC m=+247.209128027" watchObservedRunningTime="2026-03-13 09:16:04.48060748 +0000 UTC m=+247.210507671" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.488365 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-lrsts" event={"ID":"ede1a06d-6caf-41ae-acfa-2335821a2e0e","Type":"ContainerStarted","Data":"02826389533ba5c4c8ea408928c315dedb0afef4c47373f37e67bb099c31f0f8"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.489350 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-lrsts" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.493015 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-lrsts container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.493044 4841 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lrsts" podUID="ede1a06d-6caf-41ae-acfa-2335821a2e0e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.496362 4841 ???:1] "http: TLS handshake error from 192.168.126.11:40608: no serving certificate available for the kubelet" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.527591 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ckw4c" event={"ID":"f6103ec2-c668-4a2b-adfe-d31fb06bb5e3","Type":"ContainerStarted","Data":"0eefbe4eacc7073bcb8108332b0e7b38b579807fe036313ceb42fcf776760dc8"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.530648 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-v6vhc" podStartSLOduration=177.530634609 podStartE2EDuration="2m57.530634609s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:04.53035666 +0000 UTC m=+247.260256851" watchObservedRunningTime="2026-03-13 09:16:04.530634609 +0000 UTC m=+247.260534800" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.546986 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:04 crc kubenswrapper[4841]: E0313 09:16:04.547662 4841 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:05.047641012 +0000 UTC m=+247.777541213 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.548518 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wh6kv" event={"ID":"adeab6d7-21b6-4ef2-afdb-75854f0914c5","Type":"ContainerStarted","Data":"594c0a8f8474ef6c8a4766917d053053a175527d2a8ea77b7fef5132cf549706"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.548556 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wh6kv" event={"ID":"adeab6d7-21b6-4ef2-afdb-75854f0914c5","Type":"ContainerStarted","Data":"116b0584a822728162da70849aa583d84dc108b24d9dd30a0769a3aafbdc4e0b"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.560648 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dtk79" event={"ID":"a655e298-2051-4412-bbd7-b527adca30ff","Type":"ContainerStarted","Data":"9bd50df849c72ed185ea595dc45ca85e6aa00acd00ad8096dd86faf6dbbd665f"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.577560 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m4fpx" 
event={"ID":"ba0466cd-0dd4-47b7-92cf-f445820086e4","Type":"ContainerStarted","Data":"080c6fbe32c915ef29efac5113e32f957870efdfef94e75b77f46211deb09f6b"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.581556 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" podStartSLOduration=177.581535194 podStartE2EDuration="2m57.581535194s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:04.578939183 +0000 UTC m=+247.308839374" watchObservedRunningTime="2026-03-13 09:16:04.581535194 +0000 UTC m=+247.311435385" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.588195 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556554-2dgrd" event={"ID":"344d8ef8-1693-4dcc-b09d-dc9fc7c041cc","Type":"ContainerStarted","Data":"6bdd5ab6e513bd07ea184c77959d7010dcf67516ee7e4f829ec7f70fe4d1c99f"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.588951 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.617139 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vjcp6" event={"ID":"7a0b7635-6423-4716-93b4-c8e9b3012e55","Type":"ContainerStarted","Data":"575e34fcef13c0b13881d658cdb991178206e5b26138a9e6227bcd939a8f7e5e"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.632816 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-8kvzk" podStartSLOduration=177.632798442 podStartE2EDuration="2m57.632798442s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:04.63174494 +0000 UTC m=+247.361645141" watchObservedRunningTime="2026-03-13 09:16:04.632798442 +0000 UTC m=+247.362698623" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.641598 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sbpds" event={"ID":"dd714e2b-d16f-4895-a438-b2f1b8235008","Type":"ContainerStarted","Data":"fadfb4603b021895a5118979633815dc82739511b409e3ab9cad7b17d8ef52e8"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.641638 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sbpds" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.654409 4841 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-sbpds container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.654460 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sbpds" podUID="dd714e2b-d16f-4895-a438-b2f1b8235008" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.657691 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:04 crc 
kubenswrapper[4841]: E0313 09:16:04.658127 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:05.158116816 +0000 UTC m=+247.888017007 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.658258 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mhstq" podStartSLOduration=177.658248251 podStartE2EDuration="2m57.658248251s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:04.657427855 +0000 UTC m=+247.387328046" watchObservedRunningTime="2026-03-13 09:16:04.658248251 +0000 UTC m=+247.388148442" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.671420 4841 ???:1] "http: TLS handshake error from 192.168.126.11:40618: no serving certificate available for the kubelet" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.671686 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bvfhj" event={"ID":"249068ea-529c-4fe4-b357-0b35a9cfd5d7","Type":"ContainerStarted","Data":"eac8ab446f57797b578be784291befde41a52b7f470660cc25bc2cdba0513113"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 
09:16:04.671724 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bvfhj" event={"ID":"249068ea-529c-4fe4-b357-0b35a9cfd5d7","Type":"ContainerStarted","Data":"9975b3a89ecf8138f5608a295da40fea1d8200e03a78093131976c63d00bfda3"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.721065 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" event={"ID":"c205e4f8-77f0-43a5-8b2a-cccc1381ecd5","Type":"ContainerStarted","Data":"8488ba6af4df7a860cccb45238de5f3eabeaf04819cb466b153fe5bfdadb8f0e"} Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.748980 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.749316 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vgm4n" podStartSLOduration=177.749300267 podStartE2EDuration="2m57.749300267s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:04.711423318 +0000 UTC m=+247.441323509" watchObservedRunningTime="2026-03-13 09:16:04.749300267 +0000 UTC m=+247.479200458" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.750475 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-zdgw8" podStartSLOduration=177.750470633 podStartE2EDuration="2m57.750470633s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:04.749201694 +0000 UTC m=+247.479101885" 
watchObservedRunningTime="2026-03-13 09:16:04.750470633 +0000 UTC m=+247.480370824" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.760150 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:04 crc kubenswrapper[4841]: E0313 09:16:04.761611 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:05.261595972 +0000 UTC m=+247.991496153 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.787341 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-lrsts" podStartSLOduration=177.787323539 podStartE2EDuration="2m57.787323539s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:04.787207735 +0000 UTC m=+247.517107926" watchObservedRunningTime="2026-03-13 09:16:04.787323539 +0000 UTC m=+247.517223730" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.828170 4841 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-2sr9r" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.835240 4841 ???:1] "http: TLS handshake error from 192.168.126.11:40634: no serving certificate available for the kubelet" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.863009 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:04 crc kubenswrapper[4841]: E0313 09:16:04.863327 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:05.363314112 +0000 UTC m=+248.093214303 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.872307 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" podStartSLOduration=177.872255823 podStartE2EDuration="2m57.872255823s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:04.870732394 +0000 UTC m=+247.600632575" watchObservedRunningTime="2026-03-13 09:16:04.872255823 +0000 UTC m=+247.602156014" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.920928 4841 ???:1] "http: TLS handshake error from 192.168.126.11:40650: no serving certificate available for the kubelet" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.928043 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sbpds" podStartSLOduration=177.928025402 podStartE2EDuration="2m57.928025402s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:04.901677295 +0000 UTC m=+247.631577486" watchObservedRunningTime="2026-03-13 09:16:04.928025402 +0000 UTC m=+247.657925593" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.970225 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bvfhj" podStartSLOduration=177.970209625 podStartE2EDuration="2m57.970209625s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:04.969640277 +0000 UTC m=+247.699540468" watchObservedRunningTime="2026-03-13 09:16:04.970209625 +0000 UTC m=+247.700109806" Mar 13 09:16:04 crc kubenswrapper[4841]: I0313 09:16:04.971858 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:04 crc kubenswrapper[4841]: E0313 09:16:04.972165 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:05.472149585 +0000 UTC m=+248.202049776 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.007950 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wh6kv" podStartSLOduration=178.007929707 podStartE2EDuration="2m58.007929707s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:04.999816653 +0000 UTC m=+247.729716844" watchObservedRunningTime="2026-03-13 09:16:05.007929707 +0000 UTC m=+247.737829898" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.026971 4841 ???:1] "http: TLS handshake error from 192.168.126.11:40662: no serving certificate available for the kubelet" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.075309 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:05 crc kubenswrapper[4841]: E0313 09:16:05.075567 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-13 09:16:05.575554788 +0000 UTC m=+248.305454979 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.121393 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-dtk79" podStartSLOduration=6.121375085 podStartE2EDuration="6.121375085s" podCreationTimestamp="2026-03-13 09:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:05.121334914 +0000 UTC m=+247.851235095" watchObservedRunningTime="2026-03-13 09:16:05.121375085 +0000 UTC m=+247.851275276" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.143051 4841 ???:1] "http: TLS handshake error from 192.168.126.11:40668: no serving certificate available for the kubelet" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.176124 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:05 crc kubenswrapper[4841]: E0313 09:16:05.176500 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-03-13 09:16:05.676486264 +0000 UTC m=+248.406386445 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.218566 4841 patch_prober.go:28] interesting pod/router-default-5444994796-8kvzk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 09:16:05 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Mar 13 09:16:05 crc kubenswrapper[4841]: [+]process-running ok Mar 13 09:16:05 crc kubenswrapper[4841]: healthz check failed Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.218870 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8kvzk" podUID="8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.280479 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:05 crc kubenswrapper[4841]: E0313 09:16:05.280910 4841 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:05.780889298 +0000 UTC m=+248.510789489 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.298607 4841 ???:1] "http: TLS handshake error from 192.168.126.11:40680: no serving certificate available for the kubelet" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.381668 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:05 crc kubenswrapper[4841]: E0313 09:16:05.382111 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:05.882094792 +0000 UTC m=+248.611994983 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.406950 4841 ???:1] "http: TLS handshake error from 192.168.126.11:40692: no serving certificate available for the kubelet" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.488802 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:05 crc kubenswrapper[4841]: E0313 09:16:05.489112 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:05.989100237 +0000 UTC m=+248.719000428 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.495196 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-86p7z"] Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.495430 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z" podUID="4d40b190-4e17-46d9-85f7-f4062ea2fc47" containerName="controller-manager" containerID="cri-o://9f45a5c900e6d258fa313fcb6c32460d268e1a72cf2206bfd94a5930bb6d0a78" gracePeriod=30 Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.512589 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.560881 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg"] Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.589571 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:05 crc kubenswrapper[4841]: E0313 09:16:05.589915 4841 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:06.089891058 +0000 UTC m=+248.819791249 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.690722 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:05 crc kubenswrapper[4841]: E0313 09:16:05.691081 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:06.191069951 +0000 UTC m=+248.920970142 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.731211 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p" event={"ID":"4a2c849d-df3a-45e3-b868-859cec0f55a0","Type":"ContainerStarted","Data":"2f5d86daab73ff9e444027de2e44b079cc88979bc1e8c6fa578209504afeedf0"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.731616 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p" event={"ID":"4a2c849d-df3a-45e3-b868-859cec0f55a0","Type":"ContainerStarted","Data":"3b18512ea492b73bd0c2a68809d5928b0ff239f681129571707c45a19a3f3cd5"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.731930 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.733698 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sbpds" event={"ID":"dd714e2b-d16f-4895-a438-b2f1b8235008","Type":"ContainerStarted","Data":"2ce16349bd04bc876b63dde3b2606b2666a476c3c815fb6072eaaaccf6dbc0fc"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.734159 4841 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-sbpds container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 
10.217.0.38:8443: connect: connection refused" start-of-body= Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.734189 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sbpds" podUID="dd714e2b-d16f-4895-a438-b2f1b8235008" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.736294 4841 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-7kz2p container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body= Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.736536 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p" podUID="4a2c849d-df3a-45e3-b868-859cec0f55a0" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.741624 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7t6x" event={"ID":"4c9dbe40-1d26-4eec-b870-5ac342b005be","Type":"ContainerStarted","Data":"5cc16398ca0ce76cf753135facd3de26dd289e972c6b6497871cb00b1b159bea"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.741661 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7t6x" event={"ID":"4c9dbe40-1d26-4eec-b870-5ac342b005be","Type":"ContainerStarted","Data":"2042459af499f6a1e3efe7da520c46b6cfcecf862c4bbd8d8f688285411bf104"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.755733 4841 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d5n4" event={"ID":"1fcd85b1-ea32-49f2-a4a4-ed5b9995249f","Type":"ContainerStarted","Data":"46190c4265fbb61618e7982ba8bab03191683435db0933641cb94ebe5ea32021"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.755774 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d5n4" event={"ID":"1fcd85b1-ea32-49f2-a4a4-ed5b9995249f","Type":"ContainerStarted","Data":"273e5ee0166578a0e864596de9fe141496b39076685504e34f626122e8745633"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.755783 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d5n4" event={"ID":"1fcd85b1-ea32-49f2-a4a4-ed5b9995249f","Type":"ContainerStarted","Data":"3810130890eace0a5e389ff1bcc3d907790afb11a43eebf1ebc7b3e9a1957a01"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.770591 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rm4kk" event={"ID":"fff23c02-970f-41d9-9f60-ec0664b17186","Type":"ContainerStarted","Data":"cedaf9c5a739cedb96c19d5b08b41c39ff1d54ddf12f577e7d405f12b70d3c82"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.797054 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.797294 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6lggf" event={"ID":"fe9244af-7a05-4be3-bc6a-b8430139ce47","Type":"ContainerStarted","Data":"b21f825fd1eb9b7462cf20bfd5b3e50a38165b869d67d79a1e30bc678a515fad"} Mar 13 
09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.797329 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6lggf" event={"ID":"fe9244af-7a05-4be3-bc6a-b8430139ce47","Type":"ContainerStarted","Data":"4832cffc179bd6933544fee992887439dca4d806c582d0985a00e75537a420e6"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.797344 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6lggf" event={"ID":"fe9244af-7a05-4be3-bc6a-b8430139ce47","Type":"ContainerStarted","Data":"bbcfdd6c3eac67a98b946e7b268deab2b985bf7f01c305bc0ff93128cd019eda"} Mar 13 09:16:05 crc kubenswrapper[4841]: E0313 09:16:05.797505 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:06.297486739 +0000 UTC m=+249.027386930 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.798119 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:05 crc kubenswrapper[4841]: E0313 09:16:05.799872 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:06.299858983 +0000 UTC m=+249.029759174 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.809783 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p" podStartSLOduration=178.809760754 podStartE2EDuration="2m58.809760754s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:05.776836992 +0000 UTC m=+248.506737183" watchObservedRunningTime="2026-03-13 09:16:05.809760754 +0000 UTC m=+248.539660945" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.822959 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-md725" event={"ID":"363c526f-c53a-49f8-88bc-823b1ccde350","Type":"ContainerStarted","Data":"148fa48dbcfb9af4eb0556d8005f2e35e8ab0505758c4d04e7dc22b0285ed057"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.827843 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-rm4kk" podStartSLOduration=178.8278104 podStartE2EDuration="2m58.8278104s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:05.826594942 +0000 UTC m=+248.556495143" watchObservedRunningTime="2026-03-13 09:16:05.8278104 +0000 UTC m=+248.557710591" Mar 13 09:16:05 crc kubenswrapper[4841]: 
I0313 09:16:05.828380 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7t6x" podStartSLOduration=178.828372398 podStartE2EDuration="2m58.828372398s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:05.802293349 +0000 UTC m=+248.532193560" watchObservedRunningTime="2026-03-13 09:16:05.828372398 +0000 UTC m=+248.558272599" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.852672 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" event={"ID":"fad88931-0cb1-40fd-b256-f9cd1c93a7e6","Type":"ContainerStarted","Data":"ca8d720134276e45f89eb1988d60c9dd00b24ad319c7c2a11b282c50d94c6a0a"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.853621 4841 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6t6zf container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.853669 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" podUID="fad88931-0cb1-40fd-b256-f9cd1c93a7e6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.867791 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4" event={"ID":"1745fdf3-fbb9-4736-a4da-b534d8c208bd","Type":"ContainerStarted","Data":"c140fa68774d5707fef40bcf227291d5b8cf288c1244f6933833bc2b04e31f7a"} 
Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.870639 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6lggf" podStartSLOduration=178.870618843 podStartE2EDuration="2m58.870618843s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:05.863883371 +0000 UTC m=+248.593783562" watchObservedRunningTime="2026-03-13 09:16:05.870618843 +0000 UTC m=+248.600519034" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.871049 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d5n4" podStartSLOduration=178.871044036 podStartE2EDuration="2m58.871044036s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:05.846665442 +0000 UTC m=+248.576565633" watchObservedRunningTime="2026-03-13 09:16:05.871044036 +0000 UTC m=+248.600944227" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.881659 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4" podStartSLOduration=65.881639568 podStartE2EDuration="1m5.881639568s" podCreationTimestamp="2026-03-13 09:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:05.881405031 +0000 UTC m=+248.611305232" watchObservedRunningTime="2026-03-13 09:16:05.881639568 +0000 UTC m=+248.611539759" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.883965 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ckw4c" 
event={"ID":"f6103ec2-c668-4a2b-adfe-d31fb06bb5e3","Type":"ContainerStarted","Data":"ebbd10e6becf2efc593a9f26e275807db7c98610189ff78b32d19e7a99a3eccf"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.884012 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ckw4c" event={"ID":"f6103ec2-c668-4a2b-adfe-d31fb06bb5e3","Type":"ContainerStarted","Data":"b0a4801bfb1f9be72b7db4ec0253ff44adf0a09f5d7b02c871b40c4f93464a3c"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.897131 4841 generic.go:334] "Generic (PLEG): container finished" podID="4d40b190-4e17-46d9-85f7-f4062ea2fc47" containerID="9f45a5c900e6d258fa313fcb6c32460d268e1a72cf2206bfd94a5930bb6d0a78" exitCode=0 Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.897205 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z" event={"ID":"4d40b190-4e17-46d9-85f7-f4062ea2fc47","Type":"ContainerDied","Data":"9f45a5c900e6d258fa313fcb6c32460d268e1a72cf2206bfd94a5930bb6d0a78"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.899157 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:05 crc kubenswrapper[4841]: E0313 09:16:05.899422 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:06.399400526 +0000 UTC m=+249.129300747 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.899560 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.901442 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khk8q" event={"ID":"7676cbd6-d5e1-4ee8-a6ef-7ccf646d5205","Type":"ContainerStarted","Data":"d21d29045ae8cfca06f4cb626819f8f34c563c58bba0d71357d5496962a2b3d3"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.901485 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khk8q" event={"ID":"7676cbd6-d5e1-4ee8-a6ef-7ccf646d5205","Type":"ContainerStarted","Data":"01819729c8929ec160b171025939cdb01db447e6c29c404072bea5364ee1cd56"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.901920 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khk8q" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.904235 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ckw4c" podStartSLOduration=178.904226377 podStartE2EDuration="2m58.904226377s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:05.902746261 +0000 UTC m=+248.632646452" watchObservedRunningTime="2026-03-13 09:16:05.904226377 +0000 UTC m=+248.634126568" Mar 13 09:16:05 crc kubenswrapper[4841]: E0313 09:16:05.905498 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:06.405486106 +0000 UTC m=+249.135386357 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.907466 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qlg79" event={"ID":"747a24f9-e654-421a-8da1-0be0aa6ccd9b","Type":"ContainerStarted","Data":"e94103cbf5b588e9374d522d4fb72bf8d311edd594b2419c2a40c81d6166c99c"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.907503 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qlg79" event={"ID":"747a24f9-e654-421a-8da1-0be0aa6ccd9b","Type":"ContainerStarted","Data":"920685d8f6cc2ed7689191eff90b16b2ac0e91bea60028129e207d998685b36a"} Mar 13 09:16:05 crc 
kubenswrapper[4841]: I0313 09:16:05.915542 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556556-c9nft" event={"ID":"f5961bba-4ec3-4b4e-b5a2-73aa1024326e","Type":"ContainerStarted","Data":"23f6237f50f75bf08a38f50dae37c774516c746360ea9ba7dd28d0cdd2bdae37"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.938966 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bbjqp" event={"ID":"86c4fd3a-5374-4b50-8f43-301205a71684","Type":"ContainerStarted","Data":"c8ac87963dbcbdb501717fa2f1896e434a3ddcc38682ae3224f310f85767620a"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.939009 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bbjqp" event={"ID":"86c4fd3a-5374-4b50-8f43-301205a71684","Type":"ContainerStarted","Data":"7edf4fad9273644f6f95a1d47ebeb589e386abbb2e816eba3b535cdbefc17a52"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.942777 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khk8q" podStartSLOduration=178.942750954 podStartE2EDuration="2m58.942750954s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:05.932209914 +0000 UTC m=+248.662110115" watchObservedRunningTime="2026-03-13 09:16:05.942750954 +0000 UTC m=+248.672651155" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.943035 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.944226 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.959192 4841 patch_prober.go:28] 
interesting pod/catalog-operator-68c6474976-khk8q container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.959692 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khk8q" podUID="7676cbd6-d5e1-4ee8-a6ef-7ccf646d5205" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.960997 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m4fpx" event={"ID":"ba0466cd-0dd4-47b7-92cf-f445820086e4","Type":"ContainerStarted","Data":"4b508b1fae8fdb60930ad90e6cc3229cd436d190432cb81bd0b831cf087f2faf"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.966724 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-qlg79" podStartSLOduration=178.966712157 podStartE2EDuration="2m58.966712157s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:05.964043283 +0000 UTC m=+248.693943484" watchObservedRunningTime="2026-03-13 09:16:05.966712157 +0000 UTC m=+248.696612348" Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.986099 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xxwmv" event={"ID":"fe784c39-4b98-4b26-af66-d47a500ce697","Type":"ContainerStarted","Data":"88eb2fd661d655cbdb9ba6468b4a05a36d99234a0e45f197aa1cb70f93e1430b"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 
09:16:05.986136 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xxwmv" event={"ID":"fe784c39-4b98-4b26-af66-d47a500ce697","Type":"ContainerStarted","Data":"14a29c8d55941098a6e82bd24de476a564420c7c67f2d14a154607797ee7580f"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.986149 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xxwmv" event={"ID":"fe784c39-4b98-4b26-af66-d47a500ce697","Type":"ContainerStarted","Data":"b9b358d97c0192303196a2c9cdee8f6f67ab52367fce6effa08467b34a6f79a9"} Mar 13 09:16:05 crc kubenswrapper[4841]: I0313 09:16:05.986896 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xxwmv" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.001082 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.027366 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-m4fpx" podStartSLOduration=7.027347368 podStartE2EDuration="7.027347368s" podCreationTimestamp="2026-03-13 09:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:06.012415949 +0000 UTC m=+248.742316140" watchObservedRunningTime="2026-03-13 09:16:06.027347368 +0000 UTC m=+248.757247559" Mar 13 09:16:06 crc kubenswrapper[4841]: E0313 09:16:06.030854 4841 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:06.530831427 +0000 UTC m=+249.260731618 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.037182 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.038337 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vjcp6" event={"ID":"7a0b7635-6423-4716-93b4-c8e9b3012e55","Type":"ContainerStarted","Data":"5e0d84540103e2b332b88ee64a785bad332bad41d5b601dbce2c6b214ce932c1"} Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.038366 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vjcp6" event={"ID":"7a0b7635-6423-4716-93b4-c8e9b3012e55","Type":"ContainerStarted","Data":"094f878f341fe75b9fcac4911bc64b9958b3552c6d46f1c6e22ecba7a26b8428"} Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.050839 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qcfvl" event={"ID":"ebf5c5f8-1870-470e-b63d-f84c99bdb936","Type":"ContainerStarted","Data":"860e5d960e6c2949f9a385e7f0ccb34126e111ab93a92978b9dfdd4df9023cda"} Mar 13 09:16:06 
crc kubenswrapper[4841]: I0313 09:16:06.050840 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xxwmv" podStartSLOduration=179.050827515 podStartE2EDuration="2m59.050827515s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:06.050100122 +0000 UTC m=+248.780000313" watchObservedRunningTime="2026-03-13 09:16:06.050827515 +0000 UTC m=+248.780727696" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.073787 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vjcp6" podStartSLOduration=179.073771503 podStartE2EDuration="2m59.073771503s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:06.072413221 +0000 UTC m=+248.802313412" watchObservedRunningTime="2026-03-13 09:16:06.073771503 +0000 UTC m=+248.803671694" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.075321 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cdpl" event={"ID":"9b8c5df2-d1ee-429f-a269-72122ee12baf","Type":"ContainerStarted","Data":"57c50089919d530009006070e5b4380d17e1caa78a3deab263101da0fa35f9c2"} Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.075372 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cdpl" event={"ID":"9b8c5df2-d1ee-429f-a269-72122ee12baf","Type":"ContainerStarted","Data":"985ca74725582be09ec88a99fe0c074661fc33aafdb3f129b48bf7c90aa78aa9"} Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.075383 4841 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cdpl" event={"ID":"9b8c5df2-d1ee-429f-a269-72122ee12baf","Type":"ContainerStarted","Data":"20ad4643bb079876f6264eab5bfdda458bc0ab1d784bd06216aeb384bd51e8da"} Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.100950 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rn52" event={"ID":"3f752e6f-1e5a-4694-80de-379a9c00ff96","Type":"ContainerStarted","Data":"689356463a5ccca8d43f9b3dda690adcf6e1854c4a89f6977f306b85a80b4592"} Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.100983 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rn52" event={"ID":"3f752e6f-1e5a-4694-80de-379a9c00ff96","Type":"ContainerStarted","Data":"f9518eaa13d2d3894b468e5061982dc4b0a815787d4fa4c46b47bb88b1100f2b"} Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.102746 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-lrsts container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.102780 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lrsts" podUID="ede1a06d-6caf-41ae-acfa-2335821a2e0e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.103446 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg" podUID="2faa4f36-0851-4da7-bd9a-dd0e50daf0de" containerName="route-controller-manager" 
containerID="cri-o://f826a54b397d39eb84d511dc376f98ae2fb7bfbcb32a70ac3600a26d2f0ff8f7" gracePeriod=30 Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.103578 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4d40b190-4e17-46d9-85f7-f4062ea2fc47-proxy-ca-bundles\") pod \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\" (UID: \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\") " Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.103619 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d40b190-4e17-46d9-85f7-f4062ea2fc47-client-ca\") pod \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\" (UID: \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\") " Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.103646 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d40b190-4e17-46d9-85f7-f4062ea2fc47-serving-cert\") pod \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\" (UID: \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\") " Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.103699 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d40b190-4e17-46d9-85f7-f4062ea2fc47-config\") pod \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\" (UID: \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\") " Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.103781 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24dm2\" (UniqueName: \"kubernetes.io/projected/4d40b190-4e17-46d9-85f7-f4062ea2fc47-kube-api-access-24dm2\") pod \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\" (UID: \"4d40b190-4e17-46d9-85f7-f4062ea2fc47\") " Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.104023 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:06 crc kubenswrapper[4841]: E0313 09:16:06.106239 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:06.606227752 +0000 UTC m=+249.336127933 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.107434 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d40b190-4e17-46d9-85f7-f4062ea2fc47-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4d40b190-4e17-46d9-85f7-f4062ea2fc47" (UID: "4d40b190-4e17-46d9-85f7-f4062ea2fc47"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.108133 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d40b190-4e17-46d9-85f7-f4062ea2fc47-config" (OuterVolumeSpecName: "config") pod "4d40b190-4e17-46d9-85f7-f4062ea2fc47" (UID: "4d40b190-4e17-46d9-85f7-f4062ea2fc47"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.108440 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d40b190-4e17-46d9-85f7-f4062ea2fc47-client-ca" (OuterVolumeSpecName: "client-ca") pod "4d40b190-4e17-46d9-85f7-f4062ea2fc47" (UID: "4d40b190-4e17-46d9-85f7-f4062ea2fc47"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.135770 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d40b190-4e17-46d9-85f7-f4062ea2fc47-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4d40b190-4e17-46d9-85f7-f4062ea2fc47" (UID: "4d40b190-4e17-46d9-85f7-f4062ea2fc47"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.138492 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d40b190-4e17-46d9-85f7-f4062ea2fc47-kube-api-access-24dm2" (OuterVolumeSpecName: "kube-api-access-24dm2") pod "4d40b190-4e17-46d9-85f7-f4062ea2fc47" (UID: "4d40b190-4e17-46d9-85f7-f4062ea2fc47"). InnerVolumeSpecName "kube-api-access-24dm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.145360 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-qcfvl" podStartSLOduration=179.145340458 podStartE2EDuration="2m59.145340458s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:06.136234153 +0000 UTC m=+248.866134344" watchObservedRunningTime="2026-03-13 09:16:06.145340458 +0000 UTC m=+248.875240649" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.147336 4841 ???:1] "http: TLS handshake error from 192.168.126.11:40698: no serving certificate available for the kubelet" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.167611 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rn52" podStartSLOduration=179.167593686 podStartE2EDuration="2m59.167593686s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:06.164824629 +0000 UTC m=+248.894724830" watchObservedRunningTime="2026-03-13 09:16:06.167593686 +0000 UTC m=+248.897493877" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.189919 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.190220 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.194390 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cdpl" podStartSLOduration=179.194372226 podStartE2EDuration="2m59.194372226s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:06.192681843 +0000 UTC m=+248.922582044" watchObservedRunningTime="2026-03-13 09:16:06.194372226 +0000 UTC m=+248.924272417" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.208069 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.209010 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d40b190-4e17-46d9-85f7-f4062ea2fc47-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.209024 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24dm2\" (UniqueName: \"kubernetes.io/projected/4d40b190-4e17-46d9-85f7-f4062ea2fc47-kube-api-access-24dm2\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.209034 4841 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4d40b190-4e17-46d9-85f7-f4062ea2fc47-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.209042 4841 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d40b190-4e17-46d9-85f7-f4062ea2fc47-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.209051 
4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d40b190-4e17-46d9-85f7-f4062ea2fc47-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:06 crc kubenswrapper[4841]: E0313 09:16:06.209762 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:06.709747108 +0000 UTC m=+249.439647299 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.225391 4841 patch_prober.go:28] interesting pod/router-default-5444994796-8kvzk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 09:16:06 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Mar 13 09:16:06 crc kubenswrapper[4841]: [+]process-running ok Mar 13 09:16:06 crc kubenswrapper[4841]: healthz check failed Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.225438 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8kvzk" podUID="8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.275205 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-69l88" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.309914 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:06 crc kubenswrapper[4841]: E0313 09:16:06.310277 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:06.81024914 +0000 UTC m=+249.540149331 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.410585 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:06 crc kubenswrapper[4841]: E0313 09:16:06.410909 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:06.910878686 +0000 UTC m=+249.640778877 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.411000 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:06 crc kubenswrapper[4841]: E0313 09:16:06.411447 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:06.911434564 +0000 UTC m=+249.641334745 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.512030 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:06 crc kubenswrapper[4841]: E0313 09:16:06.512451 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:07.012433011 +0000 UTC m=+249.742333202 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.615466 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:06 crc kubenswrapper[4841]: E0313 09:16:06.616060 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:07.116048871 +0000 UTC m=+249.845949062 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.718038 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:06 crc kubenswrapper[4841]: E0313 09:16:06.718377 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:07.218362649 +0000 UTC m=+249.948262840 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.820859 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:06 crc kubenswrapper[4841]: E0313 09:16:06.821233 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:07.321221775 +0000 UTC m=+250.051121966 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.823611 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.853636 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.922766 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5knd\" (UniqueName: \"kubernetes.io/projected/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-kube-api-access-n5knd\") pod \"2faa4f36-0851-4da7-bd9a-dd0e50daf0de\" (UID: \"2faa4f36-0851-4da7-bd9a-dd0e50daf0de\") " Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.923116 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-serving-cert\") pod \"2faa4f36-0851-4da7-bd9a-dd0e50daf0de\" (UID: \"2faa4f36-0851-4da7-bd9a-dd0e50daf0de\") " Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.923221 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 
09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.923253 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-client-ca\") pod \"2faa4f36-0851-4da7-bd9a-dd0e50daf0de\" (UID: \"2faa4f36-0851-4da7-bd9a-dd0e50daf0de\") " Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.923325 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-config\") pod \"2faa4f36-0851-4da7-bd9a-dd0e50daf0de\" (UID: \"2faa4f36-0851-4da7-bd9a-dd0e50daf0de\") " Mar 13 09:16:06 crc kubenswrapper[4841]: E0313 09:16:06.923973 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:07.423951147 +0000 UTC m=+250.153851328 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.924296 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-client-ca" (OuterVolumeSpecName: "client-ca") pod "2faa4f36-0851-4da7-bd9a-dd0e50daf0de" (UID: "2faa4f36-0851-4da7-bd9a-dd0e50daf0de"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.924479 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-config" (OuterVolumeSpecName: "config") pod "2faa4f36-0851-4da7-bd9a-dd0e50daf0de" (UID: "2faa4f36-0851-4da7-bd9a-dd0e50daf0de"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.933062 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-845b9c77f6-lw7jf"] Mar 13 09:16:06 crc kubenswrapper[4841]: E0313 09:16:06.935565 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d40b190-4e17-46d9-85f7-f4062ea2fc47" containerName="controller-manager" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.935681 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d40b190-4e17-46d9-85f7-f4062ea2fc47" containerName="controller-manager" Mar 13 09:16:06 crc kubenswrapper[4841]: E0313 09:16:06.935769 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2faa4f36-0851-4da7-bd9a-dd0e50daf0de" containerName="route-controller-manager" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.935827 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="2faa4f36-0851-4da7-bd9a-dd0e50daf0de" containerName="route-controller-manager" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.935972 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="2faa4f36-0851-4da7-bd9a-dd0e50daf0de" containerName="route-controller-manager" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.936043 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d40b190-4e17-46d9-85f7-f4062ea2fc47" containerName="controller-manager" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.936441 4841 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.939647 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-kube-api-access-n5knd" (OuterVolumeSpecName: "kube-api-access-n5knd") pod "2faa4f36-0851-4da7-bd9a-dd0e50daf0de" (UID: "2faa4f36-0851-4da7-bd9a-dd0e50daf0de"). InnerVolumeSpecName "kube-api-access-n5knd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.940088 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2faa4f36-0851-4da7-bd9a-dd0e50daf0de" (UID: "2faa4f36-0851-4da7-bd9a-dd0e50daf0de"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.949017 4841 patch_prober.go:28] interesting pod/apiserver-76f77b778f-nl9wf container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 13 09:16:06 crc kubenswrapper[4841]: [+]log ok Mar 13 09:16:06 crc kubenswrapper[4841]: [+]etcd ok Mar 13 09:16:06 crc kubenswrapper[4841]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 13 09:16:06 crc kubenswrapper[4841]: [+]poststarthook/generic-apiserver-start-informers ok Mar 13 09:16:06 crc kubenswrapper[4841]: [+]poststarthook/max-in-flight-filter ok Mar 13 09:16:06 crc kubenswrapper[4841]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 13 09:16:06 crc kubenswrapper[4841]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 13 09:16:06 crc kubenswrapper[4841]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld 
Mar 13 09:16:06 crc kubenswrapper[4841]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 13 09:16:06 crc kubenswrapper[4841]: [+]poststarthook/project.openshift.io-projectcache ok Mar 13 09:16:06 crc kubenswrapper[4841]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 13 09:16:06 crc kubenswrapper[4841]: [+]poststarthook/openshift.io-startinformers ok Mar 13 09:16:06 crc kubenswrapper[4841]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 13 09:16:06 crc kubenswrapper[4841]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 13 09:16:06 crc kubenswrapper[4841]: livez check failed Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.949057 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" podUID="c205e4f8-77f0-43a5-8b2a-cccc1381ecd5" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.985506 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-845b9c77f6-lw7jf"] Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.988276 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f"] Mar 13 09:16:06 crc kubenswrapper[4841]: I0313 09:16:06.989250 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.004612 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f"] Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.030197 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl9p5\" (UniqueName: \"kubernetes.io/projected/55eac427-cd92-4af6-bb8d-be8a54bbb69f-kube-api-access-fl9p5\") pod \"controller-manager-845b9c77f6-lw7jf\" (UID: \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\") " pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.030245 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/55eac427-cd92-4af6-bb8d-be8a54bbb69f-proxy-ca-bundles\") pod \"controller-manager-845b9c77f6-lw7jf\" (UID: \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\") " pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.030380 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55eac427-cd92-4af6-bb8d-be8a54bbb69f-serving-cert\") pod \"controller-manager-845b9c77f6-lw7jf\" (UID: \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\") " pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.030405 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: 
\"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.030485 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/55eac427-cd92-4af6-bb8d-be8a54bbb69f-client-ca\") pod \"controller-manager-845b9c77f6-lw7jf\" (UID: \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\") " pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" Mar 13 09:16:07 crc kubenswrapper[4841]: E0313 09:16:07.030732 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:07.530719456 +0000 UTC m=+250.260619647 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.030894 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55eac427-cd92-4af6-bb8d-be8a54bbb69f-config\") pod \"controller-manager-845b9c77f6-lw7jf\" (UID: \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\") " pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.030952 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5knd\" (UniqueName: 
\"kubernetes.io/projected/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-kube-api-access-n5knd\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.030962 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.030992 4841 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.031001 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2faa4f36-0851-4da7-bd9a-dd0e50daf0de-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.134786 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.134929 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/55eac427-cd92-4af6-bb8d-be8a54bbb69f-proxy-ca-bundles\") pod \"controller-manager-845b9c77f6-lw7jf\" (UID: \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\") " pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.134958 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz2mr\" (UniqueName: 
\"kubernetes.io/projected/78895b4a-b94c-4138-a2eb-14066f4c08e2-kube-api-access-xz2mr\") pod \"route-controller-manager-78bbbbbf4b-w6p8f\" (UID: \"78895b4a-b94c-4138-a2eb-14066f4c08e2\") " pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.134984 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78895b4a-b94c-4138-a2eb-14066f4c08e2-serving-cert\") pod \"route-controller-manager-78bbbbbf4b-w6p8f\" (UID: \"78895b4a-b94c-4138-a2eb-14066f4c08e2\") " pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.135021 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78895b4a-b94c-4138-a2eb-14066f4c08e2-client-ca\") pod \"route-controller-manager-78bbbbbf4b-w6p8f\" (UID: \"78895b4a-b94c-4138-a2eb-14066f4c08e2\") " pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.135042 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55eac427-cd92-4af6-bb8d-be8a54bbb69f-serving-cert\") pod \"controller-manager-845b9c77f6-lw7jf\" (UID: \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\") " pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.135096 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78895b4a-b94c-4138-a2eb-14066f4c08e2-config\") pod \"route-controller-manager-78bbbbbf4b-w6p8f\" (UID: \"78895b4a-b94c-4138-a2eb-14066f4c08e2\") " 
pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.135114 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/55eac427-cd92-4af6-bb8d-be8a54bbb69f-client-ca\") pod \"controller-manager-845b9c77f6-lw7jf\" (UID: \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\") " pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.135150 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55eac427-cd92-4af6-bb8d-be8a54bbb69f-config\") pod \"controller-manager-845b9c77f6-lw7jf\" (UID: \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\") " pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.135174 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl9p5\" (UniqueName: \"kubernetes.io/projected/55eac427-cd92-4af6-bb8d-be8a54bbb69f-kube-api-access-fl9p5\") pod \"controller-manager-845b9c77f6-lw7jf\" (UID: \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\") " pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" Mar 13 09:16:07 crc kubenswrapper[4841]: E0313 09:16:07.135529 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:07.635499061 +0000 UTC m=+250.365399252 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.136697 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55eac427-cd92-4af6-bb8d-be8a54bbb69f-config\") pod \"controller-manager-845b9c77f6-lw7jf\" (UID: \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\") " pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.136869 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/55eac427-cd92-4af6-bb8d-be8a54bbb69f-client-ca\") pod \"controller-manager-845b9c77f6-lw7jf\" (UID: \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\") " pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.137435 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/55eac427-cd92-4af6-bb8d-be8a54bbb69f-proxy-ca-bundles\") pod \"controller-manager-845b9c77f6-lw7jf\" (UID: \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\") " pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.141606 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55eac427-cd92-4af6-bb8d-be8a54bbb69f-serving-cert\") pod \"controller-manager-845b9c77f6-lw7jf\" (UID: 
\"55eac427-cd92-4af6-bb8d-be8a54bbb69f\") " pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.171472 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl9p5\" (UniqueName: \"kubernetes.io/projected/55eac427-cd92-4af6-bb8d-be8a54bbb69f-kube-api-access-fl9p5\") pod \"controller-manager-845b9c77f6-lw7jf\" (UID: \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\") " pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.181907 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bbjqp" event={"ID":"86c4fd3a-5374-4b50-8f43-301205a71684","Type":"ContainerStarted","Data":"30ced80e6ff0c56beb2f58f5ef659d282d3e0734d2d6aae250ec9c1fd9640e66"} Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.182839 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-bbjqp" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.217469 4841 patch_prober.go:28] interesting pod/router-default-5444994796-8kvzk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 09:16:07 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Mar 13 09:16:07 crc kubenswrapper[4841]: [+]process-running ok Mar 13 09:16:07 crc kubenswrapper[4841]: healthz check failed Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.217555 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8kvzk" podUID="8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.217593 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-md725" event={"ID":"363c526f-c53a-49f8-88bc-823b1ccde350","Type":"ContainerStarted","Data":"80961be4cab226d343cc50ed93a313163737cdebae4d643466c3080c23e5cf43"} Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.219027 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-bbjqp" podStartSLOduration=8.21900828 podStartE2EDuration="8.21900828s" podCreationTimestamp="2026-03-13 09:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:07.218251647 +0000 UTC m=+249.948151848" watchObservedRunningTime="2026-03-13 09:16:07.21900828 +0000 UTC m=+249.948908471" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.238429 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78895b4a-b94c-4138-a2eb-14066f4c08e2-client-ca\") pod \"route-controller-manager-78bbbbbf4b-w6p8f\" (UID: \"78895b4a-b94c-4138-a2eb-14066f4c08e2\") " pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.238503 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.238558 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78895b4a-b94c-4138-a2eb-14066f4c08e2-config\") pod \"route-controller-manager-78bbbbbf4b-w6p8f\" (UID: \"78895b4a-b94c-4138-a2eb-14066f4c08e2\") " 
pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.238640 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz2mr\" (UniqueName: \"kubernetes.io/projected/78895b4a-b94c-4138-a2eb-14066f4c08e2-kube-api-access-xz2mr\") pod \"route-controller-manager-78bbbbbf4b-w6p8f\" (UID: \"78895b4a-b94c-4138-a2eb-14066f4c08e2\") " pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.238677 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78895b4a-b94c-4138-a2eb-14066f4c08e2-serving-cert\") pod \"route-controller-manager-78bbbbbf4b-w6p8f\" (UID: \"78895b4a-b94c-4138-a2eb-14066f4c08e2\") " pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" Mar 13 09:16:07 crc kubenswrapper[4841]: E0313 09:16:07.239160 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:07.739143722 +0000 UTC m=+250.469043913 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.240207 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78895b4a-b94c-4138-a2eb-14066f4c08e2-client-ca\") pod \"route-controller-manager-78bbbbbf4b-w6p8f\" (UID: \"78895b4a-b94c-4138-a2eb-14066f4c08e2\") " pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.249429 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78895b4a-b94c-4138-a2eb-14066f4c08e2-config\") pod \"route-controller-manager-78bbbbbf4b-w6p8f\" (UID: \"78895b4a-b94c-4138-a2eb-14066f4c08e2\") " pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.250192 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78895b4a-b94c-4138-a2eb-14066f4c08e2-serving-cert\") pod \"route-controller-manager-78bbbbbf4b-w6p8f\" (UID: \"78895b4a-b94c-4138-a2eb-14066f4c08e2\") " pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.261814 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z" 
event={"ID":"4d40b190-4e17-46d9-85f7-f4062ea2fc47","Type":"ContainerDied","Data":"44bfdbfe697bc6dc67f0783b9470e52d36a6c5b8fe169cc700f83d05639e7700"} Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.261866 4841 scope.go:117] "RemoveContainer" containerID="9f45a5c900e6d258fa313fcb6c32460d268e1a72cf2206bfd94a5930bb6d0a78" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.261868 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-86p7z" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.278217 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qcfvl" event={"ID":"ebf5c5f8-1870-470e-b63d-f84c99bdb936","Type":"ContainerStarted","Data":"8bb43796523c503215bd3f2fb06002781322c55e7e0117d0df723bc1ce4e78e2"} Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.284605 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.287088 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz2mr\" (UniqueName: \"kubernetes.io/projected/78895b4a-b94c-4138-a2eb-14066f4c08e2-kube-api-access-xz2mr\") pod \"route-controller-manager-78bbbbbf4b-w6p8f\" (UID: \"78895b4a-b94c-4138-a2eb-14066f4c08e2\") " pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.295168 4841 generic.go:334] "Generic (PLEG): container finished" podID="2faa4f36-0851-4da7-bd9a-dd0e50daf0de" containerID="f826a54b397d39eb84d511dc376f98ae2fb7bfbcb32a70ac3600a26d2f0ff8f7" exitCode=0 Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.295683 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.297688 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg" event={"ID":"2faa4f36-0851-4da7-bd9a-dd0e50daf0de","Type":"ContainerDied","Data":"f826a54b397d39eb84d511dc376f98ae2fb7bfbcb32a70ac3600a26d2f0ff8f7"} Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.297725 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg" event={"ID":"2faa4f36-0851-4da7-bd9a-dd0e50daf0de","Type":"ContainerDied","Data":"b755895c7d38a660945ac664b5c678f6c59f28f8df7a0cad6ced2a31ccdf6f23"} Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.305559 4841 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6t6zf container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.305595 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-lrsts container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.305605 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" podUID="fad88931-0cb1-40fd-b256-f9cd1c93a7e6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.305635 4841 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lrsts" podUID="ede1a06d-6caf-41ae-acfa-2335821a2e0e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.310398 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.315518 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khk8q" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.320218 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sbpds" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.321103 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ld4cm" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.341253 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:07 crc kubenswrapper[4841]: E0313 09:16:07.342403 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:07.842378499 +0000 UTC m=+250.572278690 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.347343 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-86p7z"] Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.352510 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-86p7z"] Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.370480 4841 scope.go:117] "RemoveContainer" containerID="f826a54b397d39eb84d511dc376f98ae2fb7bfbcb32a70ac3600a26d2f0ff8f7" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.444767 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:07 crc kubenswrapper[4841]: E0313 09:16:07.447808 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:07.947792436 +0000 UTC m=+250.677692627 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.448742 4841 scope.go:117] "RemoveContainer" containerID="f826a54b397d39eb84d511dc376f98ae2fb7bfbcb32a70ac3600a26d2f0ff8f7" Mar 13 09:16:07 crc kubenswrapper[4841]: E0313 09:16:07.465788 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f826a54b397d39eb84d511dc376f98ae2fb7bfbcb32a70ac3600a26d2f0ff8f7\": container with ID starting with f826a54b397d39eb84d511dc376f98ae2fb7bfbcb32a70ac3600a26d2f0ff8f7 not found: ID does not exist" containerID="f826a54b397d39eb84d511dc376f98ae2fb7bfbcb32a70ac3600a26d2f0ff8f7" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.465832 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f826a54b397d39eb84d511dc376f98ae2fb7bfbcb32a70ac3600a26d2f0ff8f7"} err="failed to get container status \"f826a54b397d39eb84d511dc376f98ae2fb7bfbcb32a70ac3600a26d2f0ff8f7\": rpc error: code = NotFound desc = could not find container \"f826a54b397d39eb84d511dc376f98ae2fb7bfbcb32a70ac3600a26d2f0ff8f7\": container with ID starting with f826a54b397d39eb84d511dc376f98ae2fb7bfbcb32a70ac3600a26d2f0ff8f7 not found: ID does not exist" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.469112 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg"] Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.484484 4841 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zw4tg"] Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.497656 4841 ???:1] "http: TLS handshake error from 192.168.126.11:40702: no serving certificate available for the kubelet" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.547333 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:07 crc kubenswrapper[4841]: E0313 09:16:07.547712 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:08.047697108 +0000 UTC m=+250.777597299 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.650796 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:07 crc kubenswrapper[4841]: E0313 09:16:07.651342 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:08.151325848 +0000 UTC m=+250.881226039 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.736510 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7kz2p" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.754693 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:07 crc kubenswrapper[4841]: E0313 09:16:07.755050 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:08.255035801 +0000 UTC m=+250.984935982 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.805030 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-845b9c77f6-lw7jf"] Mar 13 09:16:07 crc kubenswrapper[4841]: W0313 09:16:07.850811 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55eac427_cd92_4af6_bb8d_be8a54bbb69f.slice/crio-41a3577e44e39d21dae34ca9a6767209c8482f7af2f789f73e4d84d4692d15cf WatchSource:0}: Error finding container 41a3577e44e39d21dae34ca9a6767209c8482f7af2f789f73e4d84d4692d15cf: Status 404 returned error can't find the container with id 41a3577e44e39d21dae34ca9a6767209c8482f7af2f789f73e4d84d4692d15cf Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.856903 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.857061 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f"] Mar 13 09:16:07 crc kubenswrapper[4841]: E0313 09:16:07.857279 4841 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:08.357251886 +0000 UTC m=+251.087152077 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:07 crc kubenswrapper[4841]: W0313 09:16:07.882256 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78895b4a_b94c_4138_a2eb_14066f4c08e2.slice/crio-6a3c0e93065725f8c8b3dec8b09a178c0ad9777130eae301a40c21fb6912d4df WatchSource:0}: Error finding container 6a3c0e93065725f8c8b3dec8b09a178c0ad9777130eae301a40c21fb6912d4df: Status 404 returned error can't find the container with id 6a3c0e93065725f8c8b3dec8b09a178c0ad9777130eae301a40c21fb6912d4df Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.957729 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:07 crc kubenswrapper[4841]: E0313 09:16:07.957908 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 09:16:08.457882232 +0000 UTC m=+251.187782423 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:07 crc kubenswrapper[4841]: I0313 09:16:07.958064 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:07 crc kubenswrapper[4841]: E0313 09:16:07.958468 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:08.45845307 +0000 UTC m=+251.188353261 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.010606 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2faa4f36-0851-4da7-bd9a-dd0e50daf0de" path="/var/lib/kubelet/pods/2faa4f36-0851-4da7-bd9a-dd0e50daf0de/volumes" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.011288 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d40b190-4e17-46d9-85f7-f4062ea2fc47" path="/var/lib/kubelet/pods/4d40b190-4e17-46d9-85f7-f4062ea2fc47/volumes" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.062369 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:08 crc kubenswrapper[4841]: E0313 09:16:08.062714 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:08.5626992 +0000 UTC m=+251.292599391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.164113 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:08 crc kubenswrapper[4841]: E0313 09:16:08.164496 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:08.664465482 +0000 UTC m=+251.394365673 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.209777 4841 patch_prober.go:28] interesting pod/router-default-5444994796-8kvzk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 09:16:08 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Mar 13 09:16:08 crc kubenswrapper[4841]: [+]process-running ok Mar 13 09:16:08 crc kubenswrapper[4841]: healthz check failed Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.209820 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8kvzk" podUID="8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.218207 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sbx6r"] Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.219470 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sbx6r" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.224164 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.229161 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sbx6r"] Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.266179 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:08 crc kubenswrapper[4841]: E0313 09:16:08.266802 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:08.76678638 +0000 UTC m=+251.496686571 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.307196 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" event={"ID":"78895b4a-b94c-4138-a2eb-14066f4c08e2","Type":"ContainerStarted","Data":"0918dbc19a22c179e6c0b3f7ea75596c2206c255d9ed43bc42418794d35b53c6"} Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.307242 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" event={"ID":"78895b4a-b94c-4138-a2eb-14066f4c08e2","Type":"ContainerStarted","Data":"6a3c0e93065725f8c8b3dec8b09a178c0ad9777130eae301a40c21fb6912d4df"} Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.307630 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.320958 4841 generic.go:334] "Generic (PLEG): container finished" podID="1745fdf3-fbb9-4736-a4da-b534d8c208bd" containerID="c140fa68774d5707fef40bcf227291d5b8cf288c1244f6933833bc2b04e31f7a" exitCode=0 Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.321020 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4" event={"ID":"1745fdf3-fbb9-4736-a4da-b534d8c208bd","Type":"ContainerDied","Data":"c140fa68774d5707fef40bcf227291d5b8cf288c1244f6933833bc2b04e31f7a"} Mar 13 09:16:08 crc 
kubenswrapper[4841]: I0313 09:16:08.332696 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" event={"ID":"55eac427-cd92-4af6-bb8d-be8a54bbb69f","Type":"ContainerStarted","Data":"caf341180a02d40996f5d28f986b0973d1f01ccf867cd341125b8c548b24f718"} Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.332736 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" event={"ID":"55eac427-cd92-4af6-bb8d-be8a54bbb69f","Type":"ContainerStarted","Data":"41a3577e44e39d21dae34ca9a6767209c8482f7af2f789f73e4d84d4692d15cf"} Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.333477 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.335400 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" podStartSLOduration=2.335391622 podStartE2EDuration="2.335391622s" podCreationTimestamp="2026-03-13 09:16:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:08.33181287 +0000 UTC m=+251.061713051" watchObservedRunningTime="2026-03-13 09:16:08.335391622 +0000 UTC m=+251.065291813" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.355460 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.367185 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbd9953a-618b-4cd2-806b-c01e07c40fc2-catalog-content\") pod \"community-operators-sbx6r\" (UID: 
\"cbd9953a-618b-4cd2-806b-c01e07c40fc2\") " pod="openshift-marketplace/community-operators-sbx6r" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.367307 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.367385 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jg9k\" (UniqueName: \"kubernetes.io/projected/cbd9953a-618b-4cd2-806b-c01e07c40fc2-kube-api-access-7jg9k\") pod \"community-operators-sbx6r\" (UID: \"cbd9953a-618b-4cd2-806b-c01e07c40fc2\") " pod="openshift-marketplace/community-operators-sbx6r" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.367419 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbd9953a-618b-4cd2-806b-c01e07c40fc2-utilities\") pod \"community-operators-sbx6r\" (UID: \"cbd9953a-618b-4cd2-806b-c01e07c40fc2\") " pod="openshift-marketplace/community-operators-sbx6r" Mar 13 09:16:08 crc kubenswrapper[4841]: E0313 09:16:08.367929 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:08.867915112 +0000 UTC m=+251.597815303 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.369735 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" podStartSLOduration=2.369714458 podStartE2EDuration="2.369714458s" podCreationTimestamp="2026-03-13 09:16:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:08.367511259 +0000 UTC m=+251.097411450" watchObservedRunningTime="2026-03-13 09:16:08.369714458 +0000 UTC m=+251.099614649" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.419616 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l6qx4"] Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.420595 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l6qx4" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.423444 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.444155 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l6qx4"] Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.468024 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:08 crc kubenswrapper[4841]: E0313 09:16:08.468350 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:08.968220447 +0000 UTC m=+251.698120638 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.468438 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms9xf\" (UniqueName: \"kubernetes.io/projected/81521243-f7e1-4360-9b12-8047988a69dd-kube-api-access-ms9xf\") pod \"certified-operators-l6qx4\" (UID: \"81521243-f7e1-4360-9b12-8047988a69dd\") " pod="openshift-marketplace/certified-operators-l6qx4" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.468818 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jg9k\" (UniqueName: \"kubernetes.io/projected/cbd9953a-618b-4cd2-806b-c01e07c40fc2-kube-api-access-7jg9k\") pod \"community-operators-sbx6r\" (UID: \"cbd9953a-618b-4cd2-806b-c01e07c40fc2\") " pod="openshift-marketplace/community-operators-sbx6r" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.468859 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbd9953a-618b-4cd2-806b-c01e07c40fc2-utilities\") pod \"community-operators-sbx6r\" (UID: \"cbd9953a-618b-4cd2-806b-c01e07c40fc2\") " pod="openshift-marketplace/community-operators-sbx6r" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.469200 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81521243-f7e1-4360-9b12-8047988a69dd-utilities\") pod \"certified-operators-l6qx4\" (UID: 
\"81521243-f7e1-4360-9b12-8047988a69dd\") " pod="openshift-marketplace/certified-operators-l6qx4" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.469337 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbd9953a-618b-4cd2-806b-c01e07c40fc2-catalog-content\") pod \"community-operators-sbx6r\" (UID: \"cbd9953a-618b-4cd2-806b-c01e07c40fc2\") " pod="openshift-marketplace/community-operators-sbx6r" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.469451 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81521243-f7e1-4360-9b12-8047988a69dd-catalog-content\") pod \"certified-operators-l6qx4\" (UID: \"81521243-f7e1-4360-9b12-8047988a69dd\") " pod="openshift-marketplace/certified-operators-l6qx4" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.469489 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbd9953a-618b-4cd2-806b-c01e07c40fc2-utilities\") pod \"community-operators-sbx6r\" (UID: \"cbd9953a-618b-4cd2-806b-c01e07c40fc2\") " pod="openshift-marketplace/community-operators-sbx6r" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.469969 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbd9953a-618b-4cd2-806b-c01e07c40fc2-catalog-content\") pod \"community-operators-sbx6r\" (UID: \"cbd9953a-618b-4cd2-806b-c01e07c40fc2\") " pod="openshift-marketplace/community-operators-sbx6r" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.494998 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jg9k\" (UniqueName: \"kubernetes.io/projected/cbd9953a-618b-4cd2-806b-c01e07c40fc2-kube-api-access-7jg9k\") pod \"community-operators-sbx6r\" (UID: 
\"cbd9953a-618b-4cd2-806b-c01e07c40fc2\") " pod="openshift-marketplace/community-operators-sbx6r" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.545577 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sbx6r" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.574244 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81521243-f7e1-4360-9b12-8047988a69dd-catalog-content\") pod \"certified-operators-l6qx4\" (UID: \"81521243-f7e1-4360-9b12-8047988a69dd\") " pod="openshift-marketplace/certified-operators-l6qx4" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.574326 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.574372 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms9xf\" (UniqueName: \"kubernetes.io/projected/81521243-f7e1-4360-9b12-8047988a69dd-kube-api-access-ms9xf\") pod \"certified-operators-l6qx4\" (UID: \"81521243-f7e1-4360-9b12-8047988a69dd\") " pod="openshift-marketplace/certified-operators-l6qx4" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.574423 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81521243-f7e1-4360-9b12-8047988a69dd-utilities\") pod \"certified-operators-l6qx4\" (UID: \"81521243-f7e1-4360-9b12-8047988a69dd\") " pod="openshift-marketplace/certified-operators-l6qx4" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.574737 
4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81521243-f7e1-4360-9b12-8047988a69dd-catalog-content\") pod \"certified-operators-l6qx4\" (UID: \"81521243-f7e1-4360-9b12-8047988a69dd\") " pod="openshift-marketplace/certified-operators-l6qx4" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.574751 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81521243-f7e1-4360-9b12-8047988a69dd-utilities\") pod \"certified-operators-l6qx4\" (UID: \"81521243-f7e1-4360-9b12-8047988a69dd\") " pod="openshift-marketplace/certified-operators-l6qx4" Mar 13 09:16:08 crc kubenswrapper[4841]: E0313 09:16:08.574987 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:09.074975375 +0000 UTC m=+251.804875566 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.594448 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms9xf\" (UniqueName: \"kubernetes.io/projected/81521243-f7e1-4360-9b12-8047988a69dd-kube-api-access-ms9xf\") pod \"certified-operators-l6qx4\" (UID: \"81521243-f7e1-4360-9b12-8047988a69dd\") " pod="openshift-marketplace/certified-operators-l6qx4" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.625960 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d4lvs"] Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.626874 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d4lvs" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.641348 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d4lvs"] Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.677821 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:08 crc kubenswrapper[4841]: E0313 09:16:08.678171 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:09.178155491 +0000 UTC m=+251.908055682 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.701851 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.736289 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l6qx4" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.778982 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.779077 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/517b279c-79ee-438c-83c2-ae5cd25848fc-utilities\") pod \"community-operators-d4lvs\" (UID: \"517b279c-79ee-438c-83c2-ae5cd25848fc\") " pod="openshift-marketplace/community-operators-d4lvs" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.779104 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql8rv\" (UniqueName: \"kubernetes.io/projected/517b279c-79ee-438c-83c2-ae5cd25848fc-kube-api-access-ql8rv\") pod \"community-operators-d4lvs\" (UID: \"517b279c-79ee-438c-83c2-ae5cd25848fc\") " pod="openshift-marketplace/community-operators-d4lvs" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.779150 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/517b279c-79ee-438c-83c2-ae5cd25848fc-catalog-content\") pod \"community-operators-d4lvs\" (UID: \"517b279c-79ee-438c-83c2-ae5cd25848fc\") " pod="openshift-marketplace/community-operators-d4lvs" Mar 13 09:16:08 crc kubenswrapper[4841]: E0313 09:16:08.779433 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:09.279419607 +0000 UTC m=+252.009319798 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.831476 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wlqmq"] Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.837162 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wlqmq" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.874049 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wlqmq"] Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.884233 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.884824 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/517b279c-79ee-438c-83c2-ae5cd25848fc-catalog-content\") pod \"community-operators-d4lvs\" (UID: \"517b279c-79ee-438c-83c2-ae5cd25848fc\") " pod="openshift-marketplace/community-operators-d4lvs" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.884938 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/517b279c-79ee-438c-83c2-ae5cd25848fc-utilities\") pod \"community-operators-d4lvs\" (UID: \"517b279c-79ee-438c-83c2-ae5cd25848fc\") " pod="openshift-marketplace/community-operators-d4lvs" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.884965 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql8rv\" (UniqueName: \"kubernetes.io/projected/517b279c-79ee-438c-83c2-ae5cd25848fc-kube-api-access-ql8rv\") pod \"community-operators-d4lvs\" (UID: \"517b279c-79ee-438c-83c2-ae5cd25848fc\") " pod="openshift-marketplace/community-operators-d4lvs" Mar 13 09:16:08 crc kubenswrapper[4841]: E0313 09:16:08.885393 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:09.38537853 +0000 UTC m=+252.115278721 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.885993 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/517b279c-79ee-438c-83c2-ae5cd25848fc-catalog-content\") pod \"community-operators-d4lvs\" (UID: \"517b279c-79ee-438c-83c2-ae5cd25848fc\") " pod="openshift-marketplace/community-operators-d4lvs" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.886203 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/517b279c-79ee-438c-83c2-ae5cd25848fc-utilities\") pod \"community-operators-d4lvs\" (UID: \"517b279c-79ee-438c-83c2-ae5cd25848fc\") " pod="openshift-marketplace/community-operators-d4lvs" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.924183 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql8rv\" (UniqueName: \"kubernetes.io/projected/517b279c-79ee-438c-83c2-ae5cd25848fc-kube-api-access-ql8rv\") pod \"community-operators-d4lvs\" (UID: \"517b279c-79ee-438c-83c2-ae5cd25848fc\") " pod="openshift-marketplace/community-operators-d4lvs" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.950465 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d4lvs" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.987201 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxzfr\" (UniqueName: \"kubernetes.io/projected/7f7766c0-82b7-4ee0-878d-47c5b0217e4d-kube-api-access-gxzfr\") pod \"certified-operators-wlqmq\" (UID: \"7f7766c0-82b7-4ee0-878d-47c5b0217e4d\") " pod="openshift-marketplace/certified-operators-wlqmq" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.987256 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f7766c0-82b7-4ee0-878d-47c5b0217e4d-utilities\") pod \"certified-operators-wlqmq\" (UID: \"7f7766c0-82b7-4ee0-878d-47c5b0217e4d\") " pod="openshift-marketplace/certified-operators-wlqmq" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.987328 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:08 crc kubenswrapper[4841]: I0313 09:16:08.987365 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f7766c0-82b7-4ee0-878d-47c5b0217e4d-catalog-content\") pod \"certified-operators-wlqmq\" (UID: \"7f7766c0-82b7-4ee0-878d-47c5b0217e4d\") " pod="openshift-marketplace/certified-operators-wlqmq" Mar 13 09:16:08 crc kubenswrapper[4841]: E0313 09:16:08.987675 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:09.487663458 +0000 UTC m=+252.217563649 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.050792 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.051522 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.055411 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.057205 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.057496 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.088650 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 
09:16:09.088996 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxzfr\" (UniqueName: \"kubernetes.io/projected/7f7766c0-82b7-4ee0-878d-47c5b0217e4d-kube-api-access-gxzfr\") pod \"certified-operators-wlqmq\" (UID: \"7f7766c0-82b7-4ee0-878d-47c5b0217e4d\") " pod="openshift-marketplace/certified-operators-wlqmq" Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.089049 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f7766c0-82b7-4ee0-878d-47c5b0217e4d-utilities\") pod \"certified-operators-wlqmq\" (UID: \"7f7766c0-82b7-4ee0-878d-47c5b0217e4d\") " pod="openshift-marketplace/certified-operators-wlqmq" Mar 13 09:16:09 crc kubenswrapper[4841]: E0313 09:16:09.089550 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:09.589524492 +0000 UTC m=+252.319424713 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.089665 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.089783 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f7766c0-82b7-4ee0-878d-47c5b0217e4d-catalog-content\") pod \"certified-operators-wlqmq\" (UID: \"7f7766c0-82b7-4ee0-878d-47c5b0217e4d\") " pod="openshift-marketplace/certified-operators-wlqmq" Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.089443 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f7766c0-82b7-4ee0-878d-47c5b0217e4d-utilities\") pod \"certified-operators-wlqmq\" (UID: \"7f7766c0-82b7-4ee0-878d-47c5b0217e4d\") " pod="openshift-marketplace/certified-operators-wlqmq" Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.090462 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f7766c0-82b7-4ee0-878d-47c5b0217e4d-catalog-content\") pod \"certified-operators-wlqmq\" (UID: 
\"7f7766c0-82b7-4ee0-878d-47c5b0217e4d\") " pod="openshift-marketplace/certified-operators-wlqmq" Mar 13 09:16:09 crc kubenswrapper[4841]: E0313 09:16:09.090526 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:09.590512104 +0000 UTC m=+252.320412385 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.112809 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxzfr\" (UniqueName: \"kubernetes.io/projected/7f7766c0-82b7-4ee0-878d-47c5b0217e4d-kube-api-access-gxzfr\") pod \"certified-operators-wlqmq\" (UID: \"7f7766c0-82b7-4ee0-878d-47c5b0217e4d\") " pod="openshift-marketplace/certified-operators-wlqmq" Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.165615 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sbx6r"] Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.190838 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.191379 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6119bb42-3ca0-46bf-a9f3-e270897e718b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6119bb42-3ca0-46bf-a9f3-e270897e718b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.191444 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6119bb42-3ca0-46bf-a9f3-e270897e718b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6119bb42-3ca0-46bf-a9f3-e270897e718b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 09:16:09 crc kubenswrapper[4841]: E0313 09:16:09.191544 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:09.691528012 +0000 UTC m=+252.421428203 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.210633 4841 patch_prober.go:28] interesting pod/router-default-5444994796-8kvzk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 09:16:09 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Mar 13 09:16:09 crc kubenswrapper[4841]: [+]process-running ok Mar 13 09:16:09 crc kubenswrapper[4841]: healthz check failed Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.210686 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8kvzk" podUID="8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.239666 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wlqmq" Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.281598 4841 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.292882 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6119bb42-3ca0-46bf-a9f3-e270897e718b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6119bb42-3ca0-46bf-a9f3-e270897e718b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.292923 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.293001 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6119bb42-3ca0-46bf-a9f3-e270897e718b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6119bb42-3ca0-46bf-a9f3-e270897e718b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.293328 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6119bb42-3ca0-46bf-a9f3-e270897e718b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6119bb42-3ca0-46bf-a9f3-e270897e718b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 09:16:09 crc kubenswrapper[4841]: E0313 
09:16:09.293552 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:09.79354143 +0000 UTC m=+252.523441611 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.312174 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6119bb42-3ca0-46bf-a9f3-e270897e718b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6119bb42-3ca0-46bf-a9f3-e270897e718b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.353678 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-md725" event={"ID":"363c526f-c53a-49f8-88bc-823b1ccde350","Type":"ContainerStarted","Data":"22bbcb9cc4f9343b6dc21d6ffe977a8286d3f1d1ab4c090d8736433533e5b845"} Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.353717 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-md725" event={"ID":"363c526f-c53a-49f8-88bc-823b1ccde350","Type":"ContainerStarted","Data":"b2891a12a575e86b1d3b7c41e7000cdf1ceed875a87f1885225dabf2d5d86b10"} Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.358705 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbx6r" 
event={"ID":"cbd9953a-618b-4cd2-806b-c01e07c40fc2","Type":"ContainerStarted","Data":"0e99cd7f3765101b92e057d0b1c37498848898e366aa210e26e4c66ca12cd683"} Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.383575 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d4lvs"] Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.393767 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:09 crc kubenswrapper[4841]: E0313 09:16:09.394411 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:09.894390003 +0000 UTC m=+252.624290194 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.412517 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.446792 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l6qx4"] Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.495606 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:09 crc kubenswrapper[4841]: E0313 09:16:09.496642 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:09.99663075 +0000 UTC m=+252.726530941 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:09 crc kubenswrapper[4841]: W0313 09:16:09.519463 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81521243_f7e1_4360_9b12_8047988a69dd.slice/crio-4623e2663d41561119cbb5d6845674d032df2a7ae7abc324b61b0b46cc3a946b WatchSource:0}: Error finding container 4623e2663d41561119cbb5d6845674d032df2a7ae7abc324b61b0b46cc3a946b: Status 404 returned error can't find the container with id 4623e2663d41561119cbb5d6845674d032df2a7ae7abc324b61b0b46cc3a946b Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.598460 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:09 crc kubenswrapper[4841]: E0313 09:16:09.598751 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:10.098736682 +0000 UTC m=+252.828636863 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.699404 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:09 crc kubenswrapper[4841]: E0313 09:16:09.700160 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:10.200145732 +0000 UTC m=+252.930045923 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.763825 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wlqmq"] Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.777137 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.800775 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:09 crc kubenswrapper[4841]: E0313 09:16:09.800981 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:10.300948304 +0000 UTC m=+253.030848495 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.801067 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:09 crc kubenswrapper[4841]: E0313 09:16:09.801507 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:10.301492821 +0000 UTC m=+253.031393012 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:09 crc kubenswrapper[4841]: W0313 09:16:09.806112 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f7766c0_82b7_4ee0_878d_47c5b0217e4d.slice/crio-937c6c449a7b90f48c3a6a435efd8ce8bd0b1dabd0c3c5f0636214322226bf19 WatchSource:0}: Error finding container 937c6c449a7b90f48c3a6a435efd8ce8bd0b1dabd0c3c5f0636214322226bf19: Status 404 returned error can't find the container with id 937c6c449a7b90f48c3a6a435efd8ce8bd0b1dabd0c3c5f0636214322226bf19 Mar 13 09:16:09 crc kubenswrapper[4841]: W0313 09:16:09.809066 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6119bb42_3ca0_46bf_a9f3_e270897e718b.slice/crio-8017a8f91b97c7a637fa398d5a06c3f2adc0ca85f6bc1752fdbf14f9d969d6ef WatchSource:0}: Error finding container 8017a8f91b97c7a637fa398d5a06c3f2adc0ca85f6bc1752fdbf14f9d969d6ef: Status 404 returned error can't find the container with id 8017a8f91b97c7a637fa398d5a06c3f2adc0ca85f6bc1752fdbf14f9d969d6ef Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.810003 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4" Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.902203 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.902322 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1745fdf3-fbb9-4736-a4da-b534d8c208bd-secret-volume\") pod \"1745fdf3-fbb9-4736-a4da-b534d8c208bd\" (UID: \"1745fdf3-fbb9-4736-a4da-b534d8c208bd\") " Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.902385 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1745fdf3-fbb9-4736-a4da-b534d8c208bd-config-volume\") pod \"1745fdf3-fbb9-4736-a4da-b534d8c208bd\" (UID: \"1745fdf3-fbb9-4736-a4da-b534d8c208bd\") " Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.902485 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvvb9\" (UniqueName: \"kubernetes.io/projected/1745fdf3-fbb9-4736-a4da-b534d8c208bd-kube-api-access-fvvb9\") pod \"1745fdf3-fbb9-4736-a4da-b534d8c208bd\" (UID: \"1745fdf3-fbb9-4736-a4da-b534d8c208bd\") " Mar 13 09:16:09 crc kubenswrapper[4841]: E0313 09:16:09.903067 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:10.403000485 +0000 UTC m=+253.132900676 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.905363 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1745fdf3-fbb9-4736-a4da-b534d8c208bd-config-volume" (OuterVolumeSpecName: "config-volume") pod "1745fdf3-fbb9-4736-a4da-b534d8c208bd" (UID: "1745fdf3-fbb9-4736-a4da-b534d8c208bd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.909613 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1745fdf3-fbb9-4736-a4da-b534d8c208bd-kube-api-access-fvvb9" (OuterVolumeSpecName: "kube-api-access-fvvb9") pod "1745fdf3-fbb9-4736-a4da-b534d8c208bd" (UID: "1745fdf3-fbb9-4736-a4da-b534d8c208bd"). InnerVolumeSpecName "kube-api-access-fvvb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:16:09 crc kubenswrapper[4841]: I0313 09:16:09.912525 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1745fdf3-fbb9-4736-a4da-b534d8c208bd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1745fdf3-fbb9-4736-a4da-b534d8c208bd" (UID: "1745fdf3-fbb9-4736-a4da-b534d8c208bd"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.004179 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.004622 4841 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1745fdf3-fbb9-4736-a4da-b534d8c208bd-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.004634 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1745fdf3-fbb9-4736-a4da-b534d8c208bd-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.004643 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvvb9\" (UniqueName: \"kubernetes.io/projected/1745fdf3-fbb9-4736-a4da-b534d8c208bd-kube-api-access-fvvb9\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:10 crc kubenswrapper[4841]: E0313 09:16:10.004738 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:10.504718434 +0000 UTC m=+253.234618665 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.091433 4841 ???:1] "http: TLS handshake error from 192.168.126.11:45756: no serving certificate available for the kubelet" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.105841 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:10 crc kubenswrapper[4841]: E0313 09:16:10.106201 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 09:16:10.606186486 +0000 UTC m=+253.336086667 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.207136 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:10 crc kubenswrapper[4841]: E0313 09:16:10.207648 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 09:16:10.707631668 +0000 UTC m=+253.437531859 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mpnlt" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.211580 4841 patch_prober.go:28] interesting pod/router-default-5444994796-8kvzk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 09:16:10 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Mar 13 09:16:10 crc kubenswrapper[4841]: [+]process-running ok Mar 13 09:16:10 crc kubenswrapper[4841]: healthz check failed Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.211625 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8kvzk" podUID="8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.224893 4841 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-13T09:16:09.281630388Z","Handler":null,"Name":""} Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.231175 4841 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.231212 4841 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: 
kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.308874 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.314962 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.357482 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 09:16:10 crc kubenswrapper[4841]: E0313 09:16:10.358519 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1745fdf3-fbb9-4736-a4da-b534d8c208bd" containerName="collect-profiles" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.358562 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1745fdf3-fbb9-4736-a4da-b534d8c208bd" containerName="collect-profiles" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.358691 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1745fdf3-fbb9-4736-a4da-b534d8c208bd" containerName="collect-profiles" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.361107 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.363249 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.363415 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.364093 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.379848 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4" event={"ID":"1745fdf3-fbb9-4736-a4da-b534d8c208bd","Type":"ContainerDied","Data":"722d1bd2bd758e29410698d60bad89f016b0e2587f1aaf15b9d0b500655e1d5b"} Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.379888 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="722d1bd2bd758e29410698d60bad89f016b0e2587f1aaf15b9d0b500655e1d5b" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.379962 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.381853 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6119bb42-3ca0-46bf-a9f3-e270897e718b","Type":"ContainerStarted","Data":"a9b633f855c7b97d251a9b339b136eb4dcee3853aaef11fcbcec94a37316a1cb"} Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.381878 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6119bb42-3ca0-46bf-a9f3-e270897e718b","Type":"ContainerStarted","Data":"8017a8f91b97c7a637fa398d5a06c3f2adc0ca85f6bc1752fdbf14f9d969d6ef"} Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.384167 4841 generic.go:334] "Generic (PLEG): container finished" podID="cbd9953a-618b-4cd2-806b-c01e07c40fc2" containerID="5b12af26a3525fd6b866a241e444437eee195694fa8814f0edfbc186b955bc96" exitCode=0 Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.384208 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbx6r" event={"ID":"cbd9953a-618b-4cd2-806b-c01e07c40fc2","Type":"ContainerDied","Data":"5b12af26a3525fd6b866a241e444437eee195694fa8814f0edfbc186b955bc96"} Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.386820 4841 generic.go:334] "Generic (PLEG): container finished" podID="81521243-f7e1-4360-9b12-8047988a69dd" containerID="b95c4404ab7c732079660543e94d5af77ba8c5a73e7403dc9c9004b0216ff4e1" exitCode=0 Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.386965 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6qx4" event={"ID":"81521243-f7e1-4360-9b12-8047988a69dd","Type":"ContainerDied","Data":"b95c4404ab7c732079660543e94d5af77ba8c5a73e7403dc9c9004b0216ff4e1"} Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.387001 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6qx4" event={"ID":"81521243-f7e1-4360-9b12-8047988a69dd","Type":"ContainerStarted","Data":"4623e2663d41561119cbb5d6845674d032df2a7ae7abc324b61b0b46cc3a946b"} Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.390221 4841 generic.go:334] "Generic (PLEG): container finished" podID="7f7766c0-82b7-4ee0-878d-47c5b0217e4d" containerID="54cd114d83c7967fd5c295d23e5631457503dc575f88907111b4150a51802749" exitCode=0 Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.390300 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlqmq" event={"ID":"7f7766c0-82b7-4ee0-878d-47c5b0217e4d","Type":"ContainerDied","Data":"54cd114d83c7967fd5c295d23e5631457503dc575f88907111b4150a51802749"} Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.390330 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlqmq" event={"ID":"7f7766c0-82b7-4ee0-878d-47c5b0217e4d","Type":"ContainerStarted","Data":"937c6c449a7b90f48c3a6a435efd8ce8bd0b1dabd0c3c5f0636214322226bf19"} Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.392843 4841 generic.go:334] "Generic (PLEG): container finished" podID="517b279c-79ee-438c-83c2-ae5cd25848fc" containerID="eec84db7439935228706c1131b512e606be6719b442de2b9123f1fa6a89bedc9" exitCode=0 Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.392904 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4lvs" event={"ID":"517b279c-79ee-438c-83c2-ae5cd25848fc","Type":"ContainerDied","Data":"eec84db7439935228706c1131b512e606be6719b442de2b9123f1fa6a89bedc9"} Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.392929 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4lvs" 
event={"ID":"517b279c-79ee-438c-83c2-ae5cd25848fc","Type":"ContainerStarted","Data":"e552413f54b98c9ccaed6791ea7968a8a00a409ed7be01c20fd8a562f2ed53a7"} Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.401519 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-md725" event={"ID":"363c526f-c53a-49f8-88bc-823b1ccde350","Type":"ContainerStarted","Data":"0eaa5dbd1f70b1b89267ac5296c19f21910ad655f78a3ceed8498ae0be3dfb27"} Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.411594 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.413505 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.413484173 podStartE2EDuration="1.413484173s" podCreationTimestamp="2026-03-13 09:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:10.39806252 +0000 UTC m=+253.127962711" watchObservedRunningTime="2026-03-13 09:16:10.413484173 +0000 UTC m=+253.143384364" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.428902 4841 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.428939 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.436744 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4gtt4"] Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.438167 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4gtt4" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.440335 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.450627 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4gtt4"] Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.458873 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mpnlt\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.502286 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-md725" 
podStartSLOduration=11.502254388 podStartE2EDuration="11.502254388s" podCreationTimestamp="2026-03-13 09:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:10.501216065 +0000 UTC m=+253.231116266" watchObservedRunningTime="2026-03-13 09:16:10.502254388 +0000 UTC m=+253.232154579" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.516069 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/05cf1890-dd7f-4216-b6ee-caf5c79e0997-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"05cf1890-dd7f-4216-b6ee-caf5c79e0997\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.516332 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e22cd83-5a44-4048-a618-4c06f3550ace-utilities\") pod \"redhat-marketplace-4gtt4\" (UID: \"6e22cd83-5a44-4048-a618-4c06f3550ace\") " pod="openshift-marketplace/redhat-marketplace-4gtt4" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.516530 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05cf1890-dd7f-4216-b6ee-caf5c79e0997-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"05cf1890-dd7f-4216-b6ee-caf5c79e0997\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.516628 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs6tj\" (UniqueName: \"kubernetes.io/projected/6e22cd83-5a44-4048-a618-4c06f3550ace-kube-api-access-gs6tj\") pod \"redhat-marketplace-4gtt4\" (UID: \"6e22cd83-5a44-4048-a618-4c06f3550ace\") " 
pod="openshift-marketplace/redhat-marketplace-4gtt4" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.516934 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e22cd83-5a44-4048-a618-4c06f3550ace-catalog-content\") pod \"redhat-marketplace-4gtt4\" (UID: \"6e22cd83-5a44-4048-a618-4c06f3550ace\") " pod="openshift-marketplace/redhat-marketplace-4gtt4" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.618916 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e22cd83-5a44-4048-a618-4c06f3550ace-catalog-content\") pod \"redhat-marketplace-4gtt4\" (UID: \"6e22cd83-5a44-4048-a618-4c06f3550ace\") " pod="openshift-marketplace/redhat-marketplace-4gtt4" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.618984 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/05cf1890-dd7f-4216-b6ee-caf5c79e0997-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"05cf1890-dd7f-4216-b6ee-caf5c79e0997\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.619006 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e22cd83-5a44-4048-a618-4c06f3550ace-utilities\") pod \"redhat-marketplace-4gtt4\" (UID: \"6e22cd83-5a44-4048-a618-4c06f3550ace\") " pod="openshift-marketplace/redhat-marketplace-4gtt4" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.619051 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05cf1890-dd7f-4216-b6ee-caf5c79e0997-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"05cf1890-dd7f-4216-b6ee-caf5c79e0997\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.619072 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs6tj\" (UniqueName: \"kubernetes.io/projected/6e22cd83-5a44-4048-a618-4c06f3550ace-kube-api-access-gs6tj\") pod \"redhat-marketplace-4gtt4\" (UID: \"6e22cd83-5a44-4048-a618-4c06f3550ace\") " pod="openshift-marketplace/redhat-marketplace-4gtt4" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.619847 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e22cd83-5a44-4048-a618-4c06f3550ace-catalog-content\") pod \"redhat-marketplace-4gtt4\" (UID: \"6e22cd83-5a44-4048-a618-4c06f3550ace\") " pod="openshift-marketplace/redhat-marketplace-4gtt4" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.619910 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/05cf1890-dd7f-4216-b6ee-caf5c79e0997-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"05cf1890-dd7f-4216-b6ee-caf5c79e0997\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.620169 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e22cd83-5a44-4048-a618-4c06f3550ace-utilities\") pod \"redhat-marketplace-4gtt4\" (UID: \"6e22cd83-5a44-4048-a618-4c06f3550ace\") " pod="openshift-marketplace/redhat-marketplace-4gtt4" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.637872 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05cf1890-dd7f-4216-b6ee-caf5c79e0997-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"05cf1890-dd7f-4216-b6ee-caf5c79e0997\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 09:16:10 
crc kubenswrapper[4841]: I0313 09:16:10.638776 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs6tj\" (UniqueName: \"kubernetes.io/projected/6e22cd83-5a44-4048-a618-4c06f3550ace-kube-api-access-gs6tj\") pod \"redhat-marketplace-4gtt4\" (UID: \"6e22cd83-5a44-4048-a618-4c06f3550ace\") " pod="openshift-marketplace/redhat-marketplace-4gtt4" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.642387 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.715782 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.775686 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4gtt4" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.813339 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qzfct"] Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.817881 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzfct" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.820332 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzfct"] Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.924781 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9de2815-2636-4e04-adf5-a7efd285781f-catalog-content\") pod \"redhat-marketplace-qzfct\" (UID: \"f9de2815-2636-4e04-adf5-a7efd285781f\") " pod="openshift-marketplace/redhat-marketplace-qzfct" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.924832 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9de2815-2636-4e04-adf5-a7efd285781f-utilities\") pod \"redhat-marketplace-qzfct\" (UID: \"f9de2815-2636-4e04-adf5-a7efd285781f\") " pod="openshift-marketplace/redhat-marketplace-qzfct" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.924858 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crwfv\" (UniqueName: \"kubernetes.io/projected/f9de2815-2636-4e04-adf5-a7efd285781f-kube-api-access-crwfv\") pod \"redhat-marketplace-qzfct\" (UID: \"f9de2815-2636-4e04-adf5-a7efd285781f\") " pod="openshift-marketplace/redhat-marketplace-qzfct" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.951969 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" Mar 13 09:16:10 crc kubenswrapper[4841]: I0313 09:16:10.956106 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-nl9wf" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.030058 4841 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9de2815-2636-4e04-adf5-a7efd285781f-utilities\") pod \"redhat-marketplace-qzfct\" (UID: \"f9de2815-2636-4e04-adf5-a7efd285781f\") " pod="openshift-marketplace/redhat-marketplace-qzfct" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.030111 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crwfv\" (UniqueName: \"kubernetes.io/projected/f9de2815-2636-4e04-adf5-a7efd285781f-kube-api-access-crwfv\") pod \"redhat-marketplace-qzfct\" (UID: \"f9de2815-2636-4e04-adf5-a7efd285781f\") " pod="openshift-marketplace/redhat-marketplace-qzfct" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.030377 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9de2815-2636-4e04-adf5-a7efd285781f-catalog-content\") pod \"redhat-marketplace-qzfct\" (UID: \"f9de2815-2636-4e04-adf5-a7efd285781f\") " pod="openshift-marketplace/redhat-marketplace-qzfct" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.031528 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9de2815-2636-4e04-adf5-a7efd285781f-catalog-content\") pod \"redhat-marketplace-qzfct\" (UID: \"f9de2815-2636-4e04-adf5-a7efd285781f\") " pod="openshift-marketplace/redhat-marketplace-qzfct" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.031593 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9de2815-2636-4e04-adf5-a7efd285781f-utilities\") pod \"redhat-marketplace-qzfct\" (UID: \"f9de2815-2636-4e04-adf5-a7efd285781f\") " pod="openshift-marketplace/redhat-marketplace-qzfct" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.065721 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crwfv\" (UniqueName: 
\"kubernetes.io/projected/f9de2815-2636-4e04-adf5-a7efd285781f-kube-api-access-crwfv\") pod \"redhat-marketplace-qzfct\" (UID: \"f9de2815-2636-4e04-adf5-a7efd285781f\") " pod="openshift-marketplace/redhat-marketplace-qzfct" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.076932 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mpnlt"] Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.085085 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4gtt4"] Mar 13 09:16:11 crc kubenswrapper[4841]: W0313 09:16:11.106354 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a2c93c5_5f1b_41d3_92eb_9f91fcc15176.slice/crio-05541228cfd207e31b667146d299479dadeb0a7c642586fa78b061cf65d84e7f WatchSource:0}: Error finding container 05541228cfd207e31b667146d299479dadeb0a7c642586fa78b061cf65d84e7f: Status 404 returned error can't find the container with id 05541228cfd207e31b667146d299479dadeb0a7c642586fa78b061cf65d84e7f Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.159843 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.167867 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzfct" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.209186 4841 patch_prober.go:28] interesting pod/router-default-5444994796-8kvzk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 09:16:11 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Mar 13 09:16:11 crc kubenswrapper[4841]: [+]process-running ok Mar 13 09:16:11 crc kubenswrapper[4841]: healthz check failed Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.209231 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8kvzk" podUID="8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.411861 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4gtt4" event={"ID":"6e22cd83-5a44-4048-a618-4c06f3550ace","Type":"ContainerStarted","Data":"44c6a743851b1ecaefa580de453f322e764771723129c27b26136940e93c399b"} Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.413075 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w95m9"] Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.414394 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w95m9" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.414652 4841 generic.go:334] "Generic (PLEG): container finished" podID="6119bb42-3ca0-46bf-a9f3-e270897e718b" containerID="a9b633f855c7b97d251a9b339b136eb4dcee3853aaef11fcbcec94a37316a1cb" exitCode=0 Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.414698 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6119bb42-3ca0-46bf-a9f3-e270897e718b","Type":"ContainerDied","Data":"a9b633f855c7b97d251a9b339b136eb4dcee3853aaef11fcbcec94a37316a1cb"} Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.417562 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"05cf1890-dd7f-4216-b6ee-caf5c79e0997","Type":"ContainerStarted","Data":"7c6ae07403228484dc307e5f132bcc739fd0ca204b145fc837c69efa0df3ea6d"} Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.417721 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.419848 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" event={"ID":"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176","Type":"ContainerStarted","Data":"05541228cfd207e31b667146d299479dadeb0a7c642586fa78b061cf65d84e7f"} Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.442730 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w95m9"] Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.508007 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-lrsts container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" 
start-of-body= Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.508060 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-lrsts" podUID="ede1a06d-6caf-41ae-acfa-2335821a2e0e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.508067 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-lrsts container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.508109 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lrsts" podUID="ede1a06d-6caf-41ae-acfa-2335821a2e0e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.541494 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67d2682a-a65f-42e2-875a-b4247bfff054-catalog-content\") pod \"redhat-operators-w95m9\" (UID: \"67d2682a-a65f-42e2-875a-b4247bfff054\") " pod="openshift-marketplace/redhat-operators-w95m9" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.541559 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67d2682a-a65f-42e2-875a-b4247bfff054-utilities\") pod \"redhat-operators-w95m9\" (UID: \"67d2682a-a65f-42e2-875a-b4247bfff054\") " pod="openshift-marketplace/redhat-operators-w95m9" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.541596 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8r79\" (UniqueName: \"kubernetes.io/projected/67d2682a-a65f-42e2-875a-b4247bfff054-kube-api-access-s8r79\") pod \"redhat-operators-w95m9\" (UID: \"67d2682a-a65f-42e2-875a-b4247bfff054\") " pod="openshift-marketplace/redhat-operators-w95m9" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.643608 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67d2682a-a65f-42e2-875a-b4247bfff054-utilities\") pod \"redhat-operators-w95m9\" (UID: \"67d2682a-a65f-42e2-875a-b4247bfff054\") " pod="openshift-marketplace/redhat-operators-w95m9" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.643861 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8r79\" (UniqueName: \"kubernetes.io/projected/67d2682a-a65f-42e2-875a-b4247bfff054-kube-api-access-s8r79\") pod \"redhat-operators-w95m9\" (UID: \"67d2682a-a65f-42e2-875a-b4247bfff054\") " pod="openshift-marketplace/redhat-operators-w95m9" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.643960 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67d2682a-a65f-42e2-875a-b4247bfff054-catalog-content\") pod \"redhat-operators-w95m9\" (UID: \"67d2682a-a65f-42e2-875a-b4247bfff054\") " pod="openshift-marketplace/redhat-operators-w95m9" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.644285 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67d2682a-a65f-42e2-875a-b4247bfff054-utilities\") pod \"redhat-operators-w95m9\" (UID: \"67d2682a-a65f-42e2-875a-b4247bfff054\") " pod="openshift-marketplace/redhat-operators-w95m9" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.644453 4841 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67d2682a-a65f-42e2-875a-b4247bfff054-catalog-content\") pod \"redhat-operators-w95m9\" (UID: \"67d2682a-a65f-42e2-875a-b4247bfff054\") " pod="openshift-marketplace/redhat-operators-w95m9" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.662363 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8r79\" (UniqueName: \"kubernetes.io/projected/67d2682a-a65f-42e2-875a-b4247bfff054-kube-api-access-s8r79\") pod \"redhat-operators-w95m9\" (UID: \"67d2682a-a65f-42e2-875a-b4247bfff054\") " pod="openshift-marketplace/redhat-operators-w95m9" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.713197 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzfct"] Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.732761 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w95m9" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.756364 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.756400 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.768344 4841 patch_prober.go:28] interesting pod/console-f9d7485db-v6vhc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.768388 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-v6vhc" podUID="60193856-8a3f-4ce0-b79d-44e58de19b06" containerName="console" probeResult="failure" output="Get 
\"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.811469 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h8khp"] Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.814813 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h8khp" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.821128 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h8khp"] Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.880184 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.963148 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13abb343-a494-425b-9b86-ea0d41df98a5-utilities\") pod \"redhat-operators-h8khp\" (UID: \"13abb343-a494-425b-9b86-ea0d41df98a5\") " pod="openshift-marketplace/redhat-operators-h8khp" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.963198 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rf9b\" (UniqueName: \"kubernetes.io/projected/13abb343-a494-425b-9b86-ea0d41df98a5-kube-api-access-2rf9b\") pod \"redhat-operators-h8khp\" (UID: \"13abb343-a494-425b-9b86-ea0d41df98a5\") " pod="openshift-marketplace/redhat-operators-h8khp" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.963228 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13abb343-a494-425b-9b86-ea0d41df98a5-catalog-content\") pod \"redhat-operators-h8khp\" (UID: \"13abb343-a494-425b-9b86-ea0d41df98a5\") 
" pod="openshift-marketplace/redhat-operators-h8khp" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.963364 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.963388 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.963430 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.963466 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.970925 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 13 09:16:11 
crc kubenswrapper[4841]: I0313 09:16:11.973173 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.976725 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.979231 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.989166 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.991091 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.998149 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:16:11 crc kubenswrapper[4841]: I0313 09:16:11.999294 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.020606 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.025709 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.039038 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.049811 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.064445 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13abb343-a494-425b-9b86-ea0d41df98a5-utilities\") pod \"redhat-operators-h8khp\" (UID: \"13abb343-a494-425b-9b86-ea0d41df98a5\") " pod="openshift-marketplace/redhat-operators-h8khp" Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.064504 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rf9b\" (UniqueName: \"kubernetes.io/projected/13abb343-a494-425b-9b86-ea0d41df98a5-kube-api-access-2rf9b\") pod \"redhat-operators-h8khp\" (UID: \"13abb343-a494-425b-9b86-ea0d41df98a5\") " pod="openshift-marketplace/redhat-operators-h8khp" Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.064538 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13abb343-a494-425b-9b86-ea0d41df98a5-catalog-content\") pod \"redhat-operators-h8khp\" (UID: \"13abb343-a494-425b-9b86-ea0d41df98a5\") " pod="openshift-marketplace/redhat-operators-h8khp" Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.064989 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13abb343-a494-425b-9b86-ea0d41df98a5-catalog-content\") pod \"redhat-operators-h8khp\" (UID: \"13abb343-a494-425b-9b86-ea0d41df98a5\") " pod="openshift-marketplace/redhat-operators-h8khp" Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.065025 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13abb343-a494-425b-9b86-ea0d41df98a5-utilities\") pod \"redhat-operators-h8khp\" (UID: \"13abb343-a494-425b-9b86-ea0d41df98a5\") " 
pod="openshift-marketplace/redhat-operators-h8khp" Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.093225 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rf9b\" (UniqueName: \"kubernetes.io/projected/13abb343-a494-425b-9b86-ea0d41df98a5-kube-api-access-2rf9b\") pod \"redhat-operators-h8khp\" (UID: \"13abb343-a494-425b-9b86-ea0d41df98a5\") " pod="openshift-marketplace/redhat-operators-h8khp" Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.165584 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs\") pod \"network-metrics-daemon-5t7sg\" (UID: \"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\") " pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.167648 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.176003 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h8khp" Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.179514 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea07a392-2a1f-4bae-bb67-db7cd421c1e1-metrics-certs\") pod \"network-metrics-daemon-5t7sg\" (UID: \"ea07a392-2a1f-4bae-bb67-db7cd421c1e1\") " pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.207381 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-8kvzk" Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.212101 4841 patch_prober.go:28] interesting pod/router-default-5444994796-8kvzk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 09:16:12 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Mar 13 09:16:12 crc kubenswrapper[4841]: [+]process-running ok Mar 13 09:16:12 crc kubenswrapper[4841]: healthz check failed Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.212139 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8kvzk" podUID="8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.414218 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.421902 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5t7sg" Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.443317 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" event={"ID":"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176","Type":"ContainerStarted","Data":"c4dbedede92f74070b178c3e976c30521923b17f704e27fdb772fd4ca321008f"} Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.444091 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.461803 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" podStartSLOduration=185.461788452 podStartE2EDuration="3m5.461788452s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:12.458041664 +0000 UTC m=+255.187941845" watchObservedRunningTime="2026-03-13 09:16:12.461788452 +0000 UTC m=+255.191688643" Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.472725 4841 generic.go:334] "Generic (PLEG): container finished" podID="6e22cd83-5a44-4048-a618-4c06f3550ace" containerID="2da51e28574ea04f227eafd23757b04738a3d44b6b2b97f090a3f0044b43f499" exitCode=0 Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.472791 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4gtt4" event={"ID":"6e22cd83-5a44-4048-a618-4c06f3550ace","Type":"ContainerDied","Data":"2da51e28574ea04f227eafd23757b04738a3d44b6b2b97f090a3f0044b43f499"} Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.487069 4841 generic.go:334] "Generic (PLEG): container finished" podID="05cf1890-dd7f-4216-b6ee-caf5c79e0997" 
containerID="68f4d6e847a183c8ea6135d4a9dc3d3efb32a0342721e9a4b450d859c1f452d6" exitCode=0 Mar 13 09:16:12 crc kubenswrapper[4841]: I0313 09:16:12.487353 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"05cf1890-dd7f-4216-b6ee-caf5c79e0997","Type":"ContainerDied","Data":"68f4d6e847a183c8ea6135d4a9dc3d3efb32a0342721e9a4b450d859c1f452d6"} Mar 13 09:16:13 crc kubenswrapper[4841]: I0313 09:16:13.208821 4841 patch_prober.go:28] interesting pod/router-default-5444994796-8kvzk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 09:16:13 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Mar 13 09:16:13 crc kubenswrapper[4841]: [+]process-running ok Mar 13 09:16:13 crc kubenswrapper[4841]: healthz check failed Mar 13 09:16:13 crc kubenswrapper[4841]: I0313 09:16:13.208870 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8kvzk" podUID="8ec2bfef-c455-40b0-92a3-43f9fcd2bd8f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 09:16:14 crc kubenswrapper[4841]: I0313 09:16:14.236362 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-8kvzk" Mar 13 09:16:14 crc kubenswrapper[4841]: I0313 09:16:14.248802 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-8kvzk" Mar 13 09:16:14 crc kubenswrapper[4841]: I0313 09:16:14.756667 4841 ???:1] "http: TLS handshake error from 192.168.126.11:45768: no serving certificate available for the kubelet" Mar 13 09:16:15 crc kubenswrapper[4841]: I0313 09:16:15.230386 4841 ???:1] "http: TLS handshake error from 192.168.126.11:45778: no serving certificate available for the kubelet" Mar 13 09:16:17 
crc kubenswrapper[4841]: I0313 09:16:17.330625 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-bbjqp" Mar 13 09:16:21 crc kubenswrapper[4841]: I0313 09:16:21.516301 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-lrsts" Mar 13 09:16:21 crc kubenswrapper[4841]: I0313 09:16:21.761717 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:16:21 crc kubenswrapper[4841]: I0313 09:16:21.765312 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:16:23 crc kubenswrapper[4841]: W0313 09:16:23.942988 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9de2815_2636_4e04_adf5_a7efd285781f.slice/crio-76ae43ebc1bcf252b67d0bc0a5d13e443c54af2a77f7838bca5d4ee9c04dd8e5 WatchSource:0}: Error finding container 76ae43ebc1bcf252b67d0bc0a5d13e443c54af2a77f7838bca5d4ee9c04dd8e5: Status 404 returned error can't find the container with id 76ae43ebc1bcf252b67d0bc0a5d13e443c54af2a77f7838bca5d4ee9c04dd8e5 Mar 13 09:16:24 crc kubenswrapper[4841]: I0313 09:16:24.006827 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 09:16:24 crc kubenswrapper[4841]: I0313 09:16:24.065258 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6119bb42-3ca0-46bf-a9f3-e270897e718b-kube-api-access\") pod \"6119bb42-3ca0-46bf-a9f3-e270897e718b\" (UID: \"6119bb42-3ca0-46bf-a9f3-e270897e718b\") " Mar 13 09:16:24 crc kubenswrapper[4841]: I0313 09:16:24.065487 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6119bb42-3ca0-46bf-a9f3-e270897e718b-kubelet-dir\") pod \"6119bb42-3ca0-46bf-a9f3-e270897e718b\" (UID: \"6119bb42-3ca0-46bf-a9f3-e270897e718b\") " Mar 13 09:16:24 crc kubenswrapper[4841]: I0313 09:16:24.065575 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6119bb42-3ca0-46bf-a9f3-e270897e718b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6119bb42-3ca0-46bf-a9f3-e270897e718b" (UID: "6119bb42-3ca0-46bf-a9f3-e270897e718b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:16:24 crc kubenswrapper[4841]: I0313 09:16:24.065798 4841 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6119bb42-3ca0-46bf-a9f3-e270897e718b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:24 crc kubenswrapper[4841]: I0313 09:16:24.075435 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6119bb42-3ca0-46bf-a9f3-e270897e718b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6119bb42-3ca0-46bf-a9f3-e270897e718b" (UID: "6119bb42-3ca0-46bf-a9f3-e270897e718b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:16:24 crc kubenswrapper[4841]: I0313 09:16:24.167479 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6119bb42-3ca0-46bf-a9f3-e270897e718b-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:24 crc kubenswrapper[4841]: I0313 09:16:24.609689 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzfct" event={"ID":"f9de2815-2636-4e04-adf5-a7efd285781f","Type":"ContainerStarted","Data":"76ae43ebc1bcf252b67d0bc0a5d13e443c54af2a77f7838bca5d4ee9c04dd8e5"} Mar 13 09:16:24 crc kubenswrapper[4841]: I0313 09:16:24.610915 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6119bb42-3ca0-46bf-a9f3-e270897e718b","Type":"ContainerDied","Data":"8017a8f91b97c7a637fa398d5a06c3f2adc0ca85f6bc1752fdbf14f9d969d6ef"} Mar 13 09:16:24 crc kubenswrapper[4841]: I0313 09:16:24.610940 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8017a8f91b97c7a637fa398d5a06c3f2adc0ca85f6bc1752fdbf14f9d969d6ef" Mar 13 09:16:24 crc kubenswrapper[4841]: I0313 09:16:24.611013 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 09:16:24 crc kubenswrapper[4841]: I0313 09:16:24.683190 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-845b9c77f6-lw7jf"] Mar 13 09:16:24 crc kubenswrapper[4841]: I0313 09:16:24.683389 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" podUID="55eac427-cd92-4af6-bb8d-be8a54bbb69f" containerName="controller-manager" containerID="cri-o://caf341180a02d40996f5d28f986b0973d1f01ccf867cd341125b8c548b24f718" gracePeriod=30 Mar 13 09:16:24 crc kubenswrapper[4841]: I0313 09:16:24.702644 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f"] Mar 13 09:16:24 crc kubenswrapper[4841]: I0313 09:16:24.703354 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" podUID="78895b4a-b94c-4138-a2eb-14066f4c08e2" containerName="route-controller-manager" containerID="cri-o://0918dbc19a22c179e6c0b3f7ea75596c2206c255d9ed43bc42418794d35b53c6" gracePeriod=30 Mar 13 09:16:24 crc kubenswrapper[4841]: E0313 09:16:24.897236 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 13 09:16:24 crc kubenswrapper[4841]: E0313 09:16:24.897652 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 09:16:24 crc kubenswrapper[4841]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 13 
09:16:24 crc kubenswrapper[4841]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wpr2t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29556556-c9nft_openshift-infra(f5961bba-4ec3-4b4e-b5a2-73aa1024326e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 13 09:16:24 crc kubenswrapper[4841]: > logger="UnhandledError" Mar 13 09:16:24 crc kubenswrapper[4841]: E0313 09:16:24.899655 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29556556-c9nft" podUID="f5961bba-4ec3-4b4e-b5a2-73aa1024326e" Mar 13 09:16:25 crc kubenswrapper[4841]: E0313 09:16:25.123497 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 13 09:16:25 crc kubenswrapper[4841]: E0313 09:16:25.123609 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 09:16:25 crc kubenswrapper[4841]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o 
go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 13 09:16:25 crc kubenswrapper[4841]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q9wrv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29556554-2dgrd_openshift-infra(344d8ef8-1693-4dcc-b09d-dc9fc7c041cc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 13 09:16:25 crc kubenswrapper[4841]: > logger="UnhandledError" Mar 13 09:16:25 crc kubenswrapper[4841]: E0313 09:16:25.124728 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29556554-2dgrd" podUID="344d8ef8-1693-4dcc-b09d-dc9fc7c041cc" Mar 13 09:16:25 crc kubenswrapper[4841]: I0313 09:16:25.171649 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 09:16:25 crc kubenswrapper[4841]: I0313 09:16:25.281708 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/05cf1890-dd7f-4216-b6ee-caf5c79e0997-kubelet-dir\") pod \"05cf1890-dd7f-4216-b6ee-caf5c79e0997\" (UID: \"05cf1890-dd7f-4216-b6ee-caf5c79e0997\") " Mar 13 09:16:25 crc kubenswrapper[4841]: I0313 09:16:25.282238 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05cf1890-dd7f-4216-b6ee-caf5c79e0997-kube-api-access\") pod \"05cf1890-dd7f-4216-b6ee-caf5c79e0997\" (UID: \"05cf1890-dd7f-4216-b6ee-caf5c79e0997\") " Mar 13 09:16:25 crc kubenswrapper[4841]: I0313 09:16:25.281848 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05cf1890-dd7f-4216-b6ee-caf5c79e0997-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "05cf1890-dd7f-4216-b6ee-caf5c79e0997" (UID: "05cf1890-dd7f-4216-b6ee-caf5c79e0997"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:16:25 crc kubenswrapper[4841]: I0313 09:16:25.282525 4841 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/05cf1890-dd7f-4216-b6ee-caf5c79e0997-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:25 crc kubenswrapper[4841]: I0313 09:16:25.287590 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05cf1890-dd7f-4216-b6ee-caf5c79e0997-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "05cf1890-dd7f-4216-b6ee-caf5c79e0997" (UID: "05cf1890-dd7f-4216-b6ee-caf5c79e0997"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:16:25 crc kubenswrapper[4841]: I0313 09:16:25.383609 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05cf1890-dd7f-4216-b6ee-caf5c79e0997-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:25 crc kubenswrapper[4841]: I0313 09:16:25.626989 4841 generic.go:334] "Generic (PLEG): container finished" podID="78895b4a-b94c-4138-a2eb-14066f4c08e2" containerID="0918dbc19a22c179e6c0b3f7ea75596c2206c255d9ed43bc42418794d35b53c6" exitCode=0 Mar 13 09:16:25 crc kubenswrapper[4841]: I0313 09:16:25.627089 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" event={"ID":"78895b4a-b94c-4138-a2eb-14066f4c08e2","Type":"ContainerDied","Data":"0918dbc19a22c179e6c0b3f7ea75596c2206c255d9ed43bc42418794d35b53c6"} Mar 13 09:16:25 crc kubenswrapper[4841]: I0313 09:16:25.629318 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 09:16:25 crc kubenswrapper[4841]: I0313 09:16:25.629304 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"05cf1890-dd7f-4216-b6ee-caf5c79e0997","Type":"ContainerDied","Data":"7c6ae07403228484dc307e5f132bcc739fd0ca204b145fc837c69efa0df3ea6d"} Mar 13 09:16:25 crc kubenswrapper[4841]: I0313 09:16:25.629394 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c6ae07403228484dc307e5f132bcc739fd0ca204b145fc837c69efa0df3ea6d" Mar 13 09:16:25 crc kubenswrapper[4841]: I0313 09:16:25.632791 4841 generic.go:334] "Generic (PLEG): container finished" podID="55eac427-cd92-4af6-bb8d-be8a54bbb69f" containerID="caf341180a02d40996f5d28f986b0973d1f01ccf867cd341125b8c548b24f718" exitCode=0 Mar 13 09:16:25 crc kubenswrapper[4841]: I0313 09:16:25.632889 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" event={"ID":"55eac427-cd92-4af6-bb8d-be8a54bbb69f","Type":"ContainerDied","Data":"caf341180a02d40996f5d28f986b0973d1f01ccf867cd341125b8c548b24f718"} Mar 13 09:16:25 crc kubenswrapper[4841]: E0313 09:16:25.634610 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29556556-c9nft" podUID="f5961bba-4ec3-4b4e-b5a2-73aa1024326e" Mar 13 09:16:25 crc kubenswrapper[4841]: E0313 09:16:25.634907 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29556554-2dgrd" podUID="344d8ef8-1693-4dcc-b09d-dc9fc7c041cc" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 
09:16:28.285735 4841 patch_prober.go:28] interesting pod/controller-manager-845b9c77f6-lw7jf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.286336 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" podUID="55eac427-cd92-4af6-bb8d-be8a54bbb69f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.311171 4841 patch_prober.go:28] interesting pod/route-controller-manager-78bbbbbf4b-w6p8f container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.311499 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" podUID="78895b4a-b94c-4138-a2eb-14066f4c08e2" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.573016 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.618937 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.628299 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55eac427-cd92-4af6-bb8d-be8a54bbb69f-config\") pod \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\" (UID: \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\") " Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.628372 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78895b4a-b94c-4138-a2eb-14066f4c08e2-client-ca\") pod \"78895b4a-b94c-4138-a2eb-14066f4c08e2\" (UID: \"78895b4a-b94c-4138-a2eb-14066f4c08e2\") " Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.628407 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78895b4a-b94c-4138-a2eb-14066f4c08e2-serving-cert\") pod \"78895b4a-b94c-4138-a2eb-14066f4c08e2\" (UID: \"78895b4a-b94c-4138-a2eb-14066f4c08e2\") " Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.628479 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78895b4a-b94c-4138-a2eb-14066f4c08e2-config\") pod \"78895b4a-b94c-4138-a2eb-14066f4c08e2\" (UID: \"78895b4a-b94c-4138-a2eb-14066f4c08e2\") " Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.628554 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl9p5\" (UniqueName: \"kubernetes.io/projected/55eac427-cd92-4af6-bb8d-be8a54bbb69f-kube-api-access-fl9p5\") pod 
\"55eac427-cd92-4af6-bb8d-be8a54bbb69f\" (UID: \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\") " Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.628640 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55eac427-cd92-4af6-bb8d-be8a54bbb69f-serving-cert\") pod \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\" (UID: \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\") " Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.628667 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/55eac427-cd92-4af6-bb8d-be8a54bbb69f-client-ca\") pod \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\" (UID: \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\") " Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.628721 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz2mr\" (UniqueName: \"kubernetes.io/projected/78895b4a-b94c-4138-a2eb-14066f4c08e2-kube-api-access-xz2mr\") pod \"78895b4a-b94c-4138-a2eb-14066f4c08e2\" (UID: \"78895b4a-b94c-4138-a2eb-14066f4c08e2\") " Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.628744 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/55eac427-cd92-4af6-bb8d-be8a54bbb69f-proxy-ca-bundles\") pod \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\" (UID: \"55eac427-cd92-4af6-bb8d-be8a54bbb69f\") " Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.630087 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78895b4a-b94c-4138-a2eb-14066f4c08e2-client-ca" (OuterVolumeSpecName: "client-ca") pod "78895b4a-b94c-4138-a2eb-14066f4c08e2" (UID: "78895b4a-b94c-4138-a2eb-14066f4c08e2"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.630111 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55eac427-cd92-4af6-bb8d-be8a54bbb69f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "55eac427-cd92-4af6-bb8d-be8a54bbb69f" (UID: "55eac427-cd92-4af6-bb8d-be8a54bbb69f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.630145 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55eac427-cd92-4af6-bb8d-be8a54bbb69f-client-ca" (OuterVolumeSpecName: "client-ca") pod "55eac427-cd92-4af6-bb8d-be8a54bbb69f" (UID: "55eac427-cd92-4af6-bb8d-be8a54bbb69f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.630334 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78895b4a-b94c-4138-a2eb-14066f4c08e2-config" (OuterVolumeSpecName: "config") pod "78895b4a-b94c-4138-a2eb-14066f4c08e2" (UID: "78895b4a-b94c-4138-a2eb-14066f4c08e2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.630393 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55eac427-cd92-4af6-bb8d-be8a54bbb69f-config" (OuterVolumeSpecName: "config") pod "55eac427-cd92-4af6-bb8d-be8a54bbb69f" (UID: "55eac427-cd92-4af6-bb8d-be8a54bbb69f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.635394 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78895b4a-b94c-4138-a2eb-14066f4c08e2-kube-api-access-xz2mr" (OuterVolumeSpecName: "kube-api-access-xz2mr") pod "78895b4a-b94c-4138-a2eb-14066f4c08e2" (UID: "78895b4a-b94c-4138-a2eb-14066f4c08e2"). InnerVolumeSpecName "kube-api-access-xz2mr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.635618 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55eac427-cd92-4af6-bb8d-be8a54bbb69f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "55eac427-cd92-4af6-bb8d-be8a54bbb69f" (UID: "55eac427-cd92-4af6-bb8d-be8a54bbb69f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.636463 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4"] Mar 13 09:16:28 crc kubenswrapper[4841]: E0313 09:16:28.636744 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05cf1890-dd7f-4216-b6ee-caf5c79e0997" containerName="pruner" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.636761 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="05cf1890-dd7f-4216-b6ee-caf5c79e0997" containerName="pruner" Mar 13 09:16:28 crc kubenswrapper[4841]: E0313 09:16:28.636774 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78895b4a-b94c-4138-a2eb-14066f4c08e2" containerName="route-controller-manager" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.636781 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="78895b4a-b94c-4138-a2eb-14066f4c08e2" containerName="route-controller-manager" Mar 13 09:16:28 crc kubenswrapper[4841]: E0313 
09:16:28.636794 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55eac427-cd92-4af6-bb8d-be8a54bbb69f" containerName="controller-manager" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.636802 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="55eac427-cd92-4af6-bb8d-be8a54bbb69f" containerName="controller-manager" Mar 13 09:16:28 crc kubenswrapper[4841]: E0313 09:16:28.636809 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6119bb42-3ca0-46bf-a9f3-e270897e718b" containerName="pruner" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.636816 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="6119bb42-3ca0-46bf-a9f3-e270897e718b" containerName="pruner" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.636954 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="78895b4a-b94c-4138-a2eb-14066f4c08e2" containerName="route-controller-manager" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.636973 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="6119bb42-3ca0-46bf-a9f3-e270897e718b" containerName="pruner" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.636984 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="55eac427-cd92-4af6-bb8d-be8a54bbb69f" containerName="controller-manager" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.636994 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="05cf1890-dd7f-4216-b6ee-caf5c79e0997" containerName="pruner" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.637804 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.638820 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55eac427-cd92-4af6-bb8d-be8a54bbb69f-kube-api-access-fl9p5" (OuterVolumeSpecName: "kube-api-access-fl9p5") pod "55eac427-cd92-4af6-bb8d-be8a54bbb69f" (UID: "55eac427-cd92-4af6-bb8d-be8a54bbb69f"). InnerVolumeSpecName "kube-api-access-fl9p5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.646345 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4"] Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.649849 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78895b4a-b94c-4138-a2eb-14066f4c08e2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "78895b4a-b94c-4138-a2eb-14066f4c08e2" (UID: "78895b4a-b94c-4138-a2eb-14066f4c08e2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.672863 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" event={"ID":"55eac427-cd92-4af6-bb8d-be8a54bbb69f","Type":"ContainerDied","Data":"41a3577e44e39d21dae34ca9a6767209c8482f7af2f789f73e4d84d4692d15cf"} Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.672922 4841 scope.go:117] "RemoveContainer" containerID="caf341180a02d40996f5d28f986b0973d1f01ccf867cd341125b8c548b24f718" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.673078 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-845b9c77f6-lw7jf" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.685517 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" event={"ID":"78895b4a-b94c-4138-a2eb-14066f4c08e2","Type":"ContainerDied","Data":"6a3c0e93065725f8c8b3dec8b09a178c0ad9777130eae301a40c21fb6912d4df"} Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.685642 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.717708 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-845b9c77f6-lw7jf"] Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.730739 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83e95392-2d00-4cf2-a34d-7a93508e9e84-client-ca\") pod \"route-controller-manager-79c9b46b5-dzkt4\" (UID: \"83e95392-2d00-4cf2-a34d-7a93508e9e84\") " pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.730782 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e95392-2d00-4cf2-a34d-7a93508e9e84-config\") pod \"route-controller-manager-79c9b46b5-dzkt4\" (UID: \"83e95392-2d00-4cf2-a34d-7a93508e9e84\") " pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.730805 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/83e95392-2d00-4cf2-a34d-7a93508e9e84-serving-cert\") pod \"route-controller-manager-79c9b46b5-dzkt4\" (UID: \"83e95392-2d00-4cf2-a34d-7a93508e9e84\") " pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.730827 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbksp\" (UniqueName: \"kubernetes.io/projected/83e95392-2d00-4cf2-a34d-7a93508e9e84-kube-api-access-jbksp\") pod \"route-controller-manager-79c9b46b5-dzkt4\" (UID: \"83e95392-2d00-4cf2-a34d-7a93508e9e84\") " pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.730864 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55eac427-cd92-4af6-bb8d-be8a54bbb69f-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.730874 4841 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78895b4a-b94c-4138-a2eb-14066f4c08e2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.730883 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78895b4a-b94c-4138-a2eb-14066f4c08e2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.730891 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78895b4a-b94c-4138-a2eb-14066f4c08e2-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.730899 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl9p5\" (UniqueName: \"kubernetes.io/projected/55eac427-cd92-4af6-bb8d-be8a54bbb69f-kube-api-access-fl9p5\") 
on node \"crc\" DevicePath \"\"" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.730907 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55eac427-cd92-4af6-bb8d-be8a54bbb69f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.730914 4841 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/55eac427-cd92-4af6-bb8d-be8a54bbb69f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.730922 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz2mr\" (UniqueName: \"kubernetes.io/projected/78895b4a-b94c-4138-a2eb-14066f4c08e2-kube-api-access-xz2mr\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.730930 4841 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/55eac427-cd92-4af6-bb8d-be8a54bbb69f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.733721 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-845b9c77f6-lw7jf"] Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.742661 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f"] Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.746308 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78bbbbbf4b-w6p8f"] Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.773798 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h8khp"] Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.831729 4841 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83e95392-2d00-4cf2-a34d-7a93508e9e84-client-ca\") pod \"route-controller-manager-79c9b46b5-dzkt4\" (UID: \"83e95392-2d00-4cf2-a34d-7a93508e9e84\") " pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.831780 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e95392-2d00-4cf2-a34d-7a93508e9e84-config\") pod \"route-controller-manager-79c9b46b5-dzkt4\" (UID: \"83e95392-2d00-4cf2-a34d-7a93508e9e84\") " pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.831802 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83e95392-2d00-4cf2-a34d-7a93508e9e84-serving-cert\") pod \"route-controller-manager-79c9b46b5-dzkt4\" (UID: \"83e95392-2d00-4cf2-a34d-7a93508e9e84\") " pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.831825 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbksp\" (UniqueName: \"kubernetes.io/projected/83e95392-2d00-4cf2-a34d-7a93508e9e84-kube-api-access-jbksp\") pod \"route-controller-manager-79c9b46b5-dzkt4\" (UID: \"83e95392-2d00-4cf2-a34d-7a93508e9e84\") " pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.833376 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83e95392-2d00-4cf2-a34d-7a93508e9e84-client-ca\") pod \"route-controller-manager-79c9b46b5-dzkt4\" (UID: \"83e95392-2d00-4cf2-a34d-7a93508e9e84\") " 
pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.833865 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e95392-2d00-4cf2-a34d-7a93508e9e84-config\") pod \"route-controller-manager-79c9b46b5-dzkt4\" (UID: \"83e95392-2d00-4cf2-a34d-7a93508e9e84\") " pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.835341 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83e95392-2d00-4cf2-a34d-7a93508e9e84-serving-cert\") pod \"route-controller-manager-79c9b46b5-dzkt4\" (UID: \"83e95392-2d00-4cf2-a34d-7a93508e9e84\") " pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.848163 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbksp\" (UniqueName: \"kubernetes.io/projected/83e95392-2d00-4cf2-a34d-7a93508e9e84-kube-api-access-jbksp\") pod \"route-controller-manager-79c9b46b5-dzkt4\" (UID: \"83e95392-2d00-4cf2-a34d-7a93508e9e84\") " pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" Mar 13 09:16:28 crc kubenswrapper[4841]: I0313 09:16:28.999337 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" Mar 13 09:16:30 crc kubenswrapper[4841]: I0313 09:16:30.001401 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55eac427-cd92-4af6-bb8d-be8a54bbb69f" path="/var/lib/kubelet/pods/55eac427-cd92-4af6-bb8d-be8a54bbb69f/volumes" Mar 13 09:16:30 crc kubenswrapper[4841]: I0313 09:16:30.002850 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78895b4a-b94c-4138-a2eb-14066f4c08e2" path="/var/lib/kubelet/pods/78895b4a-b94c-4138-a2eb-14066f4c08e2/volumes" Mar 13 09:16:30 crc kubenswrapper[4841]: I0313 09:16:30.648175 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:16:32 crc kubenswrapper[4841]: I0313 09:16:32.218914 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5t7sg"] Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.408635 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4"] Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.409295 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.413215 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.413617 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.413728 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.413884 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.414143 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.415369 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.418930 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4"] Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.425163 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.511245 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a05c09f-d210-4def-9688-ab16f9b7c57b-proxy-ca-bundles\") pod \"controller-manager-6bcc56d7fb-98gd4\" (UID: \"4a05c09f-d210-4def-9688-ab16f9b7c57b\") " 
pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.511364 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a05c09f-d210-4def-9688-ab16f9b7c57b-serving-cert\") pod \"controller-manager-6bcc56d7fb-98gd4\" (UID: \"4a05c09f-d210-4def-9688-ab16f9b7c57b\") " pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.511402 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a05c09f-d210-4def-9688-ab16f9b7c57b-client-ca\") pod \"controller-manager-6bcc56d7fb-98gd4\" (UID: \"4a05c09f-d210-4def-9688-ab16f9b7c57b\") " pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.511435 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a05c09f-d210-4def-9688-ab16f9b7c57b-config\") pod \"controller-manager-6bcc56d7fb-98gd4\" (UID: \"4a05c09f-d210-4def-9688-ab16f9b7c57b\") " pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.511451 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r82gk\" (UniqueName: \"kubernetes.io/projected/4a05c09f-d210-4def-9688-ab16f9b7c57b-kube-api-access-r82gk\") pod \"controller-manager-6bcc56d7fb-98gd4\" (UID: \"4a05c09f-d210-4def-9688-ab16f9b7c57b\") " pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.612160 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/4a05c09f-d210-4def-9688-ab16f9b7c57b-proxy-ca-bundles\") pod \"controller-manager-6bcc56d7fb-98gd4\" (UID: \"4a05c09f-d210-4def-9688-ab16f9b7c57b\") " pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.612228 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a05c09f-d210-4def-9688-ab16f9b7c57b-serving-cert\") pod \"controller-manager-6bcc56d7fb-98gd4\" (UID: \"4a05c09f-d210-4def-9688-ab16f9b7c57b\") " pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.612243 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a05c09f-d210-4def-9688-ab16f9b7c57b-client-ca\") pod \"controller-manager-6bcc56d7fb-98gd4\" (UID: \"4a05c09f-d210-4def-9688-ab16f9b7c57b\") " pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.612298 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a05c09f-d210-4def-9688-ab16f9b7c57b-config\") pod \"controller-manager-6bcc56d7fb-98gd4\" (UID: \"4a05c09f-d210-4def-9688-ab16f9b7c57b\") " pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.612324 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r82gk\" (UniqueName: \"kubernetes.io/projected/4a05c09f-d210-4def-9688-ab16f9b7c57b-kube-api-access-r82gk\") pod \"controller-manager-6bcc56d7fb-98gd4\" (UID: \"4a05c09f-d210-4def-9688-ab16f9b7c57b\") " pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.614303 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a05c09f-d210-4def-9688-ab16f9b7c57b-proxy-ca-bundles\") pod \"controller-manager-6bcc56d7fb-98gd4\" (UID: \"4a05c09f-d210-4def-9688-ab16f9b7c57b\") " pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.616582 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a05c09f-d210-4def-9688-ab16f9b7c57b-client-ca\") pod \"controller-manager-6bcc56d7fb-98gd4\" (UID: \"4a05c09f-d210-4def-9688-ab16f9b7c57b\") " pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.616619 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a05c09f-d210-4def-9688-ab16f9b7c57b-config\") pod \"controller-manager-6bcc56d7fb-98gd4\" (UID: \"4a05c09f-d210-4def-9688-ab16f9b7c57b\") " pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.620167 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a05c09f-d210-4def-9688-ab16f9b7c57b-serving-cert\") pod \"controller-manager-6bcc56d7fb-98gd4\" (UID: \"4a05c09f-d210-4def-9688-ab16f9b7c57b\") " pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.627687 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r82gk\" (UniqueName: \"kubernetes.io/projected/4a05c09f-d210-4def-9688-ab16f9b7c57b-kube-api-access-r82gk\") pod \"controller-manager-6bcc56d7fb-98gd4\" (UID: \"4a05c09f-d210-4def-9688-ab16f9b7c57b\") " pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" Mar 13 
09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.713826 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8khp" event={"ID":"13abb343-a494-425b-9b86-ea0d41df98a5","Type":"ContainerStarted","Data":"a12f95e032e28ed064f789fdf64b382d0d48c8b49e324b60465ab4ce293c4a64"} Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.714914 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4aed736e91a1ebe4d4816b06663fc78796fe32ae518686abcca1aa158f7a36cb"} Mar 13 09:16:33 crc kubenswrapper[4841]: W0313 09:16:33.716159 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-55be68c72580fbb7f893b1785b8bb062c600c9c696d669caae7cc30ccf5d31cb WatchSource:0}: Error finding container 55be68c72580fbb7f893b1785b8bb062c600c9c696d669caae7cc30ccf5d31cb: Status 404 returned error can't find the container with id 55be68c72580fbb7f893b1785b8bb062c600c9c696d669caae7cc30ccf5d31cb Mar 13 09:16:33 crc kubenswrapper[4841]: W0313 09:16:33.721109 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea07a392_2a1f_4bae_bb67_db7cd421c1e1.slice/crio-3982093fe178e32de58c02174f19cdf68bc3e2b3dfd66c43a911bab40434b13f WatchSource:0}: Error finding container 3982093fe178e32de58c02174f19cdf68bc3e2b3dfd66c43a911bab40434b13f: Status 404 returned error can't find the container with id 3982093fe178e32de58c02174f19cdf68bc3e2b3dfd66c43a911bab40434b13f Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.734004 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" Mar 13 09:16:33 crc kubenswrapper[4841]: I0313 09:16:33.738592 4841 scope.go:117] "RemoveContainer" containerID="0918dbc19a22c179e6c0b3f7ea75596c2206c255d9ed43bc42418794d35b53c6" Mar 13 09:16:33 crc kubenswrapper[4841]: W0313 09:16:33.819555 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-a8f8841454df13d5e1558fe4cd19f54d0b24740bbf6fcef53ed07afb73325ed6 WatchSource:0}: Error finding container a8f8841454df13d5e1558fe4cd19f54d0b24740bbf6fcef53ed07afb73325ed6: Status 404 returned error can't find the container with id a8f8841454df13d5e1558fe4cd19f54d0b24740bbf6fcef53ed07afb73325ed6 Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.052553 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w95m9"] Mar 13 09:16:34 crc kubenswrapper[4841]: W0313 09:16:34.071949 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67d2682a_a65f_42e2_875a_b4247bfff054.slice/crio-11acf8002510b51831b1226d859b327a6f9e09fc2ece211904045a6ef45c315d WatchSource:0}: Error finding container 11acf8002510b51831b1226d859b327a6f9e09fc2ece211904045a6ef45c315d: Status 404 returned error can't find the container with id 11acf8002510b51831b1226d859b327a6f9e09fc2ece211904045a6ef45c315d Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.206077 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4"] Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.328459 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4"] Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.407551 4841 
patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.407609 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.722109 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d03eda9ef67b5c56e257ddd06db36f4c714538b3f1180e302f8a6b40a9ce3d78"} Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.724468 4841 generic.go:334] "Generic (PLEG): container finished" podID="7f7766c0-82b7-4ee0-878d-47c5b0217e4d" containerID="fcdcf258036518ea9ef629f065db21629d1db17b0fbaf10efa2bc26f7636daca" exitCode=0 Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.724563 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlqmq" event={"ID":"7f7766c0-82b7-4ee0-878d-47c5b0217e4d","Type":"ContainerDied","Data":"fcdcf258036518ea9ef629f065db21629d1db17b0fbaf10efa2bc26f7636daca"} Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.726133 4841 generic.go:334] "Generic (PLEG): container finished" podID="517b279c-79ee-438c-83c2-ae5cd25848fc" containerID="b03afa1fb026276410b7093396e3630f13d294be9d63d4c49dc7465f30a2f95a" exitCode=0 Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.726179 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-d4lvs" event={"ID":"517b279c-79ee-438c-83c2-ae5cd25848fc","Type":"ContainerDied","Data":"b03afa1fb026276410b7093396e3630f13d294be9d63d4c49dc7465f30a2f95a"} Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.730302 4841 generic.go:334] "Generic (PLEG): container finished" podID="13abb343-a494-425b-9b86-ea0d41df98a5" containerID="71692dbaa05668837496c12a68716a87feb2c96fc6d5009cfb7e34b6631db64a" exitCode=0 Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.730365 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8khp" event={"ID":"13abb343-a494-425b-9b86-ea0d41df98a5","Type":"ContainerDied","Data":"71692dbaa05668837496c12a68716a87feb2c96fc6d5009cfb7e34b6631db64a"} Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.731951 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" event={"ID":"ea07a392-2a1f-4bae-bb67-db7cd421c1e1","Type":"ContainerStarted","Data":"be4aae8105f9ed67abd3f68c47c72f068c335ced623d5e2856e2833788196bda"} Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.731983 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" event={"ID":"ea07a392-2a1f-4bae-bb67-db7cd421c1e1","Type":"ContainerStarted","Data":"3982093fe178e32de58c02174f19cdf68bc3e2b3dfd66c43a911bab40434b13f"} Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.733224 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" event={"ID":"4a05c09f-d210-4def-9688-ab16f9b7c57b","Type":"ContainerStarted","Data":"e40f5f5364e57a9a48df1c832062d0120cc27fd1c51e4fcb3eea6ec62e18ed77"} Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.733243 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" 
event={"ID":"4a05c09f-d210-4def-9688-ab16f9b7c57b","Type":"ContainerStarted","Data":"2a6ce9086bf80bbbc207a12610dbda6a0f86e2f1f129e740159a29246eb9ce16"} Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.733939 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.739802 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.741209 4841 generic.go:334] "Generic (PLEG): container finished" podID="cbd9953a-618b-4cd2-806b-c01e07c40fc2" containerID="578afedc43679c1cbb948b5e1ea531f5c5a160ac649e1bb7c71b933c9fb0febb" exitCode=0 Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.741289 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbx6r" event={"ID":"cbd9953a-618b-4cd2-806b-c01e07c40fc2","Type":"ContainerDied","Data":"578afedc43679c1cbb948b5e1ea531f5c5a160ac649e1bb7c71b933c9fb0febb"} Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.742442 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4d34f8d3f1bee5480730b56ac8b5e4b1d807cec7000bb1de9cbf53e82e9bd617"} Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.742471 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"55be68c72580fbb7f893b1785b8bb062c600c9c696d669caae7cc30ccf5d31cb"} Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.742577 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.744352 4841 generic.go:334] "Generic (PLEG): container finished" podID="f9de2815-2636-4e04-adf5-a7efd285781f" containerID="5e3fea3f90a51f679f6cf953c1f6d8dd84315c7647500d72658c40876a0d7a63" exitCode=0 Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.744501 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzfct" event={"ID":"f9de2815-2636-4e04-adf5-a7efd285781f","Type":"ContainerDied","Data":"5e3fea3f90a51f679f6cf953c1f6d8dd84315c7647500d72658c40876a0d7a63"} Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.747687 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" event={"ID":"83e95392-2d00-4cf2-a34d-7a93508e9e84","Type":"ContainerStarted","Data":"68fd0af67b1ac9dad337aa50367e6ec87cfda36cf7609ee36316248b6637ce0c"} Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.747730 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" event={"ID":"83e95392-2d00-4cf2-a34d-7a93508e9e84","Type":"ContainerStarted","Data":"ff0f932267bc7c8642868a15e2631bb777fdd1130670871b59210cf1645c8582"} Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.747749 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.755049 4841 generic.go:334] "Generic (PLEG): container finished" podID="81521243-f7e1-4360-9b12-8047988a69dd" containerID="9ed338d23007d7b03a788eedfcb08c7bc25a687ef3b68048f7622d6e6576bdcb" exitCode=0 Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.755125 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6qx4" 
event={"ID":"81521243-f7e1-4360-9b12-8047988a69dd","Type":"ContainerDied","Data":"9ed338d23007d7b03a788eedfcb08c7bc25a687ef3b68048f7622d6e6576bdcb"} Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.757923 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"767e4e362ea73bef693dc4dc2ee6fd98533493785d8422feb834f270da04eb4f"} Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.757966 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a8f8841454df13d5e1558fe4cd19f54d0b24740bbf6fcef53ed07afb73325ed6"} Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.762010 4841 generic.go:334] "Generic (PLEG): container finished" podID="6e22cd83-5a44-4048-a618-4c06f3550ace" containerID="874212a9cf144263de3415092f0d925e0a8f4275de354040703271034e73b068" exitCode=0 Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.762074 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4gtt4" event={"ID":"6e22cd83-5a44-4048-a618-4c06f3550ace","Type":"ContainerDied","Data":"874212a9cf144263de3415092f0d925e0a8f4275de354040703271034e73b068"} Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.771851 4841 generic.go:334] "Generic (PLEG): container finished" podID="67d2682a-a65f-42e2-875a-b4247bfff054" containerID="6b81fe8aff61476cc189ebdf332e56f5d10bb7a0636ed98208d8d5e24c0c6279" exitCode=0 Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.772323 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w95m9" event={"ID":"67d2682a-a65f-42e2-875a-b4247bfff054","Type":"ContainerDied","Data":"6b81fe8aff61476cc189ebdf332e56f5d10bb7a0636ed98208d8d5e24c0c6279"} Mar 13 
09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.772418 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w95m9" event={"ID":"67d2682a-a65f-42e2-875a-b4247bfff054","Type":"ContainerStarted","Data":"11acf8002510b51831b1226d859b327a6f9e09fc2ece211904045a6ef45c315d"} Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.820811 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.833506 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" podStartSLOduration=10.833488349 podStartE2EDuration="10.833488349s" podCreationTimestamp="2026-03-13 09:16:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:34.832104627 +0000 UTC m=+277.562004818" watchObservedRunningTime="2026-03-13 09:16:34.833488349 +0000 UTC m=+277.563388540" Mar 13 09:16:34 crc kubenswrapper[4841]: I0313 09:16:34.988899 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" podStartSLOduration=10.988874584 podStartE2EDuration="10.988874584s" podCreationTimestamp="2026-03-13 09:16:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:34.984911773 +0000 UTC m=+277.714811974" watchObservedRunningTime="2026-03-13 09:16:34.988874584 +0000 UTC m=+277.718774775" Mar 13 09:16:35 crc kubenswrapper[4841]: I0313 09:16:35.735146 4841 ???:1] "http: TLS handshake error from 192.168.126.11:34194: no serving certificate available for the kubelet" Mar 13 09:16:35 crc kubenswrapper[4841]: I0313 09:16:35.785460 4841 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlqmq" event={"ID":"7f7766c0-82b7-4ee0-878d-47c5b0217e4d","Type":"ContainerStarted","Data":"8f59fcb21ea841ba8d62cc2a142bb209664c6ea93ed74da00903a982db4aab0f"} Mar 13 09:16:35 crc kubenswrapper[4841]: I0313 09:16:35.787799 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4lvs" event={"ID":"517b279c-79ee-438c-83c2-ae5cd25848fc","Type":"ContainerStarted","Data":"369db51f0191c31807cc317cb997a821a7a7319e8af980041eabc066f1c3abff"} Mar 13 09:16:35 crc kubenswrapper[4841]: I0313 09:16:35.792377 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5t7sg" event={"ID":"ea07a392-2a1f-4bae-bb67-db7cd421c1e1","Type":"ContainerStarted","Data":"ec1857164f07a6a3866eeecbc5f7f088d74af9767181108d9b06a8474c48552a"} Mar 13 09:16:35 crc kubenswrapper[4841]: I0313 09:16:35.796875 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbx6r" event={"ID":"cbd9953a-618b-4cd2-806b-c01e07c40fc2","Type":"ContainerStarted","Data":"7081422dd20bdd7244829e876963f21903d1e9db1589b8ec5a050cbea54cf15f"} Mar 13 09:16:35 crc kubenswrapper[4841]: I0313 09:16:35.800294 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6qx4" event={"ID":"81521243-f7e1-4360-9b12-8047988a69dd","Type":"ContainerStarted","Data":"f8b2d47f9b4c37d297302ce64c0f0c86db444f63756040c4f6a870113a679b28"} Mar 13 09:16:35 crc kubenswrapper[4841]: I0313 09:16:35.827818 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d4lvs" podStartSLOduration=2.863354507 podStartE2EDuration="27.827802774s" podCreationTimestamp="2026-03-13 09:16:08 +0000 UTC" firstStartedPulling="2026-03-13 09:16:10.396381307 +0000 UTC m=+253.126281498" lastFinishedPulling="2026-03-13 09:16:35.360829574 +0000 UTC 
m=+278.090729765" observedRunningTime="2026-03-13 09:16:35.826766253 +0000 UTC m=+278.556666444" watchObservedRunningTime="2026-03-13 09:16:35.827802774 +0000 UTC m=+278.557702965" Mar 13 09:16:35 crc kubenswrapper[4841]: I0313 09:16:35.830011 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wlqmq" podStartSLOduration=2.645316423 podStartE2EDuration="27.830003552s" podCreationTimestamp="2026-03-13 09:16:08 +0000 UTC" firstStartedPulling="2026-03-13 09:16:10.3916928 +0000 UTC m=+253.121592991" lastFinishedPulling="2026-03-13 09:16:35.576379939 +0000 UTC m=+278.306280120" observedRunningTime="2026-03-13 09:16:35.804095623 +0000 UTC m=+278.533995824" watchObservedRunningTime="2026-03-13 09:16:35.830003552 +0000 UTC m=+278.559903743" Mar 13 09:16:35 crc kubenswrapper[4841]: I0313 09:16:35.841589 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sbx6r" podStartSLOduration=2.7440424 podStartE2EDuration="27.841576543s" podCreationTimestamp="2026-03-13 09:16:08 +0000 UTC" firstStartedPulling="2026-03-13 09:16:10.385842107 +0000 UTC m=+253.115742298" lastFinishedPulling="2026-03-13 09:16:35.48337625 +0000 UTC m=+278.213276441" observedRunningTime="2026-03-13 09:16:35.839064066 +0000 UTC m=+278.568964257" watchObservedRunningTime="2026-03-13 09:16:35.841576543 +0000 UTC m=+278.571476734" Mar 13 09:16:35 crc kubenswrapper[4841]: I0313 09:16:35.853008 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5t7sg" podStartSLOduration=208.85298935 podStartE2EDuration="3m28.85298935s" podCreationTimestamp="2026-03-13 09:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:35.851539846 +0000 UTC m=+278.581440037" watchObservedRunningTime="2026-03-13 09:16:35.85298935 +0000 UTC 
m=+278.582889541" Mar 13 09:16:35 crc kubenswrapper[4841]: I0313 09:16:35.874405 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l6qx4" podStartSLOduration=2.873713945 podStartE2EDuration="27.874388361s" podCreationTimestamp="2026-03-13 09:16:08 +0000 UTC" firstStartedPulling="2026-03-13 09:16:10.388727757 +0000 UTC m=+253.118627948" lastFinishedPulling="2026-03-13 09:16:35.389402173 +0000 UTC m=+278.119302364" observedRunningTime="2026-03-13 09:16:35.873479313 +0000 UTC m=+278.603379514" watchObservedRunningTime="2026-03-13 09:16:35.874388361 +0000 UTC m=+278.604288552" Mar 13 09:16:36 crc kubenswrapper[4841]: I0313 09:16:36.809469 4841 generic.go:334] "Generic (PLEG): container finished" podID="f9de2815-2636-4e04-adf5-a7efd285781f" containerID="6d99c66c31c7dc7895fa48b732ef6f73a99aae4ebfd586a43e402c50cd5aca81" exitCode=0 Mar 13 09:16:36 crc kubenswrapper[4841]: I0313 09:16:36.809526 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzfct" event={"ID":"f9de2815-2636-4e04-adf5-a7efd285781f","Type":"ContainerDied","Data":"6d99c66c31c7dc7895fa48b732ef6f73a99aae4ebfd586a43e402c50cd5aca81"} Mar 13 09:16:36 crc kubenswrapper[4841]: I0313 09:16:36.815528 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4gtt4" event={"ID":"6e22cd83-5a44-4048-a618-4c06f3550ace","Type":"ContainerStarted","Data":"f01c0a6f121ad398d9cb4ed9b110cd0db457df5afe4f0bef1db46c6f8b4a1159"} Mar 13 09:16:36 crc kubenswrapper[4841]: I0313 09:16:36.850782 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4gtt4" podStartSLOduration=9.081736592 podStartE2EDuration="26.850765881s" podCreationTimestamp="2026-03-13 09:16:10 +0000 UTC" firstStartedPulling="2026-03-13 09:16:18.722033033 +0000 UTC m=+261.451933244" lastFinishedPulling="2026-03-13 09:16:36.491062342 +0000 UTC 
m=+279.220962533" observedRunningTime="2026-03-13 09:16:36.850476612 +0000 UTC m=+279.580376823" watchObservedRunningTime="2026-03-13 09:16:36.850765881 +0000 UTC m=+279.580666072" Mar 13 09:16:37 crc kubenswrapper[4841]: I0313 09:16:37.837342 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzfct" event={"ID":"f9de2815-2636-4e04-adf5-a7efd285781f","Type":"ContainerStarted","Data":"17a997825073552afb564eba1681dcbeed5e33a7eeb805e2e128ff05163bf97a"} Mar 13 09:16:37 crc kubenswrapper[4841]: I0313 09:16:37.857962 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qzfct" podStartSLOduration=25.364200846 podStartE2EDuration="27.857947177s" podCreationTimestamp="2026-03-13 09:16:10 +0000 UTC" firstStartedPulling="2026-03-13 09:16:34.747547615 +0000 UTC m=+277.477447806" lastFinishedPulling="2026-03-13 09:16:37.241293946 +0000 UTC m=+279.971194137" observedRunningTime="2026-03-13 09:16:37.85740577 +0000 UTC m=+280.587305961" watchObservedRunningTime="2026-03-13 09:16:37.857947177 +0000 UTC m=+280.587847368" Mar 13 09:16:38 crc kubenswrapper[4841]: I0313 09:16:38.546771 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sbx6r" Mar 13 09:16:38 crc kubenswrapper[4841]: I0313 09:16:38.547084 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sbx6r" Mar 13 09:16:38 crc kubenswrapper[4841]: I0313 09:16:38.704247 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sbx6r" Mar 13 09:16:38 crc kubenswrapper[4841]: I0313 09:16:38.737611 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l6qx4" Mar 13 09:16:38 crc kubenswrapper[4841]: I0313 09:16:38.737652 4841 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l6qx4" Mar 13 09:16:38 crc kubenswrapper[4841]: I0313 09:16:38.774696 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l6qx4" Mar 13 09:16:38 crc kubenswrapper[4841]: I0313 09:16:38.951733 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d4lvs" Mar 13 09:16:38 crc kubenswrapper[4841]: I0313 09:16:38.951774 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d4lvs" Mar 13 09:16:39 crc kubenswrapper[4841]: I0313 09:16:39.005074 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d4lvs" Mar 13 09:16:39 crc kubenswrapper[4841]: I0313 09:16:39.239965 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wlqmq" Mar 13 09:16:39 crc kubenswrapper[4841]: I0313 09:16:39.240014 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wlqmq" Mar 13 09:16:39 crc kubenswrapper[4841]: I0313 09:16:39.295391 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wlqmq" Mar 13 09:16:39 crc kubenswrapper[4841]: I0313 09:16:39.568553 4841 csr.go:261] certificate signing request csr-hpx5m is approved, waiting to be issued Mar 13 09:16:39 crc kubenswrapper[4841]: I0313 09:16:39.578972 4841 csr.go:257] certificate signing request csr-hpx5m is issued Mar 13 09:16:39 crc kubenswrapper[4841]: I0313 09:16:39.851169 4841 generic.go:334] "Generic (PLEG): container finished" podID="344d8ef8-1693-4dcc-b09d-dc9fc7c041cc" containerID="a4578955e23a6cbfb66acd450dda9a49c3a4c8ee399f72496ce245dadaa5980d" exitCode=0 Mar 13 09:16:39 crc kubenswrapper[4841]: 
I0313 09:16:39.851284 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556554-2dgrd" event={"ID":"344d8ef8-1693-4dcc-b09d-dc9fc7c041cc","Type":"ContainerDied","Data":"a4578955e23a6cbfb66acd450dda9a49c3a4c8ee399f72496ce245dadaa5980d"} Mar 13 09:16:40 crc kubenswrapper[4841]: I0313 09:16:40.580576 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-05 15:57:51.835643089 +0000 UTC Mar 13 09:16:40 crc kubenswrapper[4841]: I0313 09:16:40.580661 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7158h41m11.254985578s for next certificate rotation Mar 13 09:16:40 crc kubenswrapper[4841]: I0313 09:16:40.779200 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4gtt4" Mar 13 09:16:40 crc kubenswrapper[4841]: I0313 09:16:40.779250 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4gtt4" Mar 13 09:16:40 crc kubenswrapper[4841]: I0313 09:16:40.822789 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4gtt4" Mar 13 09:16:41 crc kubenswrapper[4841]: I0313 09:16:41.168803 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qzfct" Mar 13 09:16:41 crc kubenswrapper[4841]: I0313 09:16:41.168844 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qzfct" Mar 13 09:16:41 crc kubenswrapper[4841]: I0313 09:16:41.200476 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556554-2dgrd" Mar 13 09:16:41 crc kubenswrapper[4841]: I0313 09:16:41.217694 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qzfct" Mar 13 09:16:41 crc kubenswrapper[4841]: I0313 09:16:41.228641 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9wrv\" (UniqueName: \"kubernetes.io/projected/344d8ef8-1693-4dcc-b09d-dc9fc7c041cc-kube-api-access-q9wrv\") pod \"344d8ef8-1693-4dcc-b09d-dc9fc7c041cc\" (UID: \"344d8ef8-1693-4dcc-b09d-dc9fc7c041cc\") " Mar 13 09:16:41 crc kubenswrapper[4841]: I0313 09:16:41.238100 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/344d8ef8-1693-4dcc-b09d-dc9fc7c041cc-kube-api-access-q9wrv" (OuterVolumeSpecName: "kube-api-access-q9wrv") pod "344d8ef8-1693-4dcc-b09d-dc9fc7c041cc" (UID: "344d8ef8-1693-4dcc-b09d-dc9fc7c041cc"). InnerVolumeSpecName "kube-api-access-q9wrv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:16:41 crc kubenswrapper[4841]: I0313 09:16:41.330360 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9wrv\" (UniqueName: \"kubernetes.io/projected/344d8ef8-1693-4dcc-b09d-dc9fc7c041cc-kube-api-access-q9wrv\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:41 crc kubenswrapper[4841]: I0313 09:16:41.581311 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-27 15:57:59.516158827 +0000 UTC Mar 13 09:16:41 crc kubenswrapper[4841]: I0313 09:16:41.581702 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6222h41m17.934460667s for next certificate rotation Mar 13 09:16:41 crc kubenswrapper[4841]: I0313 09:16:41.864730 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556554-2dgrd" event={"ID":"344d8ef8-1693-4dcc-b09d-dc9fc7c041cc","Type":"ContainerDied","Data":"6bdd5ab6e513bd07ea184c77959d7010dcf67516ee7e4f829ec7f70fe4d1c99f"} Mar 13 09:16:41 crc kubenswrapper[4841]: I0313 09:16:41.864779 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bdd5ab6e513bd07ea184c77959d7010dcf67516ee7e4f829ec7f70fe4d1c99f" Mar 13 09:16:41 crc kubenswrapper[4841]: I0313 09:16:41.864752 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556554-2dgrd" Mar 13 09:16:42 crc kubenswrapper[4841]: I0313 09:16:42.205614 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xxwmv" Mar 13 09:16:43 crc kubenswrapper[4841]: I0313 09:16:43.758870 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 09:16:43 crc kubenswrapper[4841]: E0313 09:16:43.759071 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="344d8ef8-1693-4dcc-b09d-dc9fc7c041cc" containerName="oc" Mar 13 09:16:43 crc kubenswrapper[4841]: I0313 09:16:43.759082 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="344d8ef8-1693-4dcc-b09d-dc9fc7c041cc" containerName="oc" Mar 13 09:16:43 crc kubenswrapper[4841]: I0313 09:16:43.759182 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="344d8ef8-1693-4dcc-b09d-dc9fc7c041cc" containerName="oc" Mar 13 09:16:43 crc kubenswrapper[4841]: I0313 09:16:43.760961 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 09:16:43 crc kubenswrapper[4841]: I0313 09:16:43.762997 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 09:16:43 crc kubenswrapper[4841]: I0313 09:16:43.763560 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 13 09:16:43 crc kubenswrapper[4841]: I0313 09:16:43.765402 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 09:16:43 crc kubenswrapper[4841]: I0313 09:16:43.899706 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae461fe9-c072-4b68-bebc-2070e622c5fe-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ae461fe9-c072-4b68-bebc-2070e622c5fe\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 09:16:43 crc kubenswrapper[4841]: I0313 09:16:43.899928 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae461fe9-c072-4b68-bebc-2070e622c5fe-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ae461fe9-c072-4b68-bebc-2070e622c5fe\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 09:16:44 crc kubenswrapper[4841]: I0313 09:16:44.001620 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae461fe9-c072-4b68-bebc-2070e622c5fe-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ae461fe9-c072-4b68-bebc-2070e622c5fe\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 09:16:44 crc kubenswrapper[4841]: I0313 09:16:44.001684 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/ae461fe9-c072-4b68-bebc-2070e622c5fe-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ae461fe9-c072-4b68-bebc-2070e622c5fe\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 09:16:44 crc kubenswrapper[4841]: I0313 09:16:44.001761 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae461fe9-c072-4b68-bebc-2070e622c5fe-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ae461fe9-c072-4b68-bebc-2070e622c5fe\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 09:16:44 crc kubenswrapper[4841]: I0313 09:16:44.019304 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae461fe9-c072-4b68-bebc-2070e622c5fe-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ae461fe9-c072-4b68-bebc-2070e622c5fe\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 09:16:44 crc kubenswrapper[4841]: I0313 09:16:44.081996 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 09:16:44 crc kubenswrapper[4841]: I0313 09:16:44.673283 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4"] Mar 13 09:16:44 crc kubenswrapper[4841]: I0313 09:16:44.673506 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" podUID="4a05c09f-d210-4def-9688-ab16f9b7c57b" containerName="controller-manager" containerID="cri-o://e40f5f5364e57a9a48df1c832062d0120cc27fd1c51e4fcb3eea6ec62e18ed77" gracePeriod=30 Mar 13 09:16:44 crc kubenswrapper[4841]: I0313 09:16:44.779484 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4"] Mar 13 09:16:44 crc kubenswrapper[4841]: I0313 09:16:44.779725 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" podUID="83e95392-2d00-4cf2-a34d-7a93508e9e84" containerName="route-controller-manager" containerID="cri-o://68fd0af67b1ac9dad337aa50367e6ec87cfda36cf7609ee36316248b6637ce0c" gracePeriod=30 Mar 13 09:16:44 crc kubenswrapper[4841]: I0313 09:16:44.881297 4841 generic.go:334] "Generic (PLEG): container finished" podID="4a05c09f-d210-4def-9688-ab16f9b7c57b" containerID="e40f5f5364e57a9a48df1c832062d0120cc27fd1c51e4fcb3eea6ec62e18ed77" exitCode=0 Mar 13 09:16:44 crc kubenswrapper[4841]: I0313 09:16:44.881347 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" event={"ID":"4a05c09f-d210-4def-9688-ab16f9b7c57b","Type":"ContainerDied","Data":"e40f5f5364e57a9a48df1c832062d0120cc27fd1c51e4fcb3eea6ec62e18ed77"} Mar 13 09:16:45 crc kubenswrapper[4841]: I0313 09:16:45.887000 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="83e95392-2d00-4cf2-a34d-7a93508e9e84" containerID="68fd0af67b1ac9dad337aa50367e6ec87cfda36cf7609ee36316248b6637ce0c" exitCode=0 Mar 13 09:16:45 crc kubenswrapper[4841]: I0313 09:16:45.887046 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" event={"ID":"83e95392-2d00-4cf2-a34d-7a93508e9e84","Type":"ContainerDied","Data":"68fd0af67b1ac9dad337aa50367e6ec87cfda36cf7609ee36316248b6637ce0c"} Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.589439 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.621856 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf"] Mar 13 09:16:46 crc kubenswrapper[4841]: E0313 09:16:46.622389 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83e95392-2d00-4cf2-a34d-7a93508e9e84" containerName="route-controller-manager" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.622404 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e95392-2d00-4cf2-a34d-7a93508e9e84" containerName="route-controller-manager" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.623560 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="83e95392-2d00-4cf2-a34d-7a93508e9e84" containerName="route-controller-manager" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.625412 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.631856 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf"] Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.633751 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83e95392-2d00-4cf2-a34d-7a93508e9e84-client-ca\") pod \"83e95392-2d00-4cf2-a34d-7a93508e9e84\" (UID: \"83e95392-2d00-4cf2-a34d-7a93508e9e84\") " Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.633865 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbksp\" (UniqueName: \"kubernetes.io/projected/83e95392-2d00-4cf2-a34d-7a93508e9e84-kube-api-access-jbksp\") pod \"83e95392-2d00-4cf2-a34d-7a93508e9e84\" (UID: \"83e95392-2d00-4cf2-a34d-7a93508e9e84\") " Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.635471 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83e95392-2d00-4cf2-a34d-7a93508e9e84-client-ca" (OuterVolumeSpecName: "client-ca") pod "83e95392-2d00-4cf2-a34d-7a93508e9e84" (UID: "83e95392-2d00-4cf2-a34d-7a93508e9e84"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.643515 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83e95392-2d00-4cf2-a34d-7a93508e9e84-kube-api-access-jbksp" (OuterVolumeSpecName: "kube-api-access-jbksp") pod "83e95392-2d00-4cf2-a34d-7a93508e9e84" (UID: "83e95392-2d00-4cf2-a34d-7a93508e9e84"). InnerVolumeSpecName "kube-api-access-jbksp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.734870 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83e95392-2d00-4cf2-a34d-7a93508e9e84-serving-cert\") pod \"83e95392-2d00-4cf2-a34d-7a93508e9e84\" (UID: \"83e95392-2d00-4cf2-a34d-7a93508e9e84\") " Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.734941 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e95392-2d00-4cf2-a34d-7a93508e9e84-config\") pod \"83e95392-2d00-4cf2-a34d-7a93508e9e84\" (UID: \"83e95392-2d00-4cf2-a34d-7a93508e9e84\") " Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.736768 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebfafea7-e749-4675-ac00-47a35d860c43-client-ca\") pod \"route-controller-manager-6c9657cf58-rnhgf\" (UID: \"ebfafea7-e749-4675-ac00-47a35d860c43\") " pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.736830 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5p98\" (UniqueName: \"kubernetes.io/projected/ebfafea7-e749-4675-ac00-47a35d860c43-kube-api-access-m5p98\") pod \"route-controller-manager-6c9657cf58-rnhgf\" (UID: \"ebfafea7-e749-4675-ac00-47a35d860c43\") " pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.736907 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebfafea7-e749-4675-ac00-47a35d860c43-serving-cert\") pod \"route-controller-manager-6c9657cf58-rnhgf\" (UID: 
\"ebfafea7-e749-4675-ac00-47a35d860c43\") " pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.736966 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebfafea7-e749-4675-ac00-47a35d860c43-config\") pod \"route-controller-manager-6c9657cf58-rnhgf\" (UID: \"ebfafea7-e749-4675-ac00-47a35d860c43\") " pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.737043 4841 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83e95392-2d00-4cf2-a34d-7a93508e9e84-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.737071 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbksp\" (UniqueName: \"kubernetes.io/projected/83e95392-2d00-4cf2-a34d-7a93508e9e84-kube-api-access-jbksp\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.737632 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83e95392-2d00-4cf2-a34d-7a93508e9e84-config" (OuterVolumeSpecName: "config") pod "83e95392-2d00-4cf2-a34d-7a93508e9e84" (UID: "83e95392-2d00-4cf2-a34d-7a93508e9e84"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.752394 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83e95392-2d00-4cf2-a34d-7a93508e9e84-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "83e95392-2d00-4cf2-a34d-7a93508e9e84" (UID: "83e95392-2d00-4cf2-a34d-7a93508e9e84"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.838040 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebfafea7-e749-4675-ac00-47a35d860c43-client-ca\") pod \"route-controller-manager-6c9657cf58-rnhgf\" (UID: \"ebfafea7-e749-4675-ac00-47a35d860c43\") " pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.838096 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5p98\" (UniqueName: \"kubernetes.io/projected/ebfafea7-e749-4675-ac00-47a35d860c43-kube-api-access-m5p98\") pod \"route-controller-manager-6c9657cf58-rnhgf\" (UID: \"ebfafea7-e749-4675-ac00-47a35d860c43\") " pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.838138 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebfafea7-e749-4675-ac00-47a35d860c43-serving-cert\") pod \"route-controller-manager-6c9657cf58-rnhgf\" (UID: \"ebfafea7-e749-4675-ac00-47a35d860c43\") " pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.838165 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebfafea7-e749-4675-ac00-47a35d860c43-config\") pod \"route-controller-manager-6c9657cf58-rnhgf\" (UID: \"ebfafea7-e749-4675-ac00-47a35d860c43\") " pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.838219 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/83e95392-2d00-4cf2-a34d-7a93508e9e84-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.838231 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e95392-2d00-4cf2-a34d-7a93508e9e84-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.839198 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebfafea7-e749-4675-ac00-47a35d860c43-config\") pod \"route-controller-manager-6c9657cf58-rnhgf\" (UID: \"ebfafea7-e749-4675-ac00-47a35d860c43\") " pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.839736 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebfafea7-e749-4675-ac00-47a35d860c43-client-ca\") pod \"route-controller-manager-6c9657cf58-rnhgf\" (UID: \"ebfafea7-e749-4675-ac00-47a35d860c43\") " pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.843589 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebfafea7-e749-4675-ac00-47a35d860c43-serving-cert\") pod \"route-controller-manager-6c9657cf58-rnhgf\" (UID: \"ebfafea7-e749-4675-ac00-47a35d860c43\") " pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.855887 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5p98\" (UniqueName: \"kubernetes.io/projected/ebfafea7-e749-4675-ac00-47a35d860c43-kube-api-access-m5p98\") pod \"route-controller-manager-6c9657cf58-rnhgf\" (UID: \"ebfafea7-e749-4675-ac00-47a35d860c43\") " 
pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.859661 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.895240 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8khp" event={"ID":"13abb343-a494-425b-9b86-ea0d41df98a5","Type":"ContainerStarted","Data":"51bf5bf658e90b5af6d27a7826b90bd941236925e15d62555298ecb7c765e006"} Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.897704 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.897681 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4" event={"ID":"4a05c09f-d210-4def-9688-ab16f9b7c57b","Type":"ContainerDied","Data":"2a6ce9086bf80bbbc207a12610dbda6a0f86e2f1f129e740159a29246eb9ce16"} Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.898037 4841 scope.go:117] "RemoveContainer" containerID="e40f5f5364e57a9a48df1c832062d0120cc27fd1c51e4fcb3eea6ec62e18ed77" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.900596 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" event={"ID":"83e95392-2d00-4cf2-a34d-7a93508e9e84","Type":"ContainerDied","Data":"ff0f932267bc7c8642868a15e2631bb777fdd1130670871b59210cf1645c8582"} Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.900683 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.902485 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w95m9" event={"ID":"67d2682a-a65f-42e2-875a-b4247bfff054","Type":"ContainerStarted","Data":"40233fdc0ab03899e16347d7da27f8e50c5ae6b4982ae08f42c76b1e98e73c26"} Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.908389 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556556-c9nft" event={"ID":"f5961bba-4ec3-4b4e-b5a2-73aa1024326e","Type":"ContainerStarted","Data":"dc35df52e1205baf6d56760de75613f95cc272c8ed10cb53c388ebe487625b8a"} Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.923677 4841 scope.go:117] "RemoveContainer" containerID="68fd0af67b1ac9dad337aa50367e6ec87cfda36cf7609ee36316248b6637ce0c" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.928332 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.937221 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556556-c9nft" podStartSLOduration=5.03966283 podStartE2EDuration="46.937207102s" podCreationTimestamp="2026-03-13 09:16:00 +0000 UTC" firstStartedPulling="2026-03-13 09:16:04.492564094 +0000 UTC m=+247.222464285" lastFinishedPulling="2026-03-13 09:16:46.390108366 +0000 UTC m=+289.120008557" observedRunningTime="2026-03-13 09:16:46.933342065 +0000 UTC m=+289.663242256" watchObservedRunningTime="2026-03-13 09:16:46.937207102 +0000 UTC m=+289.667107293" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.938768 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a05c09f-d210-4def-9688-ab16f9b7c57b-config\") pod 
\"4a05c09f-d210-4def-9688-ab16f9b7c57b\" (UID: \"4a05c09f-d210-4def-9688-ab16f9b7c57b\") " Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.938853 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a05c09f-d210-4def-9688-ab16f9b7c57b-serving-cert\") pod \"4a05c09f-d210-4def-9688-ab16f9b7c57b\" (UID: \"4a05c09f-d210-4def-9688-ab16f9b7c57b\") " Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.938902 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r82gk\" (UniqueName: \"kubernetes.io/projected/4a05c09f-d210-4def-9688-ab16f9b7c57b-kube-api-access-r82gk\") pod \"4a05c09f-d210-4def-9688-ab16f9b7c57b\" (UID: \"4a05c09f-d210-4def-9688-ab16f9b7c57b\") " Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.938920 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a05c09f-d210-4def-9688-ab16f9b7c57b-proxy-ca-bundles\") pod \"4a05c09f-d210-4def-9688-ab16f9b7c57b\" (UID: \"4a05c09f-d210-4def-9688-ab16f9b7c57b\") " Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.938969 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a05c09f-d210-4def-9688-ab16f9b7c57b-client-ca\") pod \"4a05c09f-d210-4def-9688-ab16f9b7c57b\" (UID: \"4a05c09f-d210-4def-9688-ab16f9b7c57b\") " Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.940462 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a05c09f-d210-4def-9688-ab16f9b7c57b-config" (OuterVolumeSpecName: "config") pod "4a05c09f-d210-4def-9688-ab16f9b7c57b" (UID: "4a05c09f-d210-4def-9688-ab16f9b7c57b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.941087 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a05c09f-d210-4def-9688-ab16f9b7c57b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4a05c09f-d210-4def-9688-ab16f9b7c57b" (UID: "4a05c09f-d210-4def-9688-ab16f9b7c57b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.941253 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a05c09f-d210-4def-9688-ab16f9b7c57b-client-ca" (OuterVolumeSpecName: "client-ca") pod "4a05c09f-d210-4def-9688-ab16f9b7c57b" (UID: "4a05c09f-d210-4def-9688-ab16f9b7c57b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.943512 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a05c09f-d210-4def-9688-ab16f9b7c57b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4a05c09f-d210-4def-9688-ab16f9b7c57b" (UID: "4a05c09f-d210-4def-9688-ab16f9b7c57b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.943529 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a05c09f-d210-4def-9688-ab16f9b7c57b-kube-api-access-r82gk" (OuterVolumeSpecName: "kube-api-access-r82gk") pod "4a05c09f-d210-4def-9688-ab16f9b7c57b" (UID: "4a05c09f-d210-4def-9688-ab16f9b7c57b"). InnerVolumeSpecName "kube-api-access-r82gk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.963244 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4"] Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.966975 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79c9b46b5-dzkt4"] Mar 13 09:16:46 crc kubenswrapper[4841]: I0313 09:16:46.967224 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" Mar 13 09:16:47 crc kubenswrapper[4841]: I0313 09:16:47.040672 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a05c09f-d210-4def-9688-ab16f9b7c57b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:47 crc kubenswrapper[4841]: I0313 09:16:47.041211 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r82gk\" (UniqueName: \"kubernetes.io/projected/4a05c09f-d210-4def-9688-ab16f9b7c57b-kube-api-access-r82gk\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:47 crc kubenswrapper[4841]: I0313 09:16:47.041227 4841 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a05c09f-d210-4def-9688-ab16f9b7c57b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:47 crc kubenswrapper[4841]: I0313 09:16:47.041239 4841 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a05c09f-d210-4def-9688-ab16f9b7c57b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:47 crc kubenswrapper[4841]: I0313 09:16:47.041252 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a05c09f-d210-4def-9688-ab16f9b7c57b-config\") on node \"crc\" DevicePath 
\"\"" Mar 13 09:16:47 crc kubenswrapper[4841]: I0313 09:16:47.186888 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf"] Mar 13 09:16:47 crc kubenswrapper[4841]: W0313 09:16:47.223719 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebfafea7_e749_4675_ac00_47a35d860c43.slice/crio-6c72023cee4738cf6c5f2807da0b9e616845ec664f2b607481c2e3fa290594dd WatchSource:0}: Error finding container 6c72023cee4738cf6c5f2807da0b9e616845ec664f2b607481c2e3fa290594dd: Status 404 returned error can't find the container with id 6c72023cee4738cf6c5f2807da0b9e616845ec664f2b607481c2e3fa290594dd Mar 13 09:16:47 crc kubenswrapper[4841]: I0313 09:16:47.236149 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4"] Mar 13 09:16:47 crc kubenswrapper[4841]: I0313 09:16:47.239832 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6bcc56d7fb-98gd4"] Mar 13 09:16:47 crc kubenswrapper[4841]: I0313 09:16:47.918660 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ae461fe9-c072-4b68-bebc-2070e622c5fe","Type":"ContainerStarted","Data":"00dca76d91d7025569930064c502bf8bdecb14c8a45a4c69cae1c3d020e4ff67"} Mar 13 09:16:47 crc kubenswrapper[4841]: I0313 09:16:47.918952 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ae461fe9-c072-4b68-bebc-2070e622c5fe","Type":"ContainerStarted","Data":"eeb81e4b13d8fff0488140e7d355b91fb59f1fa4fdd503eb20ea3ef480036970"} Mar 13 09:16:47 crc kubenswrapper[4841]: I0313 09:16:47.924112 4841 generic.go:334] "Generic (PLEG): container finished" podID="67d2682a-a65f-42e2-875a-b4247bfff054" 
containerID="40233fdc0ab03899e16347d7da27f8e50c5ae6b4982ae08f42c76b1e98e73c26" exitCode=0 Mar 13 09:16:47 crc kubenswrapper[4841]: I0313 09:16:47.924176 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w95m9" event={"ID":"67d2682a-a65f-42e2-875a-b4247bfff054","Type":"ContainerDied","Data":"40233fdc0ab03899e16347d7da27f8e50c5ae6b4982ae08f42c76b1e98e73c26"} Mar 13 09:16:47 crc kubenswrapper[4841]: I0313 09:16:47.926678 4841 generic.go:334] "Generic (PLEG): container finished" podID="f5961bba-4ec3-4b4e-b5a2-73aa1024326e" containerID="dc35df52e1205baf6d56760de75613f95cc272c8ed10cb53c388ebe487625b8a" exitCode=0 Mar 13 09:16:47 crc kubenswrapper[4841]: I0313 09:16:47.926737 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556556-c9nft" event={"ID":"f5961bba-4ec3-4b4e-b5a2-73aa1024326e","Type":"ContainerDied","Data":"dc35df52e1205baf6d56760de75613f95cc272c8ed10cb53c388ebe487625b8a"} Mar 13 09:16:47 crc kubenswrapper[4841]: I0313 09:16:47.936785 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" event={"ID":"ebfafea7-e749-4675-ac00-47a35d860c43","Type":"ContainerStarted","Data":"ea420a754cbc2d37c97e16f2ee3fb3f50328e4ed715687c6b04887fa547679c0"} Mar 13 09:16:47 crc kubenswrapper[4841]: I0313 09:16:47.936819 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" event={"ID":"ebfafea7-e749-4675-ac00-47a35d860c43","Type":"ContainerStarted","Data":"6c72023cee4738cf6c5f2807da0b9e616845ec664f2b607481c2e3fa290594dd"} Mar 13 09:16:47 crc kubenswrapper[4841]: I0313 09:16:47.937468 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" Mar 13 09:16:47 crc kubenswrapper[4841]: I0313 09:16:47.940684 4841 generic.go:334] "Generic 
(PLEG): container finished" podID="13abb343-a494-425b-9b86-ea0d41df98a5" containerID="51bf5bf658e90b5af6d27a7826b90bd941236925e15d62555298ecb7c765e006" exitCode=0 Mar 13 09:16:47 crc kubenswrapper[4841]: I0313 09:16:47.940750 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8khp" event={"ID":"13abb343-a494-425b-9b86-ea0d41df98a5","Type":"ContainerDied","Data":"51bf5bf658e90b5af6d27a7826b90bd941236925e15d62555298ecb7c765e006"} Mar 13 09:16:47 crc kubenswrapper[4841]: I0313 09:16:47.942075 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=4.942043247 podStartE2EDuration="4.942043247s" podCreationTimestamp="2026-03-13 09:16:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:47.937571631 +0000 UTC m=+290.667471822" watchObservedRunningTime="2026-03-13 09:16:47.942043247 +0000 UTC m=+290.671943458" Mar 13 09:16:47 crc kubenswrapper[4841]: I0313 09:16:47.967951 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" podStartSLOduration=3.9679300939999997 podStartE2EDuration="3.967930094s" podCreationTimestamp="2026-03-13 09:16:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:47.966782649 +0000 UTC m=+290.696682850" watchObservedRunningTime="2026-03-13 09:16:47.967930094 +0000 UTC m=+290.697830285" Mar 13 09:16:48 crc kubenswrapper[4841]: I0313 09:16:48.014553 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a05c09f-d210-4def-9688-ab16f9b7c57b" path="/var/lib/kubelet/pods/4a05c09f-d210-4def-9688-ab16f9b7c57b/volumes" Mar 13 09:16:48 crc kubenswrapper[4841]: I0313 09:16:48.015457 4841 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83e95392-2d00-4cf2-a34d-7a93508e9e84" path="/var/lib/kubelet/pods/83e95392-2d00-4cf2-a34d-7a93508e9e84/volumes" Mar 13 09:16:48 crc kubenswrapper[4841]: I0313 09:16:48.016674 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" Mar 13 09:16:48 crc kubenswrapper[4841]: I0313 09:16:48.599425 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sbx6r" Mar 13 09:16:48 crc kubenswrapper[4841]: I0313 09:16:48.777746 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l6qx4" Mar 13 09:16:48 crc kubenswrapper[4841]: I0313 09:16:48.952436 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8khp" event={"ID":"13abb343-a494-425b-9b86-ea0d41df98a5","Type":"ContainerStarted","Data":"0e8739f1eb6e130b2421feaff486880af17b8096c6c929d6abbe264b98a8e555"} Mar 13 09:16:48 crc kubenswrapper[4841]: I0313 09:16:48.955012 4841 generic.go:334] "Generic (PLEG): container finished" podID="ae461fe9-c072-4b68-bebc-2070e622c5fe" containerID="00dca76d91d7025569930064c502bf8bdecb14c8a45a4c69cae1c3d020e4ff67" exitCode=0 Mar 13 09:16:48 crc kubenswrapper[4841]: I0313 09:16:48.955086 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ae461fe9-c072-4b68-bebc-2070e622c5fe","Type":"ContainerDied","Data":"00dca76d91d7025569930064c502bf8bdecb14c8a45a4c69cae1c3d020e4ff67"} Mar 13 09:16:48 crc kubenswrapper[4841]: I0313 09:16:48.959850 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w95m9" 
event={"ID":"67d2682a-a65f-42e2-875a-b4247bfff054","Type":"ContainerStarted","Data":"140aa11126c12015df8cca4639a6bd3d5f246e66664fbec13c6e3988d78ba0ed"} Mar 13 09:16:48 crc kubenswrapper[4841]: I0313 09:16:48.972821 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h8khp" podStartSLOduration=24.344646593 podStartE2EDuration="37.972805841s" podCreationTimestamp="2026-03-13 09:16:11 +0000 UTC" firstStartedPulling="2026-03-13 09:16:34.731500628 +0000 UTC m=+277.461400819" lastFinishedPulling="2026-03-13 09:16:48.359659876 +0000 UTC m=+291.089560067" observedRunningTime="2026-03-13 09:16:48.971866822 +0000 UTC m=+291.701767013" watchObservedRunningTime="2026-03-13 09:16:48.972805841 +0000 UTC m=+291.702706032" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.010791 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w95m9" podStartSLOduration=24.201781288 podStartE2EDuration="38.010775975s" podCreationTimestamp="2026-03-13 09:16:11 +0000 UTC" firstStartedPulling="2026-03-13 09:16:34.774830346 +0000 UTC m=+277.504730537" lastFinishedPulling="2026-03-13 09:16:48.583825033 +0000 UTC m=+291.313725224" observedRunningTime="2026-03-13 09:16:49.005451563 +0000 UTC m=+291.735351774" watchObservedRunningTime="2026-03-13 09:16:49.010775975 +0000 UTC m=+291.740676156" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.019738 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d4lvs" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.233307 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556556-c9nft" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.308472 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wlqmq" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.383117 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpr2t\" (UniqueName: \"kubernetes.io/projected/f5961bba-4ec3-4b4e-b5a2-73aa1024326e-kube-api-access-wpr2t\") pod \"f5961bba-4ec3-4b4e-b5a2-73aa1024326e\" (UID: \"f5961bba-4ec3-4b4e-b5a2-73aa1024326e\") " Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.393484 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5961bba-4ec3-4b4e-b5a2-73aa1024326e-kube-api-access-wpr2t" (OuterVolumeSpecName: "kube-api-access-wpr2t") pod "f5961bba-4ec3-4b4e-b5a2-73aa1024326e" (UID: "f5961bba-4ec3-4b4e-b5a2-73aa1024326e"). InnerVolumeSpecName "kube-api-access-wpr2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.417202 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-86bc57d6db-gk5nh"] Mar 13 09:16:49 crc kubenswrapper[4841]: E0313 09:16:49.417423 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a05c09f-d210-4def-9688-ab16f9b7c57b" containerName="controller-manager" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.417436 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a05c09f-d210-4def-9688-ab16f9b7c57b" containerName="controller-manager" Mar 13 09:16:49 crc kubenswrapper[4841]: E0313 09:16:49.417449 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5961bba-4ec3-4b4e-b5a2-73aa1024326e" containerName="oc" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.417455 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5961bba-4ec3-4b4e-b5a2-73aa1024326e" containerName="oc" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.417545 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a05c09f-d210-4def-9688-ab16f9b7c57b" containerName="controller-manager" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.417562 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5961bba-4ec3-4b4e-b5a2-73aa1024326e" containerName="oc" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.417904 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.420228 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.420425 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.420547 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.420849 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.421406 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.424241 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.431106 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.474434 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86bc57d6db-gk5nh"] Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.485600 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08c65a93-7abe-48b3-8d0a-52c70de3d541-serving-cert\") pod \"controller-manager-86bc57d6db-gk5nh\" (UID: \"08c65a93-7abe-48b3-8d0a-52c70de3d541\") " 
pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.485672 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08c65a93-7abe-48b3-8d0a-52c70de3d541-proxy-ca-bundles\") pod \"controller-manager-86bc57d6db-gk5nh\" (UID: \"08c65a93-7abe-48b3-8d0a-52c70de3d541\") " pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.485731 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08c65a93-7abe-48b3-8d0a-52c70de3d541-client-ca\") pod \"controller-manager-86bc57d6db-gk5nh\" (UID: \"08c65a93-7abe-48b3-8d0a-52c70de3d541\") " pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.485762 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08c65a93-7abe-48b3-8d0a-52c70de3d541-config\") pod \"controller-manager-86bc57d6db-gk5nh\" (UID: \"08c65a93-7abe-48b3-8d0a-52c70de3d541\") " pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.485867 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pglxp\" (UniqueName: \"kubernetes.io/projected/08c65a93-7abe-48b3-8d0a-52c70de3d541-kube-api-access-pglxp\") pod \"controller-manager-86bc57d6db-gk5nh\" (UID: \"08c65a93-7abe-48b3-8d0a-52c70de3d541\") " pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.485916 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpr2t\" (UniqueName: 
\"kubernetes.io/projected/f5961bba-4ec3-4b4e-b5a2-73aa1024326e-kube-api-access-wpr2t\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.587814 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08c65a93-7abe-48b3-8d0a-52c70de3d541-serving-cert\") pod \"controller-manager-86bc57d6db-gk5nh\" (UID: \"08c65a93-7abe-48b3-8d0a-52c70de3d541\") " pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.587866 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08c65a93-7abe-48b3-8d0a-52c70de3d541-proxy-ca-bundles\") pod \"controller-manager-86bc57d6db-gk5nh\" (UID: \"08c65a93-7abe-48b3-8d0a-52c70de3d541\") " pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.587912 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08c65a93-7abe-48b3-8d0a-52c70de3d541-client-ca\") pod \"controller-manager-86bc57d6db-gk5nh\" (UID: \"08c65a93-7abe-48b3-8d0a-52c70de3d541\") " pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.587936 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08c65a93-7abe-48b3-8d0a-52c70de3d541-config\") pod \"controller-manager-86bc57d6db-gk5nh\" (UID: \"08c65a93-7abe-48b3-8d0a-52c70de3d541\") " pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.587960 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pglxp\" (UniqueName: 
\"kubernetes.io/projected/08c65a93-7abe-48b3-8d0a-52c70de3d541-kube-api-access-pglxp\") pod \"controller-manager-86bc57d6db-gk5nh\" (UID: \"08c65a93-7abe-48b3-8d0a-52c70de3d541\") " pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.589333 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08c65a93-7abe-48b3-8d0a-52c70de3d541-client-ca\") pod \"controller-manager-86bc57d6db-gk5nh\" (UID: \"08c65a93-7abe-48b3-8d0a-52c70de3d541\") " pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.591862 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08c65a93-7abe-48b3-8d0a-52c70de3d541-proxy-ca-bundles\") pod \"controller-manager-86bc57d6db-gk5nh\" (UID: \"08c65a93-7abe-48b3-8d0a-52c70de3d541\") " pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.606985 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08c65a93-7abe-48b3-8d0a-52c70de3d541-serving-cert\") pod \"controller-manager-86bc57d6db-gk5nh\" (UID: \"08c65a93-7abe-48b3-8d0a-52c70de3d541\") " pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.607531 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08c65a93-7abe-48b3-8d0a-52c70de3d541-config\") pod \"controller-manager-86bc57d6db-gk5nh\" (UID: \"08c65a93-7abe-48b3-8d0a-52c70de3d541\") " pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.609425 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-pglxp\" (UniqueName: \"kubernetes.io/projected/08c65a93-7abe-48b3-8d0a-52c70de3d541-kube-api-access-pglxp\") pod \"controller-manager-86bc57d6db-gk5nh\" (UID: \"08c65a93-7abe-48b3-8d0a-52c70de3d541\") " pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.736706 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.978308 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556556-c9nft" event={"ID":"f5961bba-4ec3-4b4e-b5a2-73aa1024326e","Type":"ContainerDied","Data":"23f6237f50f75bf08a38f50dae37c774516c746360ea9ba7dd28d0cdd2bdae37"} Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.978616 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23f6237f50f75bf08a38f50dae37c774516c746360ea9ba7dd28d0cdd2bdae37" Mar 13 09:16:49 crc kubenswrapper[4841]: I0313 09:16:49.978720 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556556-c9nft" Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.078524 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86bc57d6db-gk5nh"] Mar 13 09:16:50 crc kubenswrapper[4841]: W0313 09:16:50.096554 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08c65a93_7abe_48b3_8d0a_52c70de3d541.slice/crio-dc23812ef86c68c2b16f4d944af374d4b43b5e4e5ba96e0c99fdd12edb224639 WatchSource:0}: Error finding container dc23812ef86c68c2b16f4d944af374d4b43b5e4e5ba96e0c99fdd12edb224639: Status 404 returned error can't find the container with id dc23812ef86c68c2b16f4d944af374d4b43b5e4e5ba96e0c99fdd12edb224639 Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.218715 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.302660 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae461fe9-c072-4b68-bebc-2070e622c5fe-kube-api-access\") pod \"ae461fe9-c072-4b68-bebc-2070e622c5fe\" (UID: \"ae461fe9-c072-4b68-bebc-2070e622c5fe\") " Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.304151 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae461fe9-c072-4b68-bebc-2070e622c5fe-kubelet-dir\") pod \"ae461fe9-c072-4b68-bebc-2070e622c5fe\" (UID: \"ae461fe9-c072-4b68-bebc-2070e622c5fe\") " Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.304193 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae461fe9-c072-4b68-bebc-2070e622c5fe-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod 
"ae461fe9-c072-4b68-bebc-2070e622c5fe" (UID: "ae461fe9-c072-4b68-bebc-2070e622c5fe"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.304767 4841 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae461fe9-c072-4b68-bebc-2070e622c5fe-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.308201 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae461fe9-c072-4b68-bebc-2070e622c5fe-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ae461fe9-c072-4b68-bebc-2070e622c5fe" (UID: "ae461fe9-c072-4b68-bebc-2070e622c5fe"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.406283 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae461fe9-c072-4b68-bebc-2070e622c5fe-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.551751 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 09:16:50 crc kubenswrapper[4841]: E0313 09:16:50.551949 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae461fe9-c072-4b68-bebc-2070e622c5fe" containerName="pruner" Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.551959 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae461fe9-c072-4b68-bebc-2070e622c5fe" containerName="pruner" Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.552061 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae461fe9-c072-4b68-bebc-2070e622c5fe" containerName="pruner" Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.552415 4841 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.571892 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.611385 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab5fd5c2-7ff5-4184-983f-6a47828ccf1a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ab5fd5c2-7ff5-4184-983f-6a47828ccf1a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.611734 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab5fd5c2-7ff5-4184-983f-6a47828ccf1a-var-lock\") pod \"installer-9-crc\" (UID: \"ab5fd5c2-7ff5-4184-983f-6a47828ccf1a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.611906 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab5fd5c2-7ff5-4184-983f-6a47828ccf1a-kube-api-access\") pod \"installer-9-crc\" (UID: \"ab5fd5c2-7ff5-4184-983f-6a47828ccf1a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.713076 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab5fd5c2-7ff5-4184-983f-6a47828ccf1a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ab5fd5c2-7ff5-4184-983f-6a47828ccf1a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.713316 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/ab5fd5c2-7ff5-4184-983f-6a47828ccf1a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ab5fd5c2-7ff5-4184-983f-6a47828ccf1a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.713510 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab5fd5c2-7ff5-4184-983f-6a47828ccf1a-var-lock\") pod \"installer-9-crc\" (UID: \"ab5fd5c2-7ff5-4184-983f-6a47828ccf1a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.713567 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab5fd5c2-7ff5-4184-983f-6a47828ccf1a-kube-api-access\") pod \"installer-9-crc\" (UID: \"ab5fd5c2-7ff5-4184-983f-6a47828ccf1a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.713685 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab5fd5c2-7ff5-4184-983f-6a47828ccf1a-var-lock\") pod \"installer-9-crc\" (UID: \"ab5fd5c2-7ff5-4184-983f-6a47828ccf1a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.728898 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab5fd5c2-7ff5-4184-983f-6a47828ccf1a-kube-api-access\") pod \"installer-9-crc\" (UID: \"ab5fd5c2-7ff5-4184-983f-6a47828ccf1a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.824623 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4gtt4" Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.856882 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-d4lvs"] Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.857105 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d4lvs" podUID="517b279c-79ee-438c-83c2-ae5cd25848fc" containerName="registry-server" containerID="cri-o://369db51f0191c31807cc317cb997a821a7a7319e8af980041eabc066f1c3abff" gracePeriod=2 Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.873948 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.988340 4841 generic.go:334] "Generic (PLEG): container finished" podID="517b279c-79ee-438c-83c2-ae5cd25848fc" containerID="369db51f0191c31807cc317cb997a821a7a7319e8af980041eabc066f1c3abff" exitCode=0 Mar 13 09:16:50 crc kubenswrapper[4841]: I0313 09:16:50.988539 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4lvs" event={"ID":"517b279c-79ee-438c-83c2-ae5cd25848fc","Type":"ContainerDied","Data":"369db51f0191c31807cc317cb997a821a7a7319e8af980041eabc066f1c3abff"} Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.008740 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" event={"ID":"08c65a93-7abe-48b3-8d0a-52c70de3d541","Type":"ContainerStarted","Data":"e3b9f6f098795c4eb3ac3be428ac3a376a55567caf8905e2a0e5f6dccbefe871"} Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.008776 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" event={"ID":"08c65a93-7abe-48b3-8d0a-52c70de3d541","Type":"ContainerStarted","Data":"dc23812ef86c68c2b16f4d944af374d4b43b5e4e5ba96e0c99fdd12edb224639"} Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.009748 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.011740 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ae461fe9-c072-4b68-bebc-2070e622c5fe","Type":"ContainerDied","Data":"eeb81e4b13d8fff0488140e7d355b91fb59f1fa4fdd503eb20ea3ef480036970"} Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.011769 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eeb81e4b13d8fff0488140e7d355b91fb59f1fa4fdd503eb20ea3ef480036970" Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.011822 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.023998 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.039082 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" podStartSLOduration=7.039067412 podStartE2EDuration="7.039067412s" podCreationTimestamp="2026-03-13 09:16:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:51.035853705 +0000 UTC m=+293.765753896" watchObservedRunningTime="2026-03-13 09:16:51.039067412 +0000 UTC m=+293.768967603" Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.212428 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qzfct" Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.351941 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d4lvs" Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.384497 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 09:16:51 crc kubenswrapper[4841]: W0313 09:16:51.389155 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podab5fd5c2_7ff5_4184_983f_6a47828ccf1a.slice/crio-c29d66fd429029d1665b5d4e9642b97f80e9f6d1117dff866d55c1c7c028690a WatchSource:0}: Error finding container c29d66fd429029d1665b5d4e9642b97f80e9f6d1117dff866d55c1c7c028690a: Status 404 returned error can't find the container with id c29d66fd429029d1665b5d4e9642b97f80e9f6d1117dff866d55c1c7c028690a Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.428301 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/517b279c-79ee-438c-83c2-ae5cd25848fc-catalog-content\") pod \"517b279c-79ee-438c-83c2-ae5cd25848fc\" (UID: \"517b279c-79ee-438c-83c2-ae5cd25848fc\") " Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.428711 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql8rv\" (UniqueName: \"kubernetes.io/projected/517b279c-79ee-438c-83c2-ae5cd25848fc-kube-api-access-ql8rv\") pod \"517b279c-79ee-438c-83c2-ae5cd25848fc\" (UID: \"517b279c-79ee-438c-83c2-ae5cd25848fc\") " Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.428740 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/517b279c-79ee-438c-83c2-ae5cd25848fc-utilities\") pod \"517b279c-79ee-438c-83c2-ae5cd25848fc\" (UID: \"517b279c-79ee-438c-83c2-ae5cd25848fc\") " Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.429460 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/517b279c-79ee-438c-83c2-ae5cd25848fc-utilities" (OuterVolumeSpecName: "utilities") pod "517b279c-79ee-438c-83c2-ae5cd25848fc" (UID: "517b279c-79ee-438c-83c2-ae5cd25848fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.434775 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/517b279c-79ee-438c-83c2-ae5cd25848fc-kube-api-access-ql8rv" (OuterVolumeSpecName: "kube-api-access-ql8rv") pod "517b279c-79ee-438c-83c2-ae5cd25848fc" (UID: "517b279c-79ee-438c-83c2-ae5cd25848fc"). InnerVolumeSpecName "kube-api-access-ql8rv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.483095 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/517b279c-79ee-438c-83c2-ae5cd25848fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "517b279c-79ee-438c-83c2-ae5cd25848fc" (UID: "517b279c-79ee-438c-83c2-ae5cd25848fc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.530835 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql8rv\" (UniqueName: \"kubernetes.io/projected/517b279c-79ee-438c-83c2-ae5cd25848fc-kube-api-access-ql8rv\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.530868 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/517b279c-79ee-438c-83c2-ae5cd25848fc-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.530879 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/517b279c-79ee-438c-83c2-ae5cd25848fc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.732910 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w95m9" Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.732950 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w95m9" Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.859779 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wlqmq"] Mar 13 09:16:51 crc kubenswrapper[4841]: I0313 09:16:51.860383 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wlqmq" podUID="7f7766c0-82b7-4ee0-878d-47c5b0217e4d" containerName="registry-server" containerID="cri-o://8f59fcb21ea841ba8d62cc2a142bb209664c6ea93ed74da00903a982db4aab0f" gracePeriod=2 Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.031948 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"ab5fd5c2-7ff5-4184-983f-6a47828ccf1a","Type":"ContainerStarted","Data":"a78937990f3981e3356f5709b3cffc5699041514bc6190b5d9ba94069bdd75d7"} Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.032045 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ab5fd5c2-7ff5-4184-983f-6a47828ccf1a","Type":"ContainerStarted","Data":"c29d66fd429029d1665b5d4e9642b97f80e9f6d1117dff866d55c1c7c028690a"} Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.045163 4841 generic.go:334] "Generic (PLEG): container finished" podID="7f7766c0-82b7-4ee0-878d-47c5b0217e4d" containerID="8f59fcb21ea841ba8d62cc2a142bb209664c6ea93ed74da00903a982db4aab0f" exitCode=0 Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.045229 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlqmq" event={"ID":"7f7766c0-82b7-4ee0-878d-47c5b0217e4d","Type":"ContainerDied","Data":"8f59fcb21ea841ba8d62cc2a142bb209664c6ea93ed74da00903a982db4aab0f"} Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.048036 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d4lvs" Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.048038 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4lvs" event={"ID":"517b279c-79ee-438c-83c2-ae5cd25848fc","Type":"ContainerDied","Data":"e552413f54b98c9ccaed6791ea7968a8a00a409ed7be01c20fd8a562f2ed53a7"} Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.048105 4841 scope.go:117] "RemoveContainer" containerID="369db51f0191c31807cc317cb997a821a7a7319e8af980041eabc066f1c3abff" Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.063435 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.063416811 podStartE2EDuration="2.063416811s" podCreationTimestamp="2026-03-13 09:16:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:16:52.057031367 +0000 UTC m=+294.786931558" watchObservedRunningTime="2026-03-13 09:16:52.063416811 +0000 UTC m=+294.793317022" Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.077826 4841 scope.go:117] "RemoveContainer" containerID="b03afa1fb026276410b7093396e3630f13d294be9d63d4c49dc7465f30a2f95a" Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.105551 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d4lvs"] Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.108024 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d4lvs"] Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.110403 4841 scope.go:117] "RemoveContainer" containerID="eec84db7439935228706c1131b512e606be6719b442de2b9123f1fa6a89bedc9" Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.177538 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-h8khp" Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.177892 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h8khp" Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.287964 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wlqmq" Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.441418 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxzfr\" (UniqueName: \"kubernetes.io/projected/7f7766c0-82b7-4ee0-878d-47c5b0217e4d-kube-api-access-gxzfr\") pod \"7f7766c0-82b7-4ee0-878d-47c5b0217e4d\" (UID: \"7f7766c0-82b7-4ee0-878d-47c5b0217e4d\") " Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.441507 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f7766c0-82b7-4ee0-878d-47c5b0217e4d-utilities\") pod \"7f7766c0-82b7-4ee0-878d-47c5b0217e4d\" (UID: \"7f7766c0-82b7-4ee0-878d-47c5b0217e4d\") " Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.441562 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f7766c0-82b7-4ee0-878d-47c5b0217e4d-catalog-content\") pod \"7f7766c0-82b7-4ee0-878d-47c5b0217e4d\" (UID: \"7f7766c0-82b7-4ee0-878d-47c5b0217e4d\") " Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.443842 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f7766c0-82b7-4ee0-878d-47c5b0217e4d-utilities" (OuterVolumeSpecName: "utilities") pod "7f7766c0-82b7-4ee0-878d-47c5b0217e4d" (UID: "7f7766c0-82b7-4ee0-878d-47c5b0217e4d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.446057 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f7766c0-82b7-4ee0-878d-47c5b0217e4d-kube-api-access-gxzfr" (OuterVolumeSpecName: "kube-api-access-gxzfr") pod "7f7766c0-82b7-4ee0-878d-47c5b0217e4d" (UID: "7f7766c0-82b7-4ee0-878d-47c5b0217e4d"). InnerVolumeSpecName "kube-api-access-gxzfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.493483 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f7766c0-82b7-4ee0-878d-47c5b0217e4d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f7766c0-82b7-4ee0-878d-47c5b0217e4d" (UID: "7f7766c0-82b7-4ee0-878d-47c5b0217e4d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.543617 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f7766c0-82b7-4ee0-878d-47c5b0217e4d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.543657 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f7766c0-82b7-4ee0-878d-47c5b0217e4d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.543694 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxzfr\" (UniqueName: \"kubernetes.io/projected/7f7766c0-82b7-4ee0-878d-47c5b0217e4d-kube-api-access-gxzfr\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:52 crc kubenswrapper[4841]: I0313 09:16:52.794077 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w95m9" podUID="67d2682a-a65f-42e2-875a-b4247bfff054" 
containerName="registry-server" probeResult="failure" output=< Mar 13 09:16:52 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s Mar 13 09:16:52 crc kubenswrapper[4841]: > Mar 13 09:16:53 crc kubenswrapper[4841]: I0313 09:16:53.057690 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlqmq" event={"ID":"7f7766c0-82b7-4ee0-878d-47c5b0217e4d","Type":"ContainerDied","Data":"937c6c449a7b90f48c3a6a435efd8ce8bd0b1dabd0c3c5f0636214322226bf19"} Mar 13 09:16:53 crc kubenswrapper[4841]: I0313 09:16:53.057781 4841 scope.go:117] "RemoveContainer" containerID="8f59fcb21ea841ba8d62cc2a142bb209664c6ea93ed74da00903a982db4aab0f" Mar 13 09:16:53 crc kubenswrapper[4841]: I0313 09:16:53.057728 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wlqmq" Mar 13 09:16:53 crc kubenswrapper[4841]: I0313 09:16:53.071092 4841 scope.go:117] "RemoveContainer" containerID="fcdcf258036518ea9ef629f065db21629d1db17b0fbaf10efa2bc26f7636daca" Mar 13 09:16:53 crc kubenswrapper[4841]: I0313 09:16:53.094933 4841 scope.go:117] "RemoveContainer" containerID="54cd114d83c7967fd5c295d23e5631457503dc575f88907111b4150a51802749" Mar 13 09:16:53 crc kubenswrapper[4841]: I0313 09:16:53.096810 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wlqmq"] Mar 13 09:16:53 crc kubenswrapper[4841]: I0313 09:16:53.099178 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wlqmq"] Mar 13 09:16:53 crc kubenswrapper[4841]: I0313 09:16:53.212514 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h8khp" podUID="13abb343-a494-425b-9b86-ea0d41df98a5" containerName="registry-server" probeResult="failure" output=< Mar 13 09:16:53 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s Mar 13 09:16:53 crc 
kubenswrapper[4841]: > Mar 13 09:16:54 crc kubenswrapper[4841]: I0313 09:16:54.002859 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="517b279c-79ee-438c-83c2-ae5cd25848fc" path="/var/lib/kubelet/pods/517b279c-79ee-438c-83c2-ae5cd25848fc/volumes" Mar 13 09:16:54 crc kubenswrapper[4841]: I0313 09:16:54.003686 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f7766c0-82b7-4ee0-878d-47c5b0217e4d" path="/var/lib/kubelet/pods/7f7766c0-82b7-4ee0-878d-47c5b0217e4d/volumes" Mar 13 09:16:54 crc kubenswrapper[4841]: I0313 09:16:54.256906 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzfct"] Mar 13 09:16:54 crc kubenswrapper[4841]: I0313 09:16:54.257154 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qzfct" podUID="f9de2815-2636-4e04-adf5-a7efd285781f" containerName="registry-server" containerID="cri-o://17a997825073552afb564eba1681dcbeed5e33a7eeb805e2e128ff05163bf97a" gracePeriod=2 Mar 13 09:16:54 crc kubenswrapper[4841]: I0313 09:16:54.646958 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzfct" Mar 13 09:16:54 crc kubenswrapper[4841]: I0313 09:16:54.769118 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9de2815-2636-4e04-adf5-a7efd285781f-catalog-content\") pod \"f9de2815-2636-4e04-adf5-a7efd285781f\" (UID: \"f9de2815-2636-4e04-adf5-a7efd285781f\") " Mar 13 09:16:54 crc kubenswrapper[4841]: I0313 09:16:54.769194 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9de2815-2636-4e04-adf5-a7efd285781f-utilities\") pod \"f9de2815-2636-4e04-adf5-a7efd285781f\" (UID: \"f9de2815-2636-4e04-adf5-a7efd285781f\") " Mar 13 09:16:54 crc kubenswrapper[4841]: I0313 09:16:54.769252 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crwfv\" (UniqueName: \"kubernetes.io/projected/f9de2815-2636-4e04-adf5-a7efd285781f-kube-api-access-crwfv\") pod \"f9de2815-2636-4e04-adf5-a7efd285781f\" (UID: \"f9de2815-2636-4e04-adf5-a7efd285781f\") " Mar 13 09:16:54 crc kubenswrapper[4841]: I0313 09:16:54.770621 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9de2815-2636-4e04-adf5-a7efd285781f-utilities" (OuterVolumeSpecName: "utilities") pod "f9de2815-2636-4e04-adf5-a7efd285781f" (UID: "f9de2815-2636-4e04-adf5-a7efd285781f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:16:54 crc kubenswrapper[4841]: I0313 09:16:54.777972 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9de2815-2636-4e04-adf5-a7efd285781f-kube-api-access-crwfv" (OuterVolumeSpecName: "kube-api-access-crwfv") pod "f9de2815-2636-4e04-adf5-a7efd285781f" (UID: "f9de2815-2636-4e04-adf5-a7efd285781f"). InnerVolumeSpecName "kube-api-access-crwfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:16:54 crc kubenswrapper[4841]: I0313 09:16:54.795882 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9de2815-2636-4e04-adf5-a7efd285781f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9de2815-2636-4e04-adf5-a7efd285781f" (UID: "f9de2815-2636-4e04-adf5-a7efd285781f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:16:54 crc kubenswrapper[4841]: I0313 09:16:54.870916 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9de2815-2636-4e04-adf5-a7efd285781f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:54 crc kubenswrapper[4841]: I0313 09:16:54.871147 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9de2815-2636-4e04-adf5-a7efd285781f-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:54 crc kubenswrapper[4841]: I0313 09:16:54.871289 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crwfv\" (UniqueName: \"kubernetes.io/projected/f9de2815-2636-4e04-adf5-a7efd285781f-kube-api-access-crwfv\") on node \"crc\" DevicePath \"\"" Mar 13 09:16:55 crc kubenswrapper[4841]: I0313 09:16:55.073374 4841 generic.go:334] "Generic (PLEG): container finished" podID="f9de2815-2636-4e04-adf5-a7efd285781f" containerID="17a997825073552afb564eba1681dcbeed5e33a7eeb805e2e128ff05163bf97a" exitCode=0 Mar 13 09:16:55 crc kubenswrapper[4841]: I0313 09:16:55.073424 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzfct" event={"ID":"f9de2815-2636-4e04-adf5-a7efd285781f","Type":"ContainerDied","Data":"17a997825073552afb564eba1681dcbeed5e33a7eeb805e2e128ff05163bf97a"} Mar 13 09:16:55 crc kubenswrapper[4841]: I0313 09:16:55.073437 4841 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzfct" Mar 13 09:16:55 crc kubenswrapper[4841]: I0313 09:16:55.073460 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzfct" event={"ID":"f9de2815-2636-4e04-adf5-a7efd285781f","Type":"ContainerDied","Data":"76ae43ebc1bcf252b67d0bc0a5d13e443c54af2a77f7838bca5d4ee9c04dd8e5"} Mar 13 09:16:55 crc kubenswrapper[4841]: I0313 09:16:55.073480 4841 scope.go:117] "RemoveContainer" containerID="17a997825073552afb564eba1681dcbeed5e33a7eeb805e2e128ff05163bf97a" Mar 13 09:16:55 crc kubenswrapper[4841]: I0313 09:16:55.087370 4841 scope.go:117] "RemoveContainer" containerID="6d99c66c31c7dc7895fa48b732ef6f73a99aae4ebfd586a43e402c50cd5aca81" Mar 13 09:16:55 crc kubenswrapper[4841]: I0313 09:16:55.106761 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzfct"] Mar 13 09:16:55 crc kubenswrapper[4841]: I0313 09:16:55.109549 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzfct"] Mar 13 09:16:55 crc kubenswrapper[4841]: I0313 09:16:55.111257 4841 scope.go:117] "RemoveContainer" containerID="5e3fea3f90a51f679f6cf953c1f6d8dd84315c7647500d72658c40876a0d7a63" Mar 13 09:16:55 crc kubenswrapper[4841]: I0313 09:16:55.135518 4841 scope.go:117] "RemoveContainer" containerID="17a997825073552afb564eba1681dcbeed5e33a7eeb805e2e128ff05163bf97a" Mar 13 09:16:55 crc kubenswrapper[4841]: E0313 09:16:55.136092 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17a997825073552afb564eba1681dcbeed5e33a7eeb805e2e128ff05163bf97a\": container with ID starting with 17a997825073552afb564eba1681dcbeed5e33a7eeb805e2e128ff05163bf97a not found: ID does not exist" containerID="17a997825073552afb564eba1681dcbeed5e33a7eeb805e2e128ff05163bf97a" Mar 13 09:16:55 crc kubenswrapper[4841]: I0313 09:16:55.136134 4841 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a997825073552afb564eba1681dcbeed5e33a7eeb805e2e128ff05163bf97a"} err="failed to get container status \"17a997825073552afb564eba1681dcbeed5e33a7eeb805e2e128ff05163bf97a\": rpc error: code = NotFound desc = could not find container \"17a997825073552afb564eba1681dcbeed5e33a7eeb805e2e128ff05163bf97a\": container with ID starting with 17a997825073552afb564eba1681dcbeed5e33a7eeb805e2e128ff05163bf97a not found: ID does not exist" Mar 13 09:16:55 crc kubenswrapper[4841]: I0313 09:16:55.136165 4841 scope.go:117] "RemoveContainer" containerID="6d99c66c31c7dc7895fa48b732ef6f73a99aae4ebfd586a43e402c50cd5aca81" Mar 13 09:16:55 crc kubenswrapper[4841]: E0313 09:16:55.136717 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d99c66c31c7dc7895fa48b732ef6f73a99aae4ebfd586a43e402c50cd5aca81\": container with ID starting with 6d99c66c31c7dc7895fa48b732ef6f73a99aae4ebfd586a43e402c50cd5aca81 not found: ID does not exist" containerID="6d99c66c31c7dc7895fa48b732ef6f73a99aae4ebfd586a43e402c50cd5aca81" Mar 13 09:16:55 crc kubenswrapper[4841]: I0313 09:16:55.136750 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d99c66c31c7dc7895fa48b732ef6f73a99aae4ebfd586a43e402c50cd5aca81"} err="failed to get container status \"6d99c66c31c7dc7895fa48b732ef6f73a99aae4ebfd586a43e402c50cd5aca81\": rpc error: code = NotFound desc = could not find container \"6d99c66c31c7dc7895fa48b732ef6f73a99aae4ebfd586a43e402c50cd5aca81\": container with ID starting with 6d99c66c31c7dc7895fa48b732ef6f73a99aae4ebfd586a43e402c50cd5aca81 not found: ID does not exist" Mar 13 09:16:55 crc kubenswrapper[4841]: I0313 09:16:55.136772 4841 scope.go:117] "RemoveContainer" containerID="5e3fea3f90a51f679f6cf953c1f6d8dd84315c7647500d72658c40876a0d7a63" Mar 13 09:16:55 crc kubenswrapper[4841]: E0313 
09:16:55.137362 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e3fea3f90a51f679f6cf953c1f6d8dd84315c7647500d72658c40876a0d7a63\": container with ID starting with 5e3fea3f90a51f679f6cf953c1f6d8dd84315c7647500d72658c40876a0d7a63 not found: ID does not exist" containerID="5e3fea3f90a51f679f6cf953c1f6d8dd84315c7647500d72658c40876a0d7a63" Mar 13 09:16:55 crc kubenswrapper[4841]: I0313 09:16:55.137390 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3fea3f90a51f679f6cf953c1f6d8dd84315c7647500d72658c40876a0d7a63"} err="failed to get container status \"5e3fea3f90a51f679f6cf953c1f6d8dd84315c7647500d72658c40876a0d7a63\": rpc error: code = NotFound desc = could not find container \"5e3fea3f90a51f679f6cf953c1f6d8dd84315c7647500d72658c40876a0d7a63\": container with ID starting with 5e3fea3f90a51f679f6cf953c1f6d8dd84315c7647500d72658c40876a0d7a63 not found: ID does not exist" Mar 13 09:16:56 crc kubenswrapper[4841]: I0313 09:16:56.002615 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9de2815-2636-4e04-adf5-a7efd285781f" path="/var/lib/kubelet/pods/f9de2815-2636-4e04-adf5-a7efd285781f/volumes" Mar 13 09:17:01 crc kubenswrapper[4841]: I0313 09:17:01.796914 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w95m9" Mar 13 09:17:01 crc kubenswrapper[4841]: I0313 09:17:01.873527 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w95m9" Mar 13 09:17:02 crc kubenswrapper[4841]: I0313 09:17:02.230875 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h8khp" Mar 13 09:17:02 crc kubenswrapper[4841]: I0313 09:17:02.284230 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-h8khp" Mar 13 09:17:03 crc kubenswrapper[4841]: I0313 09:17:03.862411 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h8khp"] Mar 13 09:17:04 crc kubenswrapper[4841]: I0313 09:17:04.128994 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h8khp" podUID="13abb343-a494-425b-9b86-ea0d41df98a5" containerName="registry-server" containerID="cri-o://0e8739f1eb6e130b2421feaff486880af17b8096c6c929d6abbe264b98a8e555" gracePeriod=2 Mar 13 09:17:04 crc kubenswrapper[4841]: I0313 09:17:04.407917 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:17:04 crc kubenswrapper[4841]: I0313 09:17:04.407993 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:17:04 crc kubenswrapper[4841]: I0313 09:17:04.408074 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:17:04 crc kubenswrapper[4841]: I0313 09:17:04.409039 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9a1f9151f552176e4557fbe7ae4dd3ddf3334120c1c0e6086773ec840d42b653"} pod="openshift-machine-config-operator/machine-config-daemon-h227v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 09:17:04 crc 
kubenswrapper[4841]: I0313 09:17:04.409159 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" containerID="cri-o://9a1f9151f552176e4557fbe7ae4dd3ddf3334120c1c0e6086773ec840d42b653" gracePeriod=600 Mar 13 09:17:04 crc kubenswrapper[4841]: I0313 09:17:04.630146 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h8khp" Mar 13 09:17:04 crc kubenswrapper[4841]: I0313 09:17:04.712389 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86bc57d6db-gk5nh"] Mar 13 09:17:04 crc kubenswrapper[4841]: I0313 09:17:04.713354 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" podUID="08c65a93-7abe-48b3-8d0a-52c70de3d541" containerName="controller-manager" containerID="cri-o://e3b9f6f098795c4eb3ac3be428ac3a376a55567caf8905e2a0e5f6dccbefe871" gracePeriod=30 Mar 13 09:17:04 crc kubenswrapper[4841]: I0313 09:17:04.741037 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf"] Mar 13 09:17:04 crc kubenswrapper[4841]: I0313 09:17:04.741321 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" podUID="ebfafea7-e749-4675-ac00-47a35d860c43" containerName="route-controller-manager" containerID="cri-o://ea420a754cbc2d37c97e16f2ee3fb3f50328e4ed715687c6b04887fa547679c0" gracePeriod=30 Mar 13 09:17:04 crc kubenswrapper[4841]: I0313 09:17:04.772890 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rf9b\" (UniqueName: 
\"kubernetes.io/projected/13abb343-a494-425b-9b86-ea0d41df98a5-kube-api-access-2rf9b\") pod \"13abb343-a494-425b-9b86-ea0d41df98a5\" (UID: \"13abb343-a494-425b-9b86-ea0d41df98a5\") " Mar 13 09:17:04 crc kubenswrapper[4841]: I0313 09:17:04.772982 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13abb343-a494-425b-9b86-ea0d41df98a5-catalog-content\") pod \"13abb343-a494-425b-9b86-ea0d41df98a5\" (UID: \"13abb343-a494-425b-9b86-ea0d41df98a5\") " Mar 13 09:17:04 crc kubenswrapper[4841]: I0313 09:17:04.773024 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13abb343-a494-425b-9b86-ea0d41df98a5-utilities\") pod \"13abb343-a494-425b-9b86-ea0d41df98a5\" (UID: \"13abb343-a494-425b-9b86-ea0d41df98a5\") " Mar 13 09:17:04 crc kubenswrapper[4841]: I0313 09:17:04.773908 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13abb343-a494-425b-9b86-ea0d41df98a5-utilities" (OuterVolumeSpecName: "utilities") pod "13abb343-a494-425b-9b86-ea0d41df98a5" (UID: "13abb343-a494-425b-9b86-ea0d41df98a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:17:04 crc kubenswrapper[4841]: I0313 09:17:04.778791 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13abb343-a494-425b-9b86-ea0d41df98a5-kube-api-access-2rf9b" (OuterVolumeSpecName: "kube-api-access-2rf9b") pod "13abb343-a494-425b-9b86-ea0d41df98a5" (UID: "13abb343-a494-425b-9b86-ea0d41df98a5"). InnerVolumeSpecName "kube-api-access-2rf9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:17:04 crc kubenswrapper[4841]: I0313 09:17:04.873932 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rf9b\" (UniqueName: \"kubernetes.io/projected/13abb343-a494-425b-9b86-ea0d41df98a5-kube-api-access-2rf9b\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:04 crc kubenswrapper[4841]: I0313 09:17:04.873966 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13abb343-a494-425b-9b86-ea0d41df98a5-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:04 crc kubenswrapper[4841]: I0313 09:17:04.901236 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13abb343-a494-425b-9b86-ea0d41df98a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13abb343-a494-425b-9b86-ea0d41df98a5" (UID: "13abb343-a494-425b-9b86-ea0d41df98a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:17:04 crc kubenswrapper[4841]: I0313 09:17:04.976009 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13abb343-a494-425b-9b86-ea0d41df98a5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.134978 4841 generic.go:334] "Generic (PLEG): container finished" podID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerID="9a1f9151f552176e4557fbe7ae4dd3ddf3334120c1c0e6086773ec840d42b653" exitCode=0 Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.135040 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerDied","Data":"9a1f9151f552176e4557fbe7ae4dd3ddf3334120c1c0e6086773ec840d42b653"} Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.135066 4841 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerStarted","Data":"d65f6d8d4c1b4e29b21c72e57a4e091f2bf10ce1c458bb3c65c2a8ccaf3e6167"} Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.136682 4841 generic.go:334] "Generic (PLEG): container finished" podID="08c65a93-7abe-48b3-8d0a-52c70de3d541" containerID="e3b9f6f098795c4eb3ac3be428ac3a376a55567caf8905e2a0e5f6dccbefe871" exitCode=0 Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.136770 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" event={"ID":"08c65a93-7abe-48b3-8d0a-52c70de3d541","Type":"ContainerDied","Data":"e3b9f6f098795c4eb3ac3be428ac3a376a55567caf8905e2a0e5f6dccbefe871"} Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.138628 4841 generic.go:334] "Generic (PLEG): container finished" podID="ebfafea7-e749-4675-ac00-47a35d860c43" containerID="ea420a754cbc2d37c97e16f2ee3fb3f50328e4ed715687c6b04887fa547679c0" exitCode=0 Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.138698 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" event={"ID":"ebfafea7-e749-4675-ac00-47a35d860c43","Type":"ContainerDied","Data":"ea420a754cbc2d37c97e16f2ee3fb3f50328e4ed715687c6b04887fa547679c0"} Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.140807 4841 generic.go:334] "Generic (PLEG): container finished" podID="13abb343-a494-425b-9b86-ea0d41df98a5" containerID="0e8739f1eb6e130b2421feaff486880af17b8096c6c929d6abbe264b98a8e555" exitCode=0 Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.140841 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8khp" 
event={"ID":"13abb343-a494-425b-9b86-ea0d41df98a5","Type":"ContainerDied","Data":"0e8739f1eb6e130b2421feaff486880af17b8096c6c929d6abbe264b98a8e555"} Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.140873 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h8khp" event={"ID":"13abb343-a494-425b-9b86-ea0d41df98a5","Type":"ContainerDied","Data":"a12f95e032e28ed064f789fdf64b382d0d48c8b49e324b60465ab4ce293c4a64"} Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.140887 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h8khp" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.140894 4841 scope.go:117] "RemoveContainer" containerID="0e8739f1eb6e130b2421feaff486880af17b8096c6c929d6abbe264b98a8e555" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.165195 4841 scope.go:117] "RemoveContainer" containerID="51bf5bf658e90b5af6d27a7826b90bd941236925e15d62555298ecb7c765e006" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.184241 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h8khp"] Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.187240 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h8khp"] Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.210731 4841 scope.go:117] "RemoveContainer" containerID="71692dbaa05668837496c12a68716a87feb2c96fc6d5009cfb7e34b6631db64a" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.221236 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.231094 4841 scope.go:117] "RemoveContainer" containerID="0e8739f1eb6e130b2421feaff486880af17b8096c6c929d6abbe264b98a8e555" Mar 13 09:17:05 crc kubenswrapper[4841]: E0313 09:17:05.233386 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e8739f1eb6e130b2421feaff486880af17b8096c6c929d6abbe264b98a8e555\": container with ID starting with 0e8739f1eb6e130b2421feaff486880af17b8096c6c929d6abbe264b98a8e555 not found: ID does not exist" containerID="0e8739f1eb6e130b2421feaff486880af17b8096c6c929d6abbe264b98a8e555" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.233489 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e8739f1eb6e130b2421feaff486880af17b8096c6c929d6abbe264b98a8e555"} err="failed to get container status \"0e8739f1eb6e130b2421feaff486880af17b8096c6c929d6abbe264b98a8e555\": rpc error: code = NotFound desc = could not find container \"0e8739f1eb6e130b2421feaff486880af17b8096c6c929d6abbe264b98a8e555\": container with ID starting with 0e8739f1eb6e130b2421feaff486880af17b8096c6c929d6abbe264b98a8e555 not found: ID does not exist" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.233985 4841 scope.go:117] "RemoveContainer" containerID="51bf5bf658e90b5af6d27a7826b90bd941236925e15d62555298ecb7c765e006" Mar 13 09:17:05 crc kubenswrapper[4841]: E0313 09:17:05.234891 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51bf5bf658e90b5af6d27a7826b90bd941236925e15d62555298ecb7c765e006\": container with ID starting with 51bf5bf658e90b5af6d27a7826b90bd941236925e15d62555298ecb7c765e006 not found: ID does not exist" containerID="51bf5bf658e90b5af6d27a7826b90bd941236925e15d62555298ecb7c765e006" Mar 13 
09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.234969 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51bf5bf658e90b5af6d27a7826b90bd941236925e15d62555298ecb7c765e006"} err="failed to get container status \"51bf5bf658e90b5af6d27a7826b90bd941236925e15d62555298ecb7c765e006\": rpc error: code = NotFound desc = could not find container \"51bf5bf658e90b5af6d27a7826b90bd941236925e15d62555298ecb7c765e006\": container with ID starting with 51bf5bf658e90b5af6d27a7826b90bd941236925e15d62555298ecb7c765e006 not found: ID does not exist" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.234999 4841 scope.go:117] "RemoveContainer" containerID="71692dbaa05668837496c12a68716a87feb2c96fc6d5009cfb7e34b6631db64a" Mar 13 09:17:05 crc kubenswrapper[4841]: E0313 09:17:05.243713 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71692dbaa05668837496c12a68716a87feb2c96fc6d5009cfb7e34b6631db64a\": container with ID starting with 71692dbaa05668837496c12a68716a87feb2c96fc6d5009cfb7e34b6631db64a not found: ID does not exist" containerID="71692dbaa05668837496c12a68716a87feb2c96fc6d5009cfb7e34b6631db64a" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.243759 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71692dbaa05668837496c12a68716a87feb2c96fc6d5009cfb7e34b6631db64a"} err="failed to get container status \"71692dbaa05668837496c12a68716a87feb2c96fc6d5009cfb7e34b6631db64a\": rpc error: code = NotFound desc = could not find container \"71692dbaa05668837496c12a68716a87feb2c96fc6d5009cfb7e34b6631db64a\": container with ID starting with 71692dbaa05668837496c12a68716a87feb2c96fc6d5009cfb7e34b6631db64a not found: ID does not exist" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.283462 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.386063 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pglxp\" (UniqueName: \"kubernetes.io/projected/08c65a93-7abe-48b3-8d0a-52c70de3d541-kube-api-access-pglxp\") pod \"08c65a93-7abe-48b3-8d0a-52c70de3d541\" (UID: \"08c65a93-7abe-48b3-8d0a-52c70de3d541\") " Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.386107 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08c65a93-7abe-48b3-8d0a-52c70de3d541-serving-cert\") pod \"08c65a93-7abe-48b3-8d0a-52c70de3d541\" (UID: \"08c65a93-7abe-48b3-8d0a-52c70de3d541\") " Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.386139 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5p98\" (UniqueName: \"kubernetes.io/projected/ebfafea7-e749-4675-ac00-47a35d860c43-kube-api-access-m5p98\") pod \"ebfafea7-e749-4675-ac00-47a35d860c43\" (UID: \"ebfafea7-e749-4675-ac00-47a35d860c43\") " Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.386159 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08c65a93-7abe-48b3-8d0a-52c70de3d541-config\") pod \"08c65a93-7abe-48b3-8d0a-52c70de3d541\" (UID: \"08c65a93-7abe-48b3-8d0a-52c70de3d541\") " Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.386198 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebfafea7-e749-4675-ac00-47a35d860c43-serving-cert\") pod \"ebfafea7-e749-4675-ac00-47a35d860c43\" (UID: \"ebfafea7-e749-4675-ac00-47a35d860c43\") " Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.386219 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08c65a93-7abe-48b3-8d0a-52c70de3d541-proxy-ca-bundles\") pod \"08c65a93-7abe-48b3-8d0a-52c70de3d541\" (UID: \"08c65a93-7abe-48b3-8d0a-52c70de3d541\") " Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.386250 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebfafea7-e749-4675-ac00-47a35d860c43-config\") pod \"ebfafea7-e749-4675-ac00-47a35d860c43\" (UID: \"ebfafea7-e749-4675-ac00-47a35d860c43\") " Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.386364 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebfafea7-e749-4675-ac00-47a35d860c43-client-ca\") pod \"ebfafea7-e749-4675-ac00-47a35d860c43\" (UID: \"ebfafea7-e749-4675-ac00-47a35d860c43\") " Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.386389 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08c65a93-7abe-48b3-8d0a-52c70de3d541-client-ca\") pod \"08c65a93-7abe-48b3-8d0a-52c70de3d541\" (UID: \"08c65a93-7abe-48b3-8d0a-52c70de3d541\") " Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.387033 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08c65a93-7abe-48b3-8d0a-52c70de3d541-client-ca" (OuterVolumeSpecName: "client-ca") pod "08c65a93-7abe-48b3-8d0a-52c70de3d541" (UID: "08c65a93-7abe-48b3-8d0a-52c70de3d541"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.387355 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08c65a93-7abe-48b3-8d0a-52c70de3d541-config" (OuterVolumeSpecName: "config") pod "08c65a93-7abe-48b3-8d0a-52c70de3d541" (UID: "08c65a93-7abe-48b3-8d0a-52c70de3d541"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.387369 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebfafea7-e749-4675-ac00-47a35d860c43-config" (OuterVolumeSpecName: "config") pod "ebfafea7-e749-4675-ac00-47a35d860c43" (UID: "ebfafea7-e749-4675-ac00-47a35d860c43"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.387552 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebfafea7-e749-4675-ac00-47a35d860c43-client-ca" (OuterVolumeSpecName: "client-ca") pod "ebfafea7-e749-4675-ac00-47a35d860c43" (UID: "ebfafea7-e749-4675-ac00-47a35d860c43"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.387589 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08c65a93-7abe-48b3-8d0a-52c70de3d541-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "08c65a93-7abe-48b3-8d0a-52c70de3d541" (UID: "08c65a93-7abe-48b3-8d0a-52c70de3d541"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.393449 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c65a93-7abe-48b3-8d0a-52c70de3d541-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "08c65a93-7abe-48b3-8d0a-52c70de3d541" (UID: "08c65a93-7abe-48b3-8d0a-52c70de3d541"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.393470 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebfafea7-e749-4675-ac00-47a35d860c43-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ebfafea7-e749-4675-ac00-47a35d860c43" (UID: "ebfafea7-e749-4675-ac00-47a35d860c43"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.393494 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c65a93-7abe-48b3-8d0a-52c70de3d541-kube-api-access-pglxp" (OuterVolumeSpecName: "kube-api-access-pglxp") pod "08c65a93-7abe-48b3-8d0a-52c70de3d541" (UID: "08c65a93-7abe-48b3-8d0a-52c70de3d541"). InnerVolumeSpecName "kube-api-access-pglxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.393479 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebfafea7-e749-4675-ac00-47a35d860c43-kube-api-access-m5p98" (OuterVolumeSpecName: "kube-api-access-m5p98") pod "ebfafea7-e749-4675-ac00-47a35d860c43" (UID: "ebfafea7-e749-4675-ac00-47a35d860c43"). InnerVolumeSpecName "kube-api-access-m5p98". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.488008 4841 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebfafea7-e749-4675-ac00-47a35d860c43-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.488043 4841 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08c65a93-7abe-48b3-8d0a-52c70de3d541-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.488085 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pglxp\" (UniqueName: \"kubernetes.io/projected/08c65a93-7abe-48b3-8d0a-52c70de3d541-kube-api-access-pglxp\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.488099 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08c65a93-7abe-48b3-8d0a-52c70de3d541-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.488110 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5p98\" (UniqueName: \"kubernetes.io/projected/ebfafea7-e749-4675-ac00-47a35d860c43-kube-api-access-m5p98\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.488124 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08c65a93-7abe-48b3-8d0a-52c70de3d541-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.488134 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebfafea7-e749-4675-ac00-47a35d860c43-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.488146 4841 
reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08c65a93-7abe-48b3-8d0a-52c70de3d541-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:05 crc kubenswrapper[4841]: I0313 09:17:05.488157 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebfafea7-e749-4675-ac00-47a35d860c43-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.008654 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13abb343-a494-425b-9b86-ea0d41df98a5" path="/var/lib/kubelet/pods/13abb343-a494-425b-9b86-ea0d41df98a5/volumes" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.152721 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" event={"ID":"08c65a93-7abe-48b3-8d0a-52c70de3d541","Type":"ContainerDied","Data":"dc23812ef86c68c2b16f4d944af374d4b43b5e4e5ba96e0c99fdd12edb224639"} Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.152752 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86bc57d6db-gk5nh" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.152828 4841 scope.go:117] "RemoveContainer" containerID="e3b9f6f098795c4eb3ac3be428ac3a376a55567caf8905e2a0e5f6dccbefe871" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.155816 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" event={"ID":"ebfafea7-e749-4675-ac00-47a35d860c43","Type":"ContainerDied","Data":"6c72023cee4738cf6c5f2807da0b9e616845ec664f2b607481c2e3fa290594dd"} Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.155894 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.188032 4841 scope.go:117] "RemoveContainer" containerID="ea420a754cbc2d37c97e16f2ee3fb3f50328e4ed715687c6b04887fa547679c0" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.192438 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf"] Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.207416 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c9657cf58-rnhgf"] Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.215006 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86bc57d6db-gk5nh"] Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.222171 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-86bc57d6db-gk5nh"] Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.438308 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8"] Mar 13 09:17:06 crc kubenswrapper[4841]: E0313 09:17:06.438766 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f7766c0-82b7-4ee0-878d-47c5b0217e4d" containerName="extract-utilities" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.438789 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f7766c0-82b7-4ee0-878d-47c5b0217e4d" containerName="extract-utilities" Mar 13 09:17:06 crc kubenswrapper[4841]: E0313 09:17:06.438810 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="517b279c-79ee-438c-83c2-ae5cd25848fc" containerName="registry-server" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.438819 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="517b279c-79ee-438c-83c2-ae5cd25848fc" containerName="registry-server" Mar 13 09:17:06 crc kubenswrapper[4841]: E0313 09:17:06.438828 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9de2815-2636-4e04-adf5-a7efd285781f" containerName="registry-server" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.438839 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9de2815-2636-4e04-adf5-a7efd285781f" containerName="registry-server" Mar 13 09:17:06 crc kubenswrapper[4841]: E0313 09:17:06.438856 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9de2815-2636-4e04-adf5-a7efd285781f" containerName="extract-content" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.438864 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9de2815-2636-4e04-adf5-a7efd285781f" containerName="extract-content" Mar 13 09:17:06 crc kubenswrapper[4841]: E0313 09:17:06.438888 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f7766c0-82b7-4ee0-878d-47c5b0217e4d" containerName="extract-content" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.438896 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f7766c0-82b7-4ee0-878d-47c5b0217e4d" containerName="extract-content" Mar 13 09:17:06 crc kubenswrapper[4841]: E0313 09:17:06.438910 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f7766c0-82b7-4ee0-878d-47c5b0217e4d" containerName="registry-server" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.438918 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f7766c0-82b7-4ee0-878d-47c5b0217e4d" containerName="registry-server" Mar 13 09:17:06 crc kubenswrapper[4841]: E0313 09:17:06.438935 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13abb343-a494-425b-9b86-ea0d41df98a5" containerName="extract-content" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.438944 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="13abb343-a494-425b-9b86-ea0d41df98a5" containerName="extract-content" Mar 13 09:17:06 crc kubenswrapper[4841]: E0313 09:17:06.438964 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="517b279c-79ee-438c-83c2-ae5cd25848fc" containerName="extract-utilities" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.438972 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="517b279c-79ee-438c-83c2-ae5cd25848fc" containerName="extract-utilities" Mar 13 09:17:06 crc kubenswrapper[4841]: E0313 09:17:06.438991 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9de2815-2636-4e04-adf5-a7efd285781f" containerName="extract-utilities" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.439000 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9de2815-2636-4e04-adf5-a7efd285781f" containerName="extract-utilities" Mar 13 09:17:06 crc kubenswrapper[4841]: E0313 09:17:06.439016 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebfafea7-e749-4675-ac00-47a35d860c43" containerName="route-controller-manager" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.439025 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebfafea7-e749-4675-ac00-47a35d860c43" containerName="route-controller-manager" Mar 13 09:17:06 crc kubenswrapper[4841]: E0313 09:17:06.439035 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c65a93-7abe-48b3-8d0a-52c70de3d541" containerName="controller-manager" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.439043 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c65a93-7abe-48b3-8d0a-52c70de3d541" containerName="controller-manager" Mar 13 09:17:06 crc kubenswrapper[4841]: E0313 09:17:06.439061 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13abb343-a494-425b-9b86-ea0d41df98a5" containerName="extract-utilities" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.439069 4841 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="13abb343-a494-425b-9b86-ea0d41df98a5" containerName="extract-utilities" Mar 13 09:17:06 crc kubenswrapper[4841]: E0313 09:17:06.439087 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="517b279c-79ee-438c-83c2-ae5cd25848fc" containerName="extract-content" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.439095 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="517b279c-79ee-438c-83c2-ae5cd25848fc" containerName="extract-content" Mar 13 09:17:06 crc kubenswrapper[4841]: E0313 09:17:06.439113 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13abb343-a494-425b-9b86-ea0d41df98a5" containerName="registry-server" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.439121 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="13abb343-a494-425b-9b86-ea0d41df98a5" containerName="registry-server" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.439393 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="13abb343-a494-425b-9b86-ea0d41df98a5" containerName="registry-server" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.439419 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9de2815-2636-4e04-adf5-a7efd285781f" containerName="registry-server" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.439442 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f7766c0-82b7-4ee0-878d-47c5b0217e4d" containerName="registry-server" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.439460 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c65a93-7abe-48b3-8d0a-52c70de3d541" containerName="controller-manager" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.439470 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebfafea7-e749-4675-ac00-47a35d860c43" containerName="route-controller-manager" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.439487 4841 
memory_manager.go:354] "RemoveStaleState removing state" podUID="517b279c-79ee-438c-83c2-ae5cd25848fc" containerName="registry-server" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.440123 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.440198 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-9997d5676-dlc4b"] Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.441186 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.443195 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.448178 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.450512 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.450672 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.450524 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.452253 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.452574 4841 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.452749 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.454845 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.455220 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.457828 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.458177 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.468950 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9997d5676-dlc4b"] Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.469306 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.481184 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8"] Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.601750 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57b9b66f-d8be-40f7-9bf2-70b7748241ac-serving-cert\") pod \"controller-manager-9997d5676-dlc4b\" (UID: \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\") " 
pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.601821 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnbh7\" (UniqueName: \"kubernetes.io/projected/bdfacc10-a90c-4d3d-a817-2881ec892360-kube-api-access-qnbh7\") pod \"route-controller-manager-599c5895cb-6rtz8\" (UID: \"bdfacc10-a90c-4d3d-a817-2881ec892360\") " pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.601864 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bdfacc10-a90c-4d3d-a817-2881ec892360-client-ca\") pod \"route-controller-manager-599c5895cb-6rtz8\" (UID: \"bdfacc10-a90c-4d3d-a817-2881ec892360\") " pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.601908 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdfacc10-a90c-4d3d-a817-2881ec892360-serving-cert\") pod \"route-controller-manager-599c5895cb-6rtz8\" (UID: \"bdfacc10-a90c-4d3d-a817-2881ec892360\") " pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.601988 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdspd\" (UniqueName: \"kubernetes.io/projected/57b9b66f-d8be-40f7-9bf2-70b7748241ac-kube-api-access-tdspd\") pod \"controller-manager-9997d5676-dlc4b\" (UID: \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\") " pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.602134 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b9b66f-d8be-40f7-9bf2-70b7748241ac-config\") pod \"controller-manager-9997d5676-dlc4b\" (UID: \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\") " pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.602230 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdfacc10-a90c-4d3d-a817-2881ec892360-config\") pod \"route-controller-manager-599c5895cb-6rtz8\" (UID: \"bdfacc10-a90c-4d3d-a817-2881ec892360\") " pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.602286 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57b9b66f-d8be-40f7-9bf2-70b7748241ac-client-ca\") pod \"controller-manager-9997d5676-dlc4b\" (UID: \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\") " pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.602349 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/57b9b66f-d8be-40f7-9bf2-70b7748241ac-proxy-ca-bundles\") pod \"controller-manager-9997d5676-dlc4b\" (UID: \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\") " pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.703443 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b9b66f-d8be-40f7-9bf2-70b7748241ac-config\") pod \"controller-manager-9997d5676-dlc4b\" (UID: \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\") " 
pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.703510 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdfacc10-a90c-4d3d-a817-2881ec892360-config\") pod \"route-controller-manager-599c5895cb-6rtz8\" (UID: \"bdfacc10-a90c-4d3d-a817-2881ec892360\") " pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.703557 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57b9b66f-d8be-40f7-9bf2-70b7748241ac-client-ca\") pod \"controller-manager-9997d5676-dlc4b\" (UID: \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\") " pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.703589 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/57b9b66f-d8be-40f7-9bf2-70b7748241ac-proxy-ca-bundles\") pod \"controller-manager-9997d5676-dlc4b\" (UID: \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\") " pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.703642 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57b9b66f-d8be-40f7-9bf2-70b7748241ac-serving-cert\") pod \"controller-manager-9997d5676-dlc4b\" (UID: \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\") " pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.703679 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnbh7\" (UniqueName: 
\"kubernetes.io/projected/bdfacc10-a90c-4d3d-a817-2881ec892360-kube-api-access-qnbh7\") pod \"route-controller-manager-599c5895cb-6rtz8\" (UID: \"bdfacc10-a90c-4d3d-a817-2881ec892360\") " pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.703714 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bdfacc10-a90c-4d3d-a817-2881ec892360-client-ca\") pod \"route-controller-manager-599c5895cb-6rtz8\" (UID: \"bdfacc10-a90c-4d3d-a817-2881ec892360\") " pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.703759 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdfacc10-a90c-4d3d-a817-2881ec892360-serving-cert\") pod \"route-controller-manager-599c5895cb-6rtz8\" (UID: \"bdfacc10-a90c-4d3d-a817-2881ec892360\") " pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.703795 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdspd\" (UniqueName: \"kubernetes.io/projected/57b9b66f-d8be-40f7-9bf2-70b7748241ac-kube-api-access-tdspd\") pod \"controller-manager-9997d5676-dlc4b\" (UID: \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\") " pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.704670 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57b9b66f-d8be-40f7-9bf2-70b7748241ac-client-ca\") pod \"controller-manager-9997d5676-dlc4b\" (UID: \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\") " pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" Mar 13 09:17:06 crc 
kubenswrapper[4841]: I0313 09:17:06.704933 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/57b9b66f-d8be-40f7-9bf2-70b7748241ac-proxy-ca-bundles\") pod \"controller-manager-9997d5676-dlc4b\" (UID: \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\") " pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.705251 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bdfacc10-a90c-4d3d-a817-2881ec892360-client-ca\") pod \"route-controller-manager-599c5895cb-6rtz8\" (UID: \"bdfacc10-a90c-4d3d-a817-2881ec892360\") " pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.705594 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b9b66f-d8be-40f7-9bf2-70b7748241ac-config\") pod \"controller-manager-9997d5676-dlc4b\" (UID: \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\") " pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.706774 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdfacc10-a90c-4d3d-a817-2881ec892360-config\") pod \"route-controller-manager-599c5895cb-6rtz8\" (UID: \"bdfacc10-a90c-4d3d-a817-2881ec892360\") " pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.710582 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57b9b66f-d8be-40f7-9bf2-70b7748241ac-serving-cert\") pod \"controller-manager-9997d5676-dlc4b\" (UID: \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\") " 
pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.711368 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdfacc10-a90c-4d3d-a817-2881ec892360-serving-cert\") pod \"route-controller-manager-599c5895cb-6rtz8\" (UID: \"bdfacc10-a90c-4d3d-a817-2881ec892360\") " pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.734815 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnbh7\" (UniqueName: \"kubernetes.io/projected/bdfacc10-a90c-4d3d-a817-2881ec892360-kube-api-access-qnbh7\") pod \"route-controller-manager-599c5895cb-6rtz8\" (UID: \"bdfacc10-a90c-4d3d-a817-2881ec892360\") " pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.737643 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdspd\" (UniqueName: \"kubernetes.io/projected/57b9b66f-d8be-40f7-9bf2-70b7748241ac-kube-api-access-tdspd\") pod \"controller-manager-9997d5676-dlc4b\" (UID: \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\") " pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.778840 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.790777 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" Mar 13 09:17:06 crc kubenswrapper[4841]: I0313 09:17:06.985800 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8"] Mar 13 09:17:07 crc kubenswrapper[4841]: I0313 09:17:07.163542 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" event={"ID":"bdfacc10-a90c-4d3d-a817-2881ec892360","Type":"ContainerStarted","Data":"787f90f11ef99c4afefb84727a05a0d55bd26b38e72e7704f6348b5e5836644d"} Mar 13 09:17:07 crc kubenswrapper[4841]: I0313 09:17:07.163583 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" event={"ID":"bdfacc10-a90c-4d3d-a817-2881ec892360","Type":"ContainerStarted","Data":"ab145cb46c4462d9207ef0174722f4070bda67a480aa6efb2b5967c98a5c9fb7"} Mar 13 09:17:07 crc kubenswrapper[4841]: I0313 09:17:07.164383 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" Mar 13 09:17:07 crc kubenswrapper[4841]: I0313 09:17:07.185190 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" podStartSLOduration=3.185175657 podStartE2EDuration="3.185175657s" podCreationTimestamp="2026-03-13 09:17:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:17:07.184121635 +0000 UTC m=+309.914021836" watchObservedRunningTime="2026-03-13 09:17:07.185175657 +0000 UTC m=+309.915075848" Mar 13 09:17:07 crc kubenswrapper[4841]: I0313 09:17:07.263323 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-9997d5676-dlc4b"] Mar 13 09:17:07 crc kubenswrapper[4841]: W0313 09:17:07.268167 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57b9b66f_d8be_40f7_9bf2_70b7748241ac.slice/crio-2084d371ff3acbcca07cd9e56d5a88d05d3e87466f9a45057b94d63208fedf5f WatchSource:0}: Error finding container 2084d371ff3acbcca07cd9e56d5a88d05d3e87466f9a45057b94d63208fedf5f: Status 404 returned error can't find the container with id 2084d371ff3acbcca07cd9e56d5a88d05d3e87466f9a45057b94d63208fedf5f Mar 13 09:17:07 crc kubenswrapper[4841]: I0313 09:17:07.501380 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" Mar 13 09:17:08 crc kubenswrapper[4841]: I0313 09:17:08.005596 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08c65a93-7abe-48b3-8d0a-52c70de3d541" path="/var/lib/kubelet/pods/08c65a93-7abe-48b3-8d0a-52c70de3d541/volumes" Mar 13 09:17:08 crc kubenswrapper[4841]: I0313 09:17:08.006479 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebfafea7-e749-4675-ac00-47a35d860c43" path="/var/lib/kubelet/pods/ebfafea7-e749-4675-ac00-47a35d860c43/volumes" Mar 13 09:17:08 crc kubenswrapper[4841]: I0313 09:17:08.170803 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" event={"ID":"57b9b66f-d8be-40f7-9bf2-70b7748241ac","Type":"ContainerStarted","Data":"2115c835145a35f0cb379270a35a3ee84e44aa6aea00ee7bd8b867721adc54a7"} Mar 13 09:17:08 crc kubenswrapper[4841]: I0313 09:17:08.170840 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" 
event={"ID":"57b9b66f-d8be-40f7-9bf2-70b7748241ac","Type":"ContainerStarted","Data":"2084d371ff3acbcca07cd9e56d5a88d05d3e87466f9a45057b94d63208fedf5f"} Mar 13 09:17:08 crc kubenswrapper[4841]: I0313 09:17:08.171115 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" Mar 13 09:17:08 crc kubenswrapper[4841]: I0313 09:17:08.175859 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" Mar 13 09:17:08 crc kubenswrapper[4841]: I0313 09:17:08.190296 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" podStartSLOduration=4.190278091 podStartE2EDuration="4.190278091s" podCreationTimestamp="2026-03-13 09:17:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:17:08.189681112 +0000 UTC m=+310.919581323" watchObservedRunningTime="2026-03-13 09:17:08.190278091 +0000 UTC m=+310.920178282" Mar 13 09:17:10 crc kubenswrapper[4841]: I0313 09:17:10.573001 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zsbmr"] Mar 13 09:17:12 crc kubenswrapper[4841]: I0313 09:17:12.029369 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 09:17:24 crc kubenswrapper[4841]: I0313 09:17:24.733460 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9997d5676-dlc4b"] Mar 13 09:17:24 crc kubenswrapper[4841]: I0313 09:17:24.734386 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" podUID="57b9b66f-d8be-40f7-9bf2-70b7748241ac" 
containerName="controller-manager" containerID="cri-o://2115c835145a35f0cb379270a35a3ee84e44aa6aea00ee7bd8b867721adc54a7" gracePeriod=30 Mar 13 09:17:24 crc kubenswrapper[4841]: I0313 09:17:24.826257 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8"] Mar 13 09:17:24 crc kubenswrapper[4841]: I0313 09:17:24.826587 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" podUID="bdfacc10-a90c-4d3d-a817-2881ec892360" containerName="route-controller-manager" containerID="cri-o://787f90f11ef99c4afefb84727a05a0d55bd26b38e72e7704f6348b5e5836644d" gracePeriod=30 Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.277662 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.279775 4841 generic.go:334] "Generic (PLEG): container finished" podID="57b9b66f-d8be-40f7-9bf2-70b7748241ac" containerID="2115c835145a35f0cb379270a35a3ee84e44aa6aea00ee7bd8b867721adc54a7" exitCode=0 Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.279835 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" event={"ID":"57b9b66f-d8be-40f7-9bf2-70b7748241ac","Type":"ContainerDied","Data":"2115c835145a35f0cb379270a35a3ee84e44aa6aea00ee7bd8b867721adc54a7"} Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.279859 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" event={"ID":"57b9b66f-d8be-40f7-9bf2-70b7748241ac","Type":"ContainerDied","Data":"2084d371ff3acbcca07cd9e56d5a88d05d3e87466f9a45057b94d63208fedf5f"} Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.279869 4841 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2084d371ff3acbcca07cd9e56d5a88d05d3e87466f9a45057b94d63208fedf5f" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.281120 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.281852 4841 generic.go:334] "Generic (PLEG): container finished" podID="bdfacc10-a90c-4d3d-a817-2881ec892360" containerID="787f90f11ef99c4afefb84727a05a0d55bd26b38e72e7704f6348b5e5836644d" exitCode=0 Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.281890 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" event={"ID":"bdfacc10-a90c-4d3d-a817-2881ec892360","Type":"ContainerDied","Data":"787f90f11ef99c4afefb84727a05a0d55bd26b38e72e7704f6348b5e5836644d"} Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.281926 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" event={"ID":"bdfacc10-a90c-4d3d-a817-2881ec892360","Type":"ContainerDied","Data":"ab145cb46c4462d9207ef0174722f4070bda67a480aa6efb2b5967c98a5c9fb7"} Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.281932 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.281942 4841 scope.go:117] "RemoveContainer" containerID="787f90f11ef99c4afefb84727a05a0d55bd26b38e72e7704f6348b5e5836644d" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.305902 4841 scope.go:117] "RemoveContainer" containerID="787f90f11ef99c4afefb84727a05a0d55bd26b38e72e7704f6348b5e5836644d" Mar 13 09:17:25 crc kubenswrapper[4841]: E0313 09:17:25.309470 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"787f90f11ef99c4afefb84727a05a0d55bd26b38e72e7704f6348b5e5836644d\": container with ID starting with 787f90f11ef99c4afefb84727a05a0d55bd26b38e72e7704f6348b5e5836644d not found: ID does not exist" containerID="787f90f11ef99c4afefb84727a05a0d55bd26b38e72e7704f6348b5e5836644d" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.309518 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"787f90f11ef99c4afefb84727a05a0d55bd26b38e72e7704f6348b5e5836644d"} err="failed to get container status \"787f90f11ef99c4afefb84727a05a0d55bd26b38e72e7704f6348b5e5836644d\": rpc error: code = NotFound desc = could not find container \"787f90f11ef99c4afefb84727a05a0d55bd26b38e72e7704f6348b5e5836644d\": container with ID starting with 787f90f11ef99c4afefb84727a05a0d55bd26b38e72e7704f6348b5e5836644d not found: ID does not exist" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.417294 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57b9b66f-d8be-40f7-9bf2-70b7748241ac-serving-cert\") pod \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\" (UID: \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\") " Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.417349 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/57b9b66f-d8be-40f7-9bf2-70b7748241ac-proxy-ca-bundles\") pod \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\" (UID: \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\") " Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.417374 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bdfacc10-a90c-4d3d-a817-2881ec892360-client-ca\") pod \"bdfacc10-a90c-4d3d-a817-2881ec892360\" (UID: \"bdfacc10-a90c-4d3d-a817-2881ec892360\") " Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.417408 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b9b66f-d8be-40f7-9bf2-70b7748241ac-config\") pod \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\" (UID: \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\") " Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.417443 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdfacc10-a90c-4d3d-a817-2881ec892360-config\") pod \"bdfacc10-a90c-4d3d-a817-2881ec892360\" (UID: \"bdfacc10-a90c-4d3d-a817-2881ec892360\") " Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.417469 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdspd\" (UniqueName: \"kubernetes.io/projected/57b9b66f-d8be-40f7-9bf2-70b7748241ac-kube-api-access-tdspd\") pod \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\" (UID: \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\") " Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.417495 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnbh7\" (UniqueName: \"kubernetes.io/projected/bdfacc10-a90c-4d3d-a817-2881ec892360-kube-api-access-qnbh7\") pod \"bdfacc10-a90c-4d3d-a817-2881ec892360\" (UID: \"bdfacc10-a90c-4d3d-a817-2881ec892360\") " Mar 13 
09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.417529 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57b9b66f-d8be-40f7-9bf2-70b7748241ac-client-ca\") pod \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\" (UID: \"57b9b66f-d8be-40f7-9bf2-70b7748241ac\") " Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.417578 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdfacc10-a90c-4d3d-a817-2881ec892360-serving-cert\") pod \"bdfacc10-a90c-4d3d-a817-2881ec892360\" (UID: \"bdfacc10-a90c-4d3d-a817-2881ec892360\") " Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.418210 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdfacc10-a90c-4d3d-a817-2881ec892360-client-ca" (OuterVolumeSpecName: "client-ca") pod "bdfacc10-a90c-4d3d-a817-2881ec892360" (UID: "bdfacc10-a90c-4d3d-a817-2881ec892360"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.418873 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdfacc10-a90c-4d3d-a817-2881ec892360-config" (OuterVolumeSpecName: "config") pod "bdfacc10-a90c-4d3d-a817-2881ec892360" (UID: "bdfacc10-a90c-4d3d-a817-2881ec892360"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.418907 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57b9b66f-d8be-40f7-9bf2-70b7748241ac-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "57b9b66f-d8be-40f7-9bf2-70b7748241ac" (UID: "57b9b66f-d8be-40f7-9bf2-70b7748241ac"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.418944 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57b9b66f-d8be-40f7-9bf2-70b7748241ac-config" (OuterVolumeSpecName: "config") pod "57b9b66f-d8be-40f7-9bf2-70b7748241ac" (UID: "57b9b66f-d8be-40f7-9bf2-70b7748241ac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.419051 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57b9b66f-d8be-40f7-9bf2-70b7748241ac-client-ca" (OuterVolumeSpecName: "client-ca") pod "57b9b66f-d8be-40f7-9bf2-70b7748241ac" (UID: "57b9b66f-d8be-40f7-9bf2-70b7748241ac"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.422852 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57b9b66f-d8be-40f7-9bf2-70b7748241ac-kube-api-access-tdspd" (OuterVolumeSpecName: "kube-api-access-tdspd") pod "57b9b66f-d8be-40f7-9bf2-70b7748241ac" (UID: "57b9b66f-d8be-40f7-9bf2-70b7748241ac"). InnerVolumeSpecName "kube-api-access-tdspd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.423189 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdfacc10-a90c-4d3d-a817-2881ec892360-kube-api-access-qnbh7" (OuterVolumeSpecName: "kube-api-access-qnbh7") pod "bdfacc10-a90c-4d3d-a817-2881ec892360" (UID: "bdfacc10-a90c-4d3d-a817-2881ec892360"). InnerVolumeSpecName "kube-api-access-qnbh7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.423353 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdfacc10-a90c-4d3d-a817-2881ec892360-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bdfacc10-a90c-4d3d-a817-2881ec892360" (UID: "bdfacc10-a90c-4d3d-a817-2881ec892360"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.426439 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57b9b66f-d8be-40f7-9bf2-70b7748241ac-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "57b9b66f-d8be-40f7-9bf2-70b7748241ac" (UID: "57b9b66f-d8be-40f7-9bf2-70b7748241ac"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.518571 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdfacc10-a90c-4d3d-a817-2881ec892360-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.518609 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdspd\" (UniqueName: \"kubernetes.io/projected/57b9b66f-d8be-40f7-9bf2-70b7748241ac-kube-api-access-tdspd\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.518620 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnbh7\" (UniqueName: \"kubernetes.io/projected/bdfacc10-a90c-4d3d-a817-2881ec892360-kube-api-access-qnbh7\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.518629 4841 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57b9b66f-d8be-40f7-9bf2-70b7748241ac-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 
09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.518638 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdfacc10-a90c-4d3d-a817-2881ec892360-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.518645 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57b9b66f-d8be-40f7-9bf2-70b7748241ac-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.518653 4841 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/57b9b66f-d8be-40f7-9bf2-70b7748241ac-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.518662 4841 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bdfacc10-a90c-4d3d-a817-2881ec892360-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.518670 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b9b66f-d8be-40f7-9bf2-70b7748241ac-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.625775 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8"] Mar 13 09:17:25 crc kubenswrapper[4841]: I0313 09:17:25.628718 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-599c5895cb-6rtz8"] Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.005893 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdfacc10-a90c-4d3d-a817-2881ec892360" path="/var/lib/kubelet/pods/bdfacc10-a90c-4d3d-a817-2881ec892360/volumes" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 
09:17:26.290054 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9997d5676-dlc4b" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.308090 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9997d5676-dlc4b"] Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.311420 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-9997d5676-dlc4b"] Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.446243 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77f57f8cf9-mx67h"] Mar 13 09:17:26 crc kubenswrapper[4841]: E0313 09:17:26.446985 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b9b66f-d8be-40f7-9bf2-70b7748241ac" containerName="controller-manager" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.447015 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b9b66f-d8be-40f7-9bf2-70b7748241ac" containerName="controller-manager" Mar 13 09:17:26 crc kubenswrapper[4841]: E0313 09:17:26.447044 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdfacc10-a90c-4d3d-a817-2881ec892360" containerName="route-controller-manager" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.447058 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdfacc10-a90c-4d3d-a817-2881ec892360" containerName="route-controller-manager" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.447237 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="57b9b66f-d8be-40f7-9bf2-70b7748241ac" containerName="controller-manager" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.447294 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdfacc10-a90c-4d3d-a817-2881ec892360" containerName="route-controller-manager" Mar 13 09:17:26 
crc kubenswrapper[4841]: I0313 09:17:26.447934 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77f57f8cf9-mx67h" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.448848 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6db76d9f69-c8gf2"] Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.449622 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.458825 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.462014 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.462501 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.462808 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.463092 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.464778 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.465097 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.466480 4841 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.467353 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.468249 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.469215 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.471245 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.472045 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77f57f8cf9-mx67h"] Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.472833 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.478729 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6db76d9f69-c8gf2"] Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.537152 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce87b469-e7d3-497c-890f-9824b49efaca-client-ca\") pod \"controller-manager-6db76d9f69-c8gf2\" (UID: \"ce87b469-e7d3-497c-890f-9824b49efaca\") " pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.537232 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6cb9\" (UniqueName: \"kubernetes.io/projected/ce87b469-e7d3-497c-890f-9824b49efaca-kube-api-access-l6cb9\") pod \"controller-manager-6db76d9f69-c8gf2\" (UID: \"ce87b469-e7d3-497c-890f-9824b49efaca\") " pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.537552 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec644631-a55d-424c-b6bc-bf320b885d5d-config\") pod \"route-controller-manager-77f57f8cf9-mx67h\" (UID: \"ec644631-a55d-424c-b6bc-bf320b885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-77f57f8cf9-mx67h" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.537729 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce87b469-e7d3-497c-890f-9824b49efaca-config\") pod \"controller-manager-6db76d9f69-c8gf2\" (UID: \"ce87b469-e7d3-497c-890f-9824b49efaca\") " pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.537899 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce87b469-e7d3-497c-890f-9824b49efaca-serving-cert\") pod \"controller-manager-6db76d9f69-c8gf2\" (UID: \"ce87b469-e7d3-497c-890f-9824b49efaca\") " pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.537991 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec644631-a55d-424c-b6bc-bf320b885d5d-client-ca\") pod \"route-controller-manager-77f57f8cf9-mx67h\" (UID: 
\"ec644631-a55d-424c-b6bc-bf320b885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-77f57f8cf9-mx67h" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.538351 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ce87b469-e7d3-497c-890f-9824b49efaca-proxy-ca-bundles\") pod \"controller-manager-6db76d9f69-c8gf2\" (UID: \"ce87b469-e7d3-497c-890f-9824b49efaca\") " pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.538513 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh27m\" (UniqueName: \"kubernetes.io/projected/ec644631-a55d-424c-b6bc-bf320b885d5d-kube-api-access-sh27m\") pod \"route-controller-manager-77f57f8cf9-mx67h\" (UID: \"ec644631-a55d-424c-b6bc-bf320b885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-77f57f8cf9-mx67h" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.538782 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec644631-a55d-424c-b6bc-bf320b885d5d-serving-cert\") pod \"route-controller-manager-77f57f8cf9-mx67h\" (UID: \"ec644631-a55d-424c-b6bc-bf320b885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-77f57f8cf9-mx67h" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.640556 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce87b469-e7d3-497c-890f-9824b49efaca-config\") pod \"controller-manager-6db76d9f69-c8gf2\" (UID: \"ce87b469-e7d3-497c-890f-9824b49efaca\") " pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.640691 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce87b469-e7d3-497c-890f-9824b49efaca-serving-cert\") pod \"controller-manager-6db76d9f69-c8gf2\" (UID: \"ce87b469-e7d3-497c-890f-9824b49efaca\") " pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.640967 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec644631-a55d-424c-b6bc-bf320b885d5d-client-ca\") pod \"route-controller-manager-77f57f8cf9-mx67h\" (UID: \"ec644631-a55d-424c-b6bc-bf320b885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-77f57f8cf9-mx67h" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.641929 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ce87b469-e7d3-497c-890f-9824b49efaca-proxy-ca-bundles\") pod \"controller-manager-6db76d9f69-c8gf2\" (UID: \"ce87b469-e7d3-497c-890f-9824b49efaca\") " pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.642008 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh27m\" (UniqueName: \"kubernetes.io/projected/ec644631-a55d-424c-b6bc-bf320b885d5d-kube-api-access-sh27m\") pod \"route-controller-manager-77f57f8cf9-mx67h\" (UID: \"ec644631-a55d-424c-b6bc-bf320b885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-77f57f8cf9-mx67h" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.642092 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec644631-a55d-424c-b6bc-bf320b885d5d-serving-cert\") pod \"route-controller-manager-77f57f8cf9-mx67h\" (UID: \"ec644631-a55d-424c-b6bc-bf320b885d5d\") " 
pod="openshift-route-controller-manager/route-controller-manager-77f57f8cf9-mx67h" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.642144 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce87b469-e7d3-497c-890f-9824b49efaca-client-ca\") pod \"controller-manager-6db76d9f69-c8gf2\" (UID: \"ce87b469-e7d3-497c-890f-9824b49efaca\") " pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.642183 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6cb9\" (UniqueName: \"kubernetes.io/projected/ce87b469-e7d3-497c-890f-9824b49efaca-kube-api-access-l6cb9\") pod \"controller-manager-6db76d9f69-c8gf2\" (UID: \"ce87b469-e7d3-497c-890f-9824b49efaca\") " pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.642240 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec644631-a55d-424c-b6bc-bf320b885d5d-config\") pod \"route-controller-manager-77f57f8cf9-mx67h\" (UID: \"ec644631-a55d-424c-b6bc-bf320b885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-77f57f8cf9-mx67h" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.642597 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec644631-a55d-424c-b6bc-bf320b885d5d-client-ca\") pod \"route-controller-manager-77f57f8cf9-mx67h\" (UID: \"ec644631-a55d-424c-b6bc-bf320b885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-77f57f8cf9-mx67h" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.643059 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/ce87b469-e7d3-497c-890f-9824b49efaca-proxy-ca-bundles\") pod \"controller-manager-6db76d9f69-c8gf2\" (UID: \"ce87b469-e7d3-497c-890f-9824b49efaca\") " pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.643426 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce87b469-e7d3-497c-890f-9824b49efaca-config\") pod \"controller-manager-6db76d9f69-c8gf2\" (UID: \"ce87b469-e7d3-497c-890f-9824b49efaca\") " pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.644215 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec644631-a55d-424c-b6bc-bf320b885d5d-config\") pod \"route-controller-manager-77f57f8cf9-mx67h\" (UID: \"ec644631-a55d-424c-b6bc-bf320b885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-77f57f8cf9-mx67h" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.645463 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce87b469-e7d3-497c-890f-9824b49efaca-client-ca\") pod \"controller-manager-6db76d9f69-c8gf2\" (UID: \"ce87b469-e7d3-497c-890f-9824b49efaca\") " pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.654357 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce87b469-e7d3-497c-890f-9824b49efaca-serving-cert\") pod \"controller-manager-6db76d9f69-c8gf2\" (UID: \"ce87b469-e7d3-497c-890f-9824b49efaca\") " pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.658586 4841 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec644631-a55d-424c-b6bc-bf320b885d5d-serving-cert\") pod \"route-controller-manager-77f57f8cf9-mx67h\" (UID: \"ec644631-a55d-424c-b6bc-bf320b885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-77f57f8cf9-mx67h" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.674781 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh27m\" (UniqueName: \"kubernetes.io/projected/ec644631-a55d-424c-b6bc-bf320b885d5d-kube-api-access-sh27m\") pod \"route-controller-manager-77f57f8cf9-mx67h\" (UID: \"ec644631-a55d-424c-b6bc-bf320b885d5d\") " pod="openshift-route-controller-manager/route-controller-manager-77f57f8cf9-mx67h" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.679419 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6cb9\" (UniqueName: \"kubernetes.io/projected/ce87b469-e7d3-497c-890f-9824b49efaca-kube-api-access-l6cb9\") pod \"controller-manager-6db76d9f69-c8gf2\" (UID: \"ce87b469-e7d3-497c-890f-9824b49efaca\") " pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.787887 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77f57f8cf9-mx67h" Mar 13 09:17:26 crc kubenswrapper[4841]: I0313 09:17:26.803055 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" Mar 13 09:17:27 crc kubenswrapper[4841]: I0313 09:17:27.039748 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77f57f8cf9-mx67h"] Mar 13 09:17:27 crc kubenswrapper[4841]: I0313 09:17:27.300789 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77f57f8cf9-mx67h" event={"ID":"ec644631-a55d-424c-b6bc-bf320b885d5d","Type":"ContainerStarted","Data":"d3778e74995464a70f0a447b187db2c909cb1ce76fda94d242809e43c84da093"} Mar 13 09:17:27 crc kubenswrapper[4841]: I0313 09:17:27.300833 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77f57f8cf9-mx67h" event={"ID":"ec644631-a55d-424c-b6bc-bf320b885d5d","Type":"ContainerStarted","Data":"cb17685ad2e18b997ba24116a44443e2cd7de887b7e25efd162b7ddf258a8c97"} Mar 13 09:17:27 crc kubenswrapper[4841]: I0313 09:17:27.301060 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-77f57f8cf9-mx67h" Mar 13 09:17:27 crc kubenswrapper[4841]: I0313 09:17:27.319449 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6db76d9f69-c8gf2"] Mar 13 09:17:27 crc kubenswrapper[4841]: I0313 09:17:27.772953 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-77f57f8cf9-mx67h" Mar 13 09:17:27 crc kubenswrapper[4841]: I0313 09:17:27.786676 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-77f57f8cf9-mx67h" podStartSLOduration=3.786662324 podStartE2EDuration="3.786662324s" podCreationTimestamp="2026-03-13 09:17:24 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:17:27.331996798 +0000 UTC m=+330.061896989" watchObservedRunningTime="2026-03-13 09:17:27.786662324 +0000 UTC m=+330.516562505" Mar 13 09:17:28 crc kubenswrapper[4841]: I0313 09:17:28.005186 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57b9b66f-d8be-40f7-9bf2-70b7748241ac" path="/var/lib/kubelet/pods/57b9b66f-d8be-40f7-9bf2-70b7748241ac/volumes" Mar 13 09:17:28 crc kubenswrapper[4841]: I0313 09:17:28.309549 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" event={"ID":"ce87b469-e7d3-497c-890f-9824b49efaca","Type":"ContainerStarted","Data":"f3e33796721aa38dc57f7e254363ff3de6e69fc8b6e3071b0fc00b6dbf1f4fb5"} Mar 13 09:17:28 crc kubenswrapper[4841]: I0313 09:17:28.309637 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" event={"ID":"ce87b469-e7d3-497c-890f-9824b49efaca","Type":"ContainerStarted","Data":"086de2ee35df072671ea219993d8ae6e34321a2d06d265e1f312b9abdf2474a2"} Mar 13 09:17:28 crc kubenswrapper[4841]: I0313 09:17:28.336935 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" podStartSLOduration=4.336907245 podStartE2EDuration="4.336907245s" podCreationTimestamp="2026-03-13 09:17:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:17:28.326025375 +0000 UTC m=+331.055925586" watchObservedRunningTime="2026-03-13 09:17:28.336907245 +0000 UTC m=+331.066807466" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.296902 4841 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 
09:17:29.298402 4841 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.298627 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.298917 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f" gracePeriod=15 Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.299022 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc" gracePeriod=15 Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.299122 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be" gracePeriod=15 Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.299190 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://258c37d4d3a4e8eb1c14a049165aca26c6a6883ad7a7b3c83f01439c9035baf4" gracePeriod=15 Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.299233 4841 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c" gracePeriod=15 Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.300961 4841 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 09:17:29 crc kubenswrapper[4841]: E0313 09:17:29.301208 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.301227 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 09:17:29 crc kubenswrapper[4841]: E0313 09:17:29.301243 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.301257 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 09:17:29 crc kubenswrapper[4841]: E0313 09:17:29.301308 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.301325 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 13 09:17:29 crc kubenswrapper[4841]: E0313 09:17:29.301349 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.301360 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 09:17:29 crc kubenswrapper[4841]: E0313 09:17:29.301381 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.301396 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 13 09:17:29 crc kubenswrapper[4841]: E0313 09:17:29.301415 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.301427 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 13 09:17:29 crc kubenswrapper[4841]: E0313 09:17:29.302671 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.302698 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 13 09:17:29 crc kubenswrapper[4841]: E0313 09:17:29.302713 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.302726 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 09:17:29 crc kubenswrapper[4841]: E0313 09:17:29.302743 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.302756 4841 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 09:17:29 crc kubenswrapper[4841]: E0313 09:17:29.302773 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.302785 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.302967 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.302983 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.302996 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.303011 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.303028 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.303045 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.303065 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Mar 13 09:17:29 crc kubenswrapper[4841]: E0313 09:17:29.303362 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.303379 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.303625 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.303649 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.303672 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.341126 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.351283 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" Mar 13 09:17:29 crc kubenswrapper[4841]: E0313 09:17:29.376719 4841 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.106:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.476686 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.476768 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.476797 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.476826 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.476852 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 09:17:29 crc 
kubenswrapper[4841]: I0313 09:17:29.476900 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.476921 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.476980 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.578259 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.578366 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.578400 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.578425 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.578448 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.578519 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.578658 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.578519 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.578513 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.578730 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.578766 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.578893 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.579149 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.579222 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.579309 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.579372 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: I0313 09:17:29.678332 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 09:17:29 crc kubenswrapper[4841]: W0313 09:17:29.710506 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-806af6d4884271468a318ba8477f15075e5b942493ed646b4dd4cc537a175553 WatchSource:0}: Error finding container 806af6d4884271468a318ba8477f15075e5b942493ed646b4dd4cc537a175553: Status 404 returned error can't find the container with id 806af6d4884271468a318ba8477f15075e5b942493ed646b4dd4cc537a175553 Mar 13 09:17:29 crc kubenswrapper[4841]: E0313 09:17:29.714422 4841 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.106:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189c5bf25931b3fd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:17:29.713775613 +0000 UTC m=+332.443675804,LastTimestamp:2026-03-13 09:17:29.713775613 +0000 UTC m=+332.443675804,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:17:30 crc kubenswrapper[4841]: I0313 09:17:30.350849 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="ab5fd5c2-7ff5-4184-983f-6a47828ccf1a" containerID="a78937990f3981e3356f5709b3cffc5699041514bc6190b5d9ba94069bdd75d7" exitCode=0 Mar 13 09:17:30 crc kubenswrapper[4841]: I0313 09:17:30.351066 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ab5fd5c2-7ff5-4184-983f-6a47828ccf1a","Type":"ContainerDied","Data":"a78937990f3981e3356f5709b3cffc5699041514bc6190b5d9ba94069bdd75d7"} Mar 13 09:17:30 crc kubenswrapper[4841]: I0313 09:17:30.353683 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"79b8a5c311cd6792650b6ceffeb98e86d08bb151dd93d911aa1621f4eedc6b5b"} Mar 13 09:17:30 crc kubenswrapper[4841]: I0313 09:17:30.353746 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"806af6d4884271468a318ba8477f15075e5b942493ed646b4dd4cc537a175553"} Mar 13 09:17:30 crc kubenswrapper[4841]: E0313 09:17:30.354634 4841 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.106:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 09:17:30 crc kubenswrapper[4841]: I0313 09:17:30.355563 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 13 09:17:30 crc kubenswrapper[4841]: I0313 09:17:30.356715 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 09:17:30 crc kubenswrapper[4841]: I0313 09:17:30.357528 
4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="258c37d4d3a4e8eb1c14a049165aca26c6a6883ad7a7b3c83f01439c9035baf4" exitCode=0 Mar 13 09:17:30 crc kubenswrapper[4841]: I0313 09:17:30.357562 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc" exitCode=0 Mar 13 09:17:30 crc kubenswrapper[4841]: I0313 09:17:30.357597 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be" exitCode=0 Mar 13 09:17:30 crc kubenswrapper[4841]: I0313 09:17:30.357606 4841 scope.go:117] "RemoveContainer" containerID="b3f16bb916c74fd59d4cf4d54bf3fec08c7e808af1a73d7593b53eed22b4d9ec" Mar 13 09:17:30 crc kubenswrapper[4841]: I0313 09:17:30.357619 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c" exitCode=2 Mar 13 09:17:30 crc kubenswrapper[4841]: E0313 09:17:30.496800 4841 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:30 crc kubenswrapper[4841]: E0313 09:17:30.497467 4841 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:30 crc kubenswrapper[4841]: E0313 09:17:30.497817 4841 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection 
refused" Mar 13 09:17:30 crc kubenswrapper[4841]: E0313 09:17:30.498377 4841 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:30 crc kubenswrapper[4841]: E0313 09:17:30.498924 4841 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:30 crc kubenswrapper[4841]: I0313 09:17:30.499017 4841 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 13 09:17:30 crc kubenswrapper[4841]: E0313 09:17:30.499688 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="200ms" Mar 13 09:17:30 crc kubenswrapper[4841]: E0313 09:17:30.701601 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="400ms" Mar 13 09:17:31 crc kubenswrapper[4841]: E0313 09:17:31.107373 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="800ms" Mar 13 09:17:31 crc kubenswrapper[4841]: I0313 09:17:31.373146 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 09:17:31 crc kubenswrapper[4841]: I0313 09:17:31.766654 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 09:17:31 crc kubenswrapper[4841]: I0313 09:17:31.769745 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 09:17:31 crc kubenswrapper[4841]: I0313 09:17:31.770657 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:17:31 crc kubenswrapper[4841]: E0313 09:17:31.907988 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="1.6s" Mar 13 09:17:31 crc kubenswrapper[4841]: I0313 09:17:31.929583 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 09:17:31 crc kubenswrapper[4841]: I0313 09:17:31.929660 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 09:17:31 crc kubenswrapper[4841]: I0313 09:17:31.929685 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") 
pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 09:17:31 crc kubenswrapper[4841]: I0313 09:17:31.929704 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab5fd5c2-7ff5-4184-983f-6a47828ccf1a-var-lock\") pod \"ab5fd5c2-7ff5-4184-983f-6a47828ccf1a\" (UID: \"ab5fd5c2-7ff5-4184-983f-6a47828ccf1a\") " Mar 13 09:17:31 crc kubenswrapper[4841]: I0313 09:17:31.929729 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab5fd5c2-7ff5-4184-983f-6a47828ccf1a-kube-api-access\") pod \"ab5fd5c2-7ff5-4184-983f-6a47828ccf1a\" (UID: \"ab5fd5c2-7ff5-4184-983f-6a47828ccf1a\") " Mar 13 09:17:31 crc kubenswrapper[4841]: I0313 09:17:31.929699 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:17:31 crc kubenswrapper[4841]: I0313 09:17:31.929758 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab5fd5c2-7ff5-4184-983f-6a47828ccf1a-kubelet-dir\") pod \"ab5fd5c2-7ff5-4184-983f-6a47828ccf1a\" (UID: \"ab5fd5c2-7ff5-4184-983f-6a47828ccf1a\") " Mar 13 09:17:31 crc kubenswrapper[4841]: I0313 09:17:31.929785 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:17:31 crc kubenswrapper[4841]: I0313 09:17:31.929889 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab5fd5c2-7ff5-4184-983f-6a47828ccf1a-var-lock" (OuterVolumeSpecName: "var-lock") pod "ab5fd5c2-7ff5-4184-983f-6a47828ccf1a" (UID: "ab5fd5c2-7ff5-4184-983f-6a47828ccf1a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:17:31 crc kubenswrapper[4841]: I0313 09:17:31.929794 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab5fd5c2-7ff5-4184-983f-6a47828ccf1a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ab5fd5c2-7ff5-4184-983f-6a47828ccf1a" (UID: "ab5fd5c2-7ff5-4184-983f-6a47828ccf1a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:17:31 crc kubenswrapper[4841]: I0313 09:17:31.929809 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:17:31 crc kubenswrapper[4841]: I0313 09:17:31.930828 4841 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab5fd5c2-7ff5-4184-983f-6a47828ccf1a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:31 crc kubenswrapper[4841]: I0313 09:17:31.930889 4841 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:31 crc kubenswrapper[4841]: I0313 09:17:31.930905 4841 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:31 crc kubenswrapper[4841]: I0313 09:17:31.930920 4841 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:31 crc kubenswrapper[4841]: I0313 09:17:31.930935 4841 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab5fd5c2-7ff5-4184-983f-6a47828ccf1a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:31 crc kubenswrapper[4841]: I0313 09:17:31.937533 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab5fd5c2-7ff5-4184-983f-6a47828ccf1a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ab5fd5c2-7ff5-4184-983f-6a47828ccf1a" (UID: "ab5fd5c2-7ff5-4184-983f-6a47828ccf1a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.004803 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.032093 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab5fd5c2-7ff5-4184-983f-6a47828ccf1a-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:32 crc kubenswrapper[4841]: E0313 09:17:32.083808 4841 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.106:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189c5bf25931b3fd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 09:17:29.713775613 +0000 UTC m=+332.443675804,LastTimestamp:2026-03-13 09:17:29.713775613 +0000 UTC m=+332.443675804,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.383807 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.385208 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f" exitCode=0 Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.385313 4841 scope.go:117] "RemoveContainer" containerID="258c37d4d3a4e8eb1c14a049165aca26c6a6883ad7a7b3c83f01439c9035baf4" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.385365 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.387920 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.387895 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ab5fd5c2-7ff5-4184-983f-6a47828ccf1a","Type":"ContainerDied","Data":"c29d66fd429029d1665b5d4e9642b97f80e9f6d1117dff866d55c1c7c028690a"} Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.388368 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c29d66fd429029d1665b5d4e9642b97f80e9f6d1117dff866d55c1c7c028690a" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.405164 4841 scope.go:117] "RemoveContainer" containerID="ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.417014 4841 scope.go:117] "RemoveContainer" containerID="6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.429421 4841 scope.go:117] "RemoveContainer" 
containerID="ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.443843 4841 scope.go:117] "RemoveContainer" containerID="5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.467422 4841 scope.go:117] "RemoveContainer" containerID="48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.490432 4841 scope.go:117] "RemoveContainer" containerID="258c37d4d3a4e8eb1c14a049165aca26c6a6883ad7a7b3c83f01439c9035baf4" Mar 13 09:17:32 crc kubenswrapper[4841]: E0313 09:17:32.490813 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"258c37d4d3a4e8eb1c14a049165aca26c6a6883ad7a7b3c83f01439c9035baf4\": container with ID starting with 258c37d4d3a4e8eb1c14a049165aca26c6a6883ad7a7b3c83f01439c9035baf4 not found: ID does not exist" containerID="258c37d4d3a4e8eb1c14a049165aca26c6a6883ad7a7b3c83f01439c9035baf4" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.490895 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"258c37d4d3a4e8eb1c14a049165aca26c6a6883ad7a7b3c83f01439c9035baf4"} err="failed to get container status \"258c37d4d3a4e8eb1c14a049165aca26c6a6883ad7a7b3c83f01439c9035baf4\": rpc error: code = NotFound desc = could not find container \"258c37d4d3a4e8eb1c14a049165aca26c6a6883ad7a7b3c83f01439c9035baf4\": container with ID starting with 258c37d4d3a4e8eb1c14a049165aca26c6a6883ad7a7b3c83f01439c9035baf4 not found: ID does not exist" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.490931 4841 scope.go:117] "RemoveContainer" containerID="ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc" Mar 13 09:17:32 crc kubenswrapper[4841]: E0313 09:17:32.491299 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc\": container with ID starting with ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc not found: ID does not exist" containerID="ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.491331 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc"} err="failed to get container status \"ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc\": rpc error: code = NotFound desc = could not find container \"ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc\": container with ID starting with ab0354c193ddb7aba7fcab0ffdd83a63f71bed06bfa493d3db9ac7e83f24cebc not found: ID does not exist" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.491347 4841 scope.go:117] "RemoveContainer" containerID="6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be" Mar 13 09:17:32 crc kubenswrapper[4841]: E0313 09:17:32.491573 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be\": container with ID starting with 6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be not found: ID does not exist" containerID="6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.491620 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be"} err="failed to get container status \"6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be\": rpc error: code = NotFound desc = could not find container 
\"6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be\": container with ID starting with 6f8c8787eb09e2b019695df9f70b6d6f05dc5ce6968bae165196fa7c3df6f6be not found: ID does not exist" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.491637 4841 scope.go:117] "RemoveContainer" containerID="ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c" Mar 13 09:17:32 crc kubenswrapper[4841]: E0313 09:17:32.491886 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c\": container with ID starting with ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c not found: ID does not exist" containerID="ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.491981 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c"} err="failed to get container status \"ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c\": rpc error: code = NotFound desc = could not find container \"ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c\": container with ID starting with ff3715287c706ffe80d91c32d5d6d9d566d36e31486b71f24f98e59a5d07845c not found: ID does not exist" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.492002 4841 scope.go:117] "RemoveContainer" containerID="5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f" Mar 13 09:17:32 crc kubenswrapper[4841]: E0313 09:17:32.492595 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f\": container with ID starting with 5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f not found: ID does not exist" 
containerID="5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.492624 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f"} err="failed to get container status \"5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f\": rpc error: code = NotFound desc = could not find container \"5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f\": container with ID starting with 5ebddff1cbb659283d6e0f7417ad3dce2f68746975b670db84424eb69cc3221f not found: ID does not exist" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.492643 4841 scope.go:117] "RemoveContainer" containerID="48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c" Mar 13 09:17:32 crc kubenswrapper[4841]: E0313 09:17:32.492822 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\": container with ID starting with 48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c not found: ID does not exist" containerID="48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c" Mar 13 09:17:32 crc kubenswrapper[4841]: I0313 09:17:32.492843 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c"} err="failed to get container status \"48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\": rpc error: code = NotFound desc = could not find container \"48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c\": container with ID starting with 48399d6d62e008992fe4f376ec3d5e0b6e8694aafe8cb79453fe821bc8bc590c not found: ID does not exist" Mar 13 09:17:33 crc kubenswrapper[4841]: E0313 09:17:33.508713 4841 controller.go:145] 
"Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="3.2s" Mar 13 09:17:34 crc kubenswrapper[4841]: I0313 09:17:34.357032 4841 status_manager.go:851] "Failed to get status for pod" podUID="ce87b469-e7d3-497c-890f-9824b49efaca" pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6db76d9f69-c8gf2\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:34 crc kubenswrapper[4841]: I0313 09:17:34.357663 4841 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:34 crc kubenswrapper[4841]: I0313 09:17:34.358670 4841 status_manager.go:851] "Failed to get status for pod" podUID="ab5fd5c2-7ff5-4184-983f-6a47828ccf1a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:34 crc kubenswrapper[4841]: I0313 09:17:34.359437 4841 status_manager.go:851] "Failed to get status for pod" podUID="ce87b469-e7d3-497c-890f-9824b49efaca" pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6db76d9f69-c8gf2\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:35 crc kubenswrapper[4841]: I0313 09:17:35.604113 4841 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" podUID="293f3a0e-401d-440f-8321-0aac18b90219" containerName="oauth-openshift" containerID="cri-o://fdb2201f290d04899f9d0a955db359bc157600c136bb62af54c243cc1c5c322a" gracePeriod=15 Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.156136 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.157414 4841 status_manager.go:851] "Failed to get status for pod" podUID="293f3a0e-401d-440f-8321-0aac18b90219" pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-zsbmr\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.157975 4841 status_manager.go:851] "Failed to get status for pod" podUID="ab5fd5c2-7ff5-4184-983f-6a47828ccf1a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.158857 4841 status_manager.go:851] "Failed to get status for pod" podUID="ce87b469-e7d3-497c-890f-9824b49efaca" pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6db76d9f69-c8gf2\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.287376 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/293f3a0e-401d-440f-8321-0aac18b90219-audit-dir\") pod \"293f3a0e-401d-440f-8321-0aac18b90219\" (UID: 
\"293f3a0e-401d-440f-8321-0aac18b90219\") " Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.287455 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-template-provider-selection\") pod \"293f3a0e-401d-440f-8321-0aac18b90219\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.287526 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-idp-0-file-data\") pod \"293f3a0e-401d-440f-8321-0aac18b90219\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.287564 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/293f3a0e-401d-440f-8321-0aac18b90219-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "293f3a0e-401d-440f-8321-0aac18b90219" (UID: "293f3a0e-401d-440f-8321-0aac18b90219"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.287587 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-cliconfig\") pod \"293f3a0e-401d-440f-8321-0aac18b90219\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.287670 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-serving-cert\") pod \"293f3a0e-401d-440f-8321-0aac18b90219\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.287712 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-router-certs\") pod \"293f3a0e-401d-440f-8321-0aac18b90219\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.287742 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkgdc\" (UniqueName: \"kubernetes.io/projected/293f3a0e-401d-440f-8321-0aac18b90219-kube-api-access-fkgdc\") pod \"293f3a0e-401d-440f-8321-0aac18b90219\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.287777 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-template-login\") pod \"293f3a0e-401d-440f-8321-0aac18b90219\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " Mar 13 09:17:36 crc 
kubenswrapper[4841]: I0313 09:17:36.287810 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-session\") pod \"293f3a0e-401d-440f-8321-0aac18b90219\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.287835 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-service-ca\") pod \"293f3a0e-401d-440f-8321-0aac18b90219\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.287870 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-template-error\") pod \"293f3a0e-401d-440f-8321-0aac18b90219\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.287900 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-audit-policies\") pod \"293f3a0e-401d-440f-8321-0aac18b90219\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.287922 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-ocp-branding-template\") pod \"293f3a0e-401d-440f-8321-0aac18b90219\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.287954 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-trusted-ca-bundle\") pod \"293f3a0e-401d-440f-8321-0aac18b90219\" (UID: \"293f3a0e-401d-440f-8321-0aac18b90219\") " Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.288141 4841 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/293f3a0e-401d-440f-8321-0aac18b90219-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.288725 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "293f3a0e-401d-440f-8321-0aac18b90219" (UID: "293f3a0e-401d-440f-8321-0aac18b90219"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.288772 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "293f3a0e-401d-440f-8321-0aac18b90219" (UID: "293f3a0e-401d-440f-8321-0aac18b90219"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.290259 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "293f3a0e-401d-440f-8321-0aac18b90219" (UID: "293f3a0e-401d-440f-8321-0aac18b90219"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.290777 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "293f3a0e-401d-440f-8321-0aac18b90219" (UID: "293f3a0e-401d-440f-8321-0aac18b90219"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.294979 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "293f3a0e-401d-440f-8321-0aac18b90219" (UID: "293f3a0e-401d-440f-8321-0aac18b90219"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.295489 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "293f3a0e-401d-440f-8321-0aac18b90219" (UID: "293f3a0e-401d-440f-8321-0aac18b90219"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.296153 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/293f3a0e-401d-440f-8321-0aac18b90219-kube-api-access-fkgdc" (OuterVolumeSpecName: "kube-api-access-fkgdc") pod "293f3a0e-401d-440f-8321-0aac18b90219" (UID: "293f3a0e-401d-440f-8321-0aac18b90219"). InnerVolumeSpecName "kube-api-access-fkgdc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.296182 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "293f3a0e-401d-440f-8321-0aac18b90219" (UID: "293f3a0e-401d-440f-8321-0aac18b90219"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.296579 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "293f3a0e-401d-440f-8321-0aac18b90219" (UID: "293f3a0e-401d-440f-8321-0aac18b90219"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.296960 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "293f3a0e-401d-440f-8321-0aac18b90219" (UID: "293f3a0e-401d-440f-8321-0aac18b90219"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.300680 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "293f3a0e-401d-440f-8321-0aac18b90219" (UID: "293f3a0e-401d-440f-8321-0aac18b90219"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.300917 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "293f3a0e-401d-440f-8321-0aac18b90219" (UID: "293f3a0e-401d-440f-8321-0aac18b90219"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.301660 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "293f3a0e-401d-440f-8321-0aac18b90219" (UID: "293f3a0e-401d-440f-8321-0aac18b90219"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.388808 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.388846 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.388860 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:36 crc 
kubenswrapper[4841]: I0313 09:17:36.388874 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.388886 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.388898 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.388909 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkgdc\" (UniqueName: \"kubernetes.io/projected/293f3a0e-401d-440f-8321-0aac18b90219-kube-api-access-fkgdc\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.388922 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.388934 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.388946 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.388957 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.388969 4841 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/293f3a0e-401d-440f-8321-0aac18b90219-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.388979 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/293f3a0e-401d-440f-8321-0aac18b90219-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.423061 4841 generic.go:334] "Generic (PLEG): container finished" podID="293f3a0e-401d-440f-8321-0aac18b90219" containerID="fdb2201f290d04899f9d0a955db359bc157600c136bb62af54c243cc1c5c322a" exitCode=0 Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.423148 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" event={"ID":"293f3a0e-401d-440f-8321-0aac18b90219","Type":"ContainerDied","Data":"fdb2201f290d04899f9d0a955db359bc157600c136bb62af54c243cc1c5c322a"} Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.423201 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" event={"ID":"293f3a0e-401d-440f-8321-0aac18b90219","Type":"ContainerDied","Data":"acecc330912e15e4e243341fce1c2d6655c36e411cf8d0c20bc6294a07ce6518"} Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.423238 4841 
scope.go:117] "RemoveContainer" containerID="fdb2201f290d04899f9d0a955db359bc157600c136bb62af54c243cc1c5c322a" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.423377 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.424438 4841 status_manager.go:851] "Failed to get status for pod" podUID="293f3a0e-401d-440f-8321-0aac18b90219" pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-zsbmr\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.425285 4841 status_manager.go:851] "Failed to get status for pod" podUID="ab5fd5c2-7ff5-4184-983f-6a47828ccf1a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.425844 4841 status_manager.go:851] "Failed to get status for pod" podUID="ce87b469-e7d3-497c-890f-9824b49efaca" pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6db76d9f69-c8gf2\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.443359 4841 status_manager.go:851] "Failed to get status for pod" podUID="293f3a0e-401d-440f-8321-0aac18b90219" pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-zsbmr\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:36 crc 
kubenswrapper[4841]: I0313 09:17:36.444139 4841 status_manager.go:851] "Failed to get status for pod" podUID="ab5fd5c2-7ff5-4184-983f-6a47828ccf1a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.444619 4841 status_manager.go:851] "Failed to get status for pod" podUID="ce87b469-e7d3-497c-890f-9824b49efaca" pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6db76d9f69-c8gf2\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.452521 4841 scope.go:117] "RemoveContainer" containerID="fdb2201f290d04899f9d0a955db359bc157600c136bb62af54c243cc1c5c322a" Mar 13 09:17:36 crc kubenswrapper[4841]: E0313 09:17:36.453041 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdb2201f290d04899f9d0a955db359bc157600c136bb62af54c243cc1c5c322a\": container with ID starting with fdb2201f290d04899f9d0a955db359bc157600c136bb62af54c243cc1c5c322a not found: ID does not exist" containerID="fdb2201f290d04899f9d0a955db359bc157600c136bb62af54c243cc1c5c322a" Mar 13 09:17:36 crc kubenswrapper[4841]: I0313 09:17:36.453100 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdb2201f290d04899f9d0a955db359bc157600c136bb62af54c243cc1c5c322a"} err="failed to get container status \"fdb2201f290d04899f9d0a955db359bc157600c136bb62af54c243cc1c5c322a\": rpc error: code = NotFound desc = could not find container \"fdb2201f290d04899f9d0a955db359bc157600c136bb62af54c243cc1c5c322a\": container with ID starting with 
fdb2201f290d04899f9d0a955db359bc157600c136bb62af54c243cc1c5c322a not found: ID does not exist" Mar 13 09:17:36 crc kubenswrapper[4841]: E0313 09:17:36.711328 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="6.4s" Mar 13 09:17:38 crc kubenswrapper[4841]: I0313 09:17:37.999880 4841 status_manager.go:851] "Failed to get status for pod" podUID="293f3a0e-401d-440f-8321-0aac18b90219" pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-zsbmr\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:38 crc kubenswrapper[4841]: I0313 09:17:38.000443 4841 status_manager.go:851] "Failed to get status for pod" podUID="ab5fd5c2-7ff5-4184-983f-6a47828ccf1a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:38 crc kubenswrapper[4841]: I0313 09:17:38.001506 4841 status_manager.go:851] "Failed to get status for pod" podUID="ce87b469-e7d3-497c-890f-9824b49efaca" pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6db76d9f69-c8gf2\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:40 crc kubenswrapper[4841]: I0313 09:17:40.994367 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 09:17:40 crc kubenswrapper[4841]: I0313 09:17:40.995658 4841 status_manager.go:851] "Failed to get status for pod" podUID="293f3a0e-401d-440f-8321-0aac18b90219" pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-zsbmr\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:40 crc kubenswrapper[4841]: I0313 09:17:40.996188 4841 status_manager.go:851] "Failed to get status for pod" podUID="ab5fd5c2-7ff5-4184-983f-6a47828ccf1a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:40 crc kubenswrapper[4841]: I0313 09:17:40.996748 4841 status_manager.go:851] "Failed to get status for pod" podUID="ce87b469-e7d3-497c-890f-9824b49efaca" pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6db76d9f69-c8gf2\": dial tcp 38.102.83.106:6443: connect: connection refused" Mar 13 09:17:41 crc kubenswrapper[4841]: I0313 09:17:41.018223 4841 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="53f667de-6ffe-4f8a-b98f-944f055847c5" Mar 13 09:17:41 crc kubenswrapper[4841]: I0313 09:17:41.018299 4841 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="53f667de-6ffe-4f8a-b98f-944f055847c5" Mar 13 09:17:41 crc kubenswrapper[4841]: E0313 09:17:41.019000 4841 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.106:6443: connect: 
connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 09:17:41 crc kubenswrapper[4841]: I0313 09:17:41.019779 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 09:17:41 crc kubenswrapper[4841]: W0313 09:17:41.048579 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-f464b79eda8a3ef057c21abe13d68760edb27114da703fc00c62fc608224c010 WatchSource:0}: Error finding container f464b79eda8a3ef057c21abe13d68760edb27114da703fc00c62fc608224c010: Status 404 returned error can't find the container with id f464b79eda8a3ef057c21abe13d68760edb27114da703fc00c62fc608224c010
Mar 13 09:17:41 crc kubenswrapper[4841]: I0313 09:17:41.467652 4841 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="56e5512779c6f36fdbe7771f5f836ad35d484dfbe6b2a4ae946f17c88e1a27ec" exitCode=0
Mar 13 09:17:41 crc kubenswrapper[4841]: I0313 09:17:41.467723 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"56e5512779c6f36fdbe7771f5f836ad35d484dfbe6b2a4ae946f17c88e1a27ec"}
Mar 13 09:17:41 crc kubenswrapper[4841]: I0313 09:17:41.468146 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f464b79eda8a3ef057c21abe13d68760edb27114da703fc00c62fc608224c010"}
Mar 13 09:17:41 crc kubenswrapper[4841]: I0313 09:17:41.468578 4841 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="53f667de-6ffe-4f8a-b98f-944f055847c5"
Mar 13 09:17:41 crc kubenswrapper[4841]: I0313 09:17:41.468610 4841 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="53f667de-6ffe-4f8a-b98f-944f055847c5"
Mar 13 09:17:41 crc kubenswrapper[4841]: E0313 09:17:41.469051 4841 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 09:17:41 crc kubenswrapper[4841]: I0313 09:17:41.469062 4841 status_manager.go:851] "Failed to get status for pod" podUID="293f3a0e-401d-440f-8321-0aac18b90219" pod="openshift-authentication/oauth-openshift-558db77b4-zsbmr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-zsbmr\": dial tcp 38.102.83.106:6443: connect: connection refused"
Mar 13 09:17:41 crc kubenswrapper[4841]: I0313 09:17:41.469589 4841 status_manager.go:851] "Failed to get status for pod" podUID="ab5fd5c2-7ff5-4184-983f-6a47828ccf1a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused"
Mar 13 09:17:41 crc kubenswrapper[4841]: I0313 09:17:41.470060 4841 status_manager.go:851] "Failed to get status for pod" podUID="ce87b469-e7d3-497c-890f-9824b49efaca" pod="openshift-controller-manager/controller-manager-6db76d9f69-c8gf2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6db76d9f69-c8gf2\": dial tcp 38.102.83.106:6443: connect: connection refused"
Mar 13 09:17:42 crc kubenswrapper[4841]: I0313 09:17:42.480834 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9dbf1f74408df98f88823fef6112e0f1126de91f48ba2d6daea87871dae068a2"}
Mar 13 09:17:42 crc kubenswrapper[4841]: I0313 09:17:42.482255 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"af29d3e071dad9e023ac818dbb8e3e9db878820fa3eb8ea018effa9abfdfd0cb"}
Mar 13 09:17:42 crc kubenswrapper[4841]: I0313 09:17:42.482445 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"509aca95c2cb37c2c708d303d57ae93e12e3034c07c719a7f8ac1e5bc0a86d32"}
Mar 13 09:17:43 crc kubenswrapper[4841]: I0313 09:17:43.492215 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"17e1e77ca2e704d84c7bb8a1aa0e5a9f87c59c3bea695e1a7b72fb9dc381ae0d"}
Mar 13 09:17:43 crc kubenswrapper[4841]: I0313 09:17:43.492304 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"387c99f84c6d59788b99af25c20be420b41fcca098481f49f40a426991b87492"}
Mar 13 09:17:43 crc kubenswrapper[4841]: I0313 09:17:43.492335 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 09:17:43 crc kubenswrapper[4841]: I0313 09:17:43.492405 4841 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="53f667de-6ffe-4f8a-b98f-944f055847c5"
Mar 13 09:17:43 crc kubenswrapper[4841]: I0313 09:17:43.492430 4841 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="53f667de-6ffe-4f8a-b98f-944f055847c5"
Mar 13 09:17:43 crc kubenswrapper[4841]: I0313 09:17:43.494614 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 13 09:17:43 crc kubenswrapper[4841]: I0313 09:17:43.495974 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 13 09:17:43 crc kubenswrapper[4841]: I0313 09:17:43.496018 4841 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="835d958e1267f01e98aac57897a03c29b574dc64de5080416c896cd2f52430be" exitCode=1
Mar 13 09:17:43 crc kubenswrapper[4841]: I0313 09:17:43.496046 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"835d958e1267f01e98aac57897a03c29b574dc64de5080416c896cd2f52430be"}
Mar 13 09:17:43 crc kubenswrapper[4841]: I0313 09:17:43.496472 4841 scope.go:117] "RemoveContainer" containerID="835d958e1267f01e98aac57897a03c29b574dc64de5080416c896cd2f52430be"
Mar 13 09:17:43 crc kubenswrapper[4841]: I0313 09:17:43.989160 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 09:17:44 crc kubenswrapper[4841]: I0313 09:17:44.504378 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 13 09:17:44 crc kubenswrapper[4841]: I0313 09:17:44.505409 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 13 09:17:44 crc kubenswrapper[4841]: I0313 09:17:44.505458 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"40ede05a26dcd96a59391601d58a93ba790b54382586ef1169b7c984ed754b36"}
Mar 13 09:17:46 crc kubenswrapper[4841]: I0313 09:17:46.020732 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 09:17:46 crc kubenswrapper[4841]: I0313 09:17:46.020818 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 09:17:46 crc kubenswrapper[4841]: I0313 09:17:46.026922 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 09:17:47 crc kubenswrapper[4841]: I0313 09:17:47.403475 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 09:17:48 crc kubenswrapper[4841]: I0313 09:17:48.504674 4841 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 09:17:48 crc kubenswrapper[4841]: I0313 09:17:48.530433 4841 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="53f667de-6ffe-4f8a-b98f-944f055847c5"
Mar 13 09:17:48 crc kubenswrapper[4841]: I0313 09:17:48.530469 4841 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="53f667de-6ffe-4f8a-b98f-944f055847c5"
Mar 13 09:17:48 crc kubenswrapper[4841]: I0313 09:17:48.550980 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 09:17:48 crc kubenswrapper[4841]: I0313 09:17:48.594471 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="db134d7e-5891-4028-8cd6-e4d219ba9b28"
Mar 13 09:17:49 crc kubenswrapper[4841]: I0313 09:17:49.538032 4841 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="53f667de-6ffe-4f8a-b98f-944f055847c5"
Mar 13 09:17:49 crc kubenswrapper[4841]: I0313 09:17:49.538070 4841 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="53f667de-6ffe-4f8a-b98f-944f055847c5"
Mar 13 09:17:49 crc kubenswrapper[4841]: I0313 09:17:49.541521 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="db134d7e-5891-4028-8cd6-e4d219ba9b28"
Mar 13 09:17:53 crc kubenswrapper[4841]: I0313 09:17:53.988736 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 09:17:53 crc kubenswrapper[4841]: I0313 09:17:53.988979 4841 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 13 09:17:53 crc kubenswrapper[4841]: I0313 09:17:53.989185 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 13 09:17:55 crc kubenswrapper[4841]: I0313 09:17:55.068847 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 13 09:17:56 crc kubenswrapper[4841]: I0313 09:17:56.134063 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 13 09:17:56 crc kubenswrapper[4841]: I0313 09:17:56.300319 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 13 09:17:56 crc kubenswrapper[4841]: I0313 09:17:56.857180 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 13 09:17:57 crc kubenswrapper[4841]: I0313 09:17:57.214597 4841 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 13 09:17:57 crc kubenswrapper[4841]: I0313 09:17:57.493076 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 13 09:17:59 crc kubenswrapper[4841]: I0313 09:17:59.343772 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 13 09:17:59 crc kubenswrapper[4841]: I0313 09:17:59.924879 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 13 09:18:00 crc kubenswrapper[4841]: I0313 09:18:00.049563 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 13 09:18:00 crc kubenswrapper[4841]: I0313 09:18:00.282090 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 13 09:18:00 crc kubenswrapper[4841]: I0313 09:18:00.533672 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 13 09:18:00 crc kubenswrapper[4841]: I0313 09:18:00.553892 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 13 09:18:01 crc kubenswrapper[4841]: I0313 09:18:01.356387 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 13 09:18:01 crc kubenswrapper[4841]: I0313 09:18:01.532651 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 13 09:18:01 crc kubenswrapper[4841]: I0313 09:18:01.678926 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 13 09:18:01 crc kubenswrapper[4841]: I0313 09:18:01.684549 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 13 09:18:02 crc kubenswrapper[4841]: I0313 09:18:02.164278 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 13 09:18:02 crc kubenswrapper[4841]: I0313 09:18:02.209651 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 13 09:18:02 crc kubenswrapper[4841]: I0313 09:18:02.230895 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 13 09:18:02 crc kubenswrapper[4841]: I0313 09:18:02.309400 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 13 09:18:02 crc kubenswrapper[4841]: I0313 09:18:02.353405 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 13 09:18:02 crc kubenswrapper[4841]: I0313 09:18:02.389967 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 13 09:18:02 crc kubenswrapper[4841]: I0313 09:18:02.410889 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 13 09:18:02 crc kubenswrapper[4841]: I0313 09:18:02.590995 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 13 09:18:02 crc kubenswrapper[4841]: I0313 09:18:02.611812 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 13 09:18:02 crc kubenswrapper[4841]: I0313 09:18:02.932766 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 13 09:18:02 crc kubenswrapper[4841]: I0313 09:18:02.933243 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 13 09:18:03 crc kubenswrapper[4841]: I0313 09:18:03.201294 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 13 09:18:03 crc kubenswrapper[4841]: I0313 09:18:03.400688 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.022936 4841 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.022988 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.028580 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.028994 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.028650 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.028846 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.047351 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.192817 4841 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.263525 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.413844 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.556724 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.625533 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.715389 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.722447 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.745132 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.794406 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.811358 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.833420 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.868170 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.880400 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.906224 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.916661 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.922327 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.923970 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 13 09:18:04 crc kubenswrapper[4841]: I0313 09:18:04.943025 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 13 09:18:05 crc kubenswrapper[4841]: I0313 09:18:05.015794 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 13 09:18:05 crc kubenswrapper[4841]: I0313 09:18:05.050873 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 13 09:18:05 crc kubenswrapper[4841]: I0313 09:18:05.097122 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 13 09:18:05 crc kubenswrapper[4841]: I0313 09:18:05.137899 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 13 09:18:05 crc kubenswrapper[4841]: I0313 09:18:05.185381 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 13 09:18:05 crc kubenswrapper[4841]: I0313 09:18:05.251612 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 13 09:18:05 crc kubenswrapper[4841]: I0313 09:18:05.347502 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 13 09:18:05 crc kubenswrapper[4841]: I0313 09:18:05.347786 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 13 09:18:05 crc kubenswrapper[4841]: I0313 09:18:05.351221 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 13 09:18:05 crc kubenswrapper[4841]: I0313 09:18:05.450481 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 13 09:18:05 crc kubenswrapper[4841]: I0313 09:18:05.567283 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 13 09:18:05 crc kubenswrapper[4841]: I0313 09:18:05.725027 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 13 09:18:05 crc kubenswrapper[4841]: I0313 09:18:05.770495 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 13 09:18:05 crc kubenswrapper[4841]: I0313 09:18:05.790101 4841 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 13 09:18:05 crc kubenswrapper[4841]: I0313 09:18:05.794588 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-zsbmr"]
Mar 13 09:18:05 crc kubenswrapper[4841]: I0313 09:18:05.794651 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 13 09:18:05 crc kubenswrapper[4841]: I0313 09:18:05.798593 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 09:18:05 crc kubenswrapper[4841]: I0313 09:18:05.809957 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=17.809941737 podStartE2EDuration="17.809941737s" podCreationTimestamp="2026-03-13 09:17:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:18:05.808450231 +0000 UTC m=+368.538350432" watchObservedRunningTime="2026-03-13 09:18:05.809941737 +0000 UTC m=+368.539841928"
Mar 13 09:18:05 crc kubenswrapper[4841]: I0313 09:18:05.889018 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 13 09:18:06 crc kubenswrapper[4841]: I0313 09:18:06.003170 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 13 09:18:06 crc kubenswrapper[4841]: I0313 09:18:06.003967 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="293f3a0e-401d-440f-8321-0aac18b90219" path="/var/lib/kubelet/pods/293f3a0e-401d-440f-8321-0aac18b90219/volumes"
Mar 13 09:18:06 crc kubenswrapper[4841]: I0313 09:18:06.098737 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 13 09:18:06 crc kubenswrapper[4841]: I0313 09:18:06.124402 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 13 09:18:06 crc kubenswrapper[4841]: I0313 09:18:06.126663 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 13 09:18:06 crc kubenswrapper[4841]: I0313 09:18:06.158381 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 13 09:18:06 crc kubenswrapper[4841]: I0313 09:18:06.223973 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 13 09:18:06 crc kubenswrapper[4841]: I0313 09:18:06.298373 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 13 09:18:06 crc kubenswrapper[4841]: I0313 09:18:06.437954 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 13 09:18:06 crc kubenswrapper[4841]: I0313 09:18:06.554216 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 13 09:18:06 crc kubenswrapper[4841]: I0313 09:18:06.771097 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 13 09:18:06 crc kubenswrapper[4841]: I0313 09:18:06.788324 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 13 09:18:06 crc kubenswrapper[4841]: I0313 09:18:06.844010 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 13 09:18:06 crc kubenswrapper[4841]: I0313 09:18:06.856714 4841 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 13 09:18:06 crc kubenswrapper[4841]: I0313 09:18:06.872228 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 13 09:18:07 crc kubenswrapper[4841]: I0313 09:18:07.012980 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 13 09:18:07 crc kubenswrapper[4841]: I0313 09:18:07.049830 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 13 09:18:07 crc kubenswrapper[4841]: I0313 09:18:07.116144 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 13 09:18:07 crc kubenswrapper[4841]: I0313 09:18:07.145214 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 13 09:18:07 crc kubenswrapper[4841]: I0313 09:18:07.172753 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 13 09:18:07 crc kubenswrapper[4841]: I0313 09:18:07.276273 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 13 09:18:07 crc kubenswrapper[4841]: I0313 09:18:07.305491 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 13 09:18:07 crc kubenswrapper[4841]: I0313 09:18:07.358325 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 13 09:18:07 crc kubenswrapper[4841]: I0313 09:18:07.403078 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 13 09:18:07 crc kubenswrapper[4841]: I0313 09:18:07.405039 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 13 09:18:07 crc kubenswrapper[4841]: I0313 09:18:07.432505 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 13 09:18:07 crc kubenswrapper[4841]: I0313 09:18:07.457351 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 13 09:18:07 crc kubenswrapper[4841]: I0313 09:18:07.503436 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 13 09:18:07 crc kubenswrapper[4841]: I0313 09:18:07.511214 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 13 09:18:07 crc kubenswrapper[4841]: I0313 09:18:07.665236 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 13 09:18:07 crc kubenswrapper[4841]: I0313 09:18:07.708526 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 13 09:18:07 crc kubenswrapper[4841]: I0313 09:18:07.753166 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 13 09:18:07 crc kubenswrapper[4841]: I0313 09:18:07.768098 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 13 09:18:07 crc kubenswrapper[4841]: I0313 09:18:07.806953 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 13 09:18:07 crc kubenswrapper[4841]: I0313 09:18:07.834581 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 13 09:18:07 crc kubenswrapper[4841]: I0313 09:18:07.962579 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 13 09:18:07 crc kubenswrapper[4841]: I0313 09:18:07.989979 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.035462 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.112132 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.223019 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.240882 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.396197 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.407764 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.418234 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.421086 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.430704 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.430770 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.453834 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.592167 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.623562 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.633901 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.662156 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.664429 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.664902 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.752351 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.800896 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.801480 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.886135 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.928155 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.937542 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 13 09:18:08 crc kubenswrapper[4841]: I0313 09:18:08.997902 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.095252 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.110881 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.140065 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.181023 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.195084 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.273233 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.299452 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.342472 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.353579 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.399665 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.453771 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.492215 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.510556 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.532197 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.539724 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.554194 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.649634 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.773099 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.804733 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.853053 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.907122 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.919898 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.929207 4841
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.954058 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 13 09:18:09 crc kubenswrapper[4841]: I0313 09:18:09.979207 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.028045 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.133334 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.170596 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.178186 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-db548d47c-b7pt6"] Mar 13 09:18:10 crc kubenswrapper[4841]: E0313 09:18:10.178585 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="293f3a0e-401d-440f-8321-0aac18b90219" containerName="oauth-openshift" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.178615 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="293f3a0e-401d-440f-8321-0aac18b90219" containerName="oauth-openshift" Mar 13 09:18:10 crc kubenswrapper[4841]: E0313 09:18:10.178647 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab5fd5c2-7ff5-4184-983f-6a47828ccf1a" containerName="installer" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.178660 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab5fd5c2-7ff5-4184-983f-6a47828ccf1a" 
containerName="installer" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.178824 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab5fd5c2-7ff5-4184-983f-6a47828ccf1a" containerName="installer" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.178875 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="293f3a0e-401d-440f-8321-0aac18b90219" containerName="oauth-openshift" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.179462 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.190446 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.190494 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.190518 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.193981 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.194065 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.193984 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.199932 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 13 09:18:10 crc 
kubenswrapper[4841]: I0313 09:18:10.199997 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.200018 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.199948 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.200127 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.201083 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.206584 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-db548d47c-b7pt6"] Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.215815 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.231918 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.255141 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.258002 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.284965 4841 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.303602 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.303648 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4a8e3332-e590-43b1-ac92-e37861f0738a-audit-policies\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.303671 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-user-template-error\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.303689 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.303708 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-system-router-certs\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.303989 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-system-service-ca\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.304023 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-user-template-login\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.304059 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-system-session\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.304097 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.304225 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.304314 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.304370 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4a8e3332-e590-43b1-ac92-e37861f0738a-audit-dir\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.304453 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: 
\"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.304508 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk5tt\" (UniqueName: \"kubernetes.io/projected/4a8e3332-e590-43b1-ac92-e37861f0738a-kube-api-access-wk5tt\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.333560 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.391819 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.405686 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-system-service-ca\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.405764 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-user-template-login\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.405821 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-system-session\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.405881 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.405958 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.406004 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.406055 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4a8e3332-e590-43b1-ac92-e37861f0738a-audit-dir\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 
13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.406127 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.406172 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk5tt\" (UniqueName: \"kubernetes.io/projected/4a8e3332-e590-43b1-ac92-e37861f0738a-kube-api-access-wk5tt\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.406291 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.406348 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4a8e3332-e590-43b1-ac92-e37861f0738a-audit-policies\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.406395 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-user-template-error\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.406440 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.406481 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-system-service-ca\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.406493 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-system-router-certs\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.407093 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " 
pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.407334 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4a8e3332-e590-43b1-ac92-e37861f0738a-audit-dir\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.407819 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.407909 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4a8e3332-e590-43b1-ac92-e37861f0738a-audit-policies\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.411758 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.411825 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-system-session\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.412065 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-user-template-error\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.412192 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-system-router-certs\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.412798 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-user-template-login\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.413627 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc 
kubenswrapper[4841]: I0313 09:18:10.413745 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.416006 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4a8e3332-e590-43b1-ac92-e37861f0738a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.425877 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk5tt\" (UniqueName: \"kubernetes.io/projected/4a8e3332-e590-43b1-ac92-e37861f0738a-kube-api-access-wk5tt\") pod \"oauth-openshift-db548d47c-b7pt6\" (UID: \"4a8e3332-e590-43b1-ac92-e37861f0738a\") " pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.491657 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.503963 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.516798 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.687677 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.723996 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.756582 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.757510 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.816651 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.829295 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.916717 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.925943 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.953475 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 
09:18:10.971215 4841 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 13 09:18:10 crc kubenswrapper[4841]: I0313 09:18:10.971693 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://79b8a5c311cd6792650b6ceffeb98e86d08bb151dd93d911aa1621f4eedc6b5b" gracePeriod=5 Mar 13 09:18:11 crc kubenswrapper[4841]: I0313 09:18:11.104053 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 13 09:18:11 crc kubenswrapper[4841]: I0313 09:18:11.113024 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 13 09:18:11 crc kubenswrapper[4841]: I0313 09:18:11.128609 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 13 09:18:11 crc kubenswrapper[4841]: I0313 09:18:11.133161 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 13 09:18:11 crc kubenswrapper[4841]: I0313 09:18:11.205874 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 09:18:11 crc kubenswrapper[4841]: I0313 09:18:11.290944 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 13 09:18:11 crc kubenswrapper[4841]: I0313 09:18:11.304731 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 13 09:18:11 crc kubenswrapper[4841]: I0313 09:18:11.370950 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 13 
09:18:11 crc kubenswrapper[4841]: I0313 09:18:11.422657 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 13 09:18:11 crc kubenswrapper[4841]: I0313 09:18:11.450873 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 09:18:11 crc kubenswrapper[4841]: I0313 09:18:11.480945 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 13 09:18:11 crc kubenswrapper[4841]: I0313 09:18:11.602592 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 09:18:11 crc kubenswrapper[4841]: I0313 09:18:11.628994 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 09:18:11 crc kubenswrapper[4841]: I0313 09:18:11.687443 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 13 09:18:11 crc kubenswrapper[4841]: I0313 09:18:11.939173 4841 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 13 09:18:12 crc kubenswrapper[4841]: I0313 09:18:12.010819 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 13 09:18:12 crc kubenswrapper[4841]: I0313 09:18:12.010934 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 13 09:18:12 crc kubenswrapper[4841]: I0313 09:18:12.218979 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 13 09:18:12 crc kubenswrapper[4841]: I0313 09:18:12.234810 4841 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 13 09:18:12 crc kubenswrapper[4841]: I0313 09:18:12.283801 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 13 09:18:12 crc kubenswrapper[4841]: I0313 09:18:12.390397 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 13 09:18:12 crc kubenswrapper[4841]: I0313 09:18:12.408450 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 13 09:18:12 crc kubenswrapper[4841]: I0313 09:18:12.447198 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 13 09:18:12 crc kubenswrapper[4841]: I0313 09:18:12.525333 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 13 09:18:12 crc kubenswrapper[4841]: I0313 09:18:12.784716 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 13 09:18:12 crc kubenswrapper[4841]: I0313 09:18:12.827191 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 13 09:18:12 crc kubenswrapper[4841]: I0313 09:18:12.921332 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-db548d47c-b7pt6"] Mar 13 09:18:13 crc kubenswrapper[4841]: I0313 09:18:13.034660 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 13 09:18:13 crc kubenswrapper[4841]: I0313 09:18:13.086230 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 13 09:18:13 crc kubenswrapper[4841]: I0313 09:18:13.094524 4841 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" event={"ID":"4a8e3332-e590-43b1-ac92-e37861f0738a","Type":"ContainerStarted","Data":"e6981cca23d8025e108a5ae2d06c87afcc71a0458a935bccd4e4c6f8189c4bae"} Mar 13 09:18:13 crc kubenswrapper[4841]: I0313 09:18:13.149859 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 13 09:18:13 crc kubenswrapper[4841]: I0313 09:18:13.170284 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 13 09:18:13 crc kubenswrapper[4841]: I0313 09:18:13.213520 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 13 09:18:13 crc kubenswrapper[4841]: I0313 09:18:13.351505 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 13 09:18:13 crc kubenswrapper[4841]: I0313 09:18:13.360441 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 09:18:13 crc kubenswrapper[4841]: I0313 09:18:13.680805 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 13 09:18:13 crc kubenswrapper[4841]: I0313 09:18:13.704963 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 13 09:18:13 crc kubenswrapper[4841]: I0313 09:18:13.774433 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 13 09:18:13 crc kubenswrapper[4841]: I0313 09:18:13.985543 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 13 09:18:13 crc 
kubenswrapper[4841]: I0313 09:18:13.989294 4841 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 13 09:18:13 crc kubenswrapper[4841]: I0313 09:18:13.989388 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 13 09:18:13 crc kubenswrapper[4841]: I0313 09:18:13.989466 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:18:13 crc kubenswrapper[4841]: I0313 09:18:13.990361 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"40ede05a26dcd96a59391601d58a93ba790b54382586ef1169b7c984ed754b36"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 13 09:18:13 crc kubenswrapper[4841]: I0313 09:18:13.990484 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://40ede05a26dcd96a59391601d58a93ba790b54382586ef1169b7c984ed754b36" gracePeriod=30 Mar 13 09:18:14 crc kubenswrapper[4841]: I0313 09:18:14.104554 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" 
event={"ID":"4a8e3332-e590-43b1-ac92-e37861f0738a","Type":"ContainerStarted","Data":"28c1a21c8fc47d3adb3c503da5e03998b4ed3c860636601dc2dc9a06300ea026"} Mar 13 09:18:14 crc kubenswrapper[4841]: I0313 09:18:14.104930 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:14 crc kubenswrapper[4841]: I0313 09:18:14.121921 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" Mar 13 09:18:14 crc kubenswrapper[4841]: I0313 09:18:14.126140 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-db548d47c-b7pt6" podStartSLOduration=64.126125798 podStartE2EDuration="1m4.126125798s" podCreationTimestamp="2026-03-13 09:17:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:18:14.124252141 +0000 UTC m=+376.854152342" watchObservedRunningTime="2026-03-13 09:18:14.126125798 +0000 UTC m=+376.856025989" Mar 13 09:18:14 crc kubenswrapper[4841]: I0313 09:18:14.128204 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 13 09:18:14 crc kubenswrapper[4841]: I0313 09:18:14.129312 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 13 09:18:14 crc kubenswrapper[4841]: I0313 09:18:14.150336 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 13 09:18:14 crc kubenswrapper[4841]: I0313 09:18:14.186705 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 13 09:18:14 crc kubenswrapper[4841]: I0313 
09:18:14.193682 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 13 09:18:14 crc kubenswrapper[4841]: I0313 09:18:14.318435 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 09:18:14 crc kubenswrapper[4841]: I0313 09:18:14.319283 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 13 09:18:14 crc kubenswrapper[4841]: I0313 09:18:14.556071 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 13 09:18:14 crc kubenswrapper[4841]: I0313 09:18:14.693001 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 13 09:18:14 crc kubenswrapper[4841]: I0313 09:18:14.897593 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 13 09:18:14 crc kubenswrapper[4841]: I0313 09:18:14.941985 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 13 09:18:14 crc kubenswrapper[4841]: I0313 09:18:14.984756 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 13 09:18:15 crc kubenswrapper[4841]: I0313 09:18:15.002321 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 09:18:15 crc kubenswrapper[4841]: I0313 09:18:15.210105 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 09:18:15 crc kubenswrapper[4841]: I0313 09:18:15.214227 4841 reflector.go:368] Caches populated for *v1.RuntimeClass from 
k8s.io/client-go/informers/factory.go:160 Mar 13 09:18:15 crc kubenswrapper[4841]: I0313 09:18:15.325431 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 13 09:18:15 crc kubenswrapper[4841]: I0313 09:18:15.429876 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 13 09:18:15 crc kubenswrapper[4841]: I0313 09:18:15.453863 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 09:18:15 crc kubenswrapper[4841]: I0313 09:18:15.644572 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 13 09:18:15 crc kubenswrapper[4841]: I0313 09:18:15.712781 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 13 09:18:15 crc kubenswrapper[4841]: I0313 09:18:15.817015 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 13 09:18:15 crc kubenswrapper[4841]: I0313 09:18:15.827998 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 13 09:18:15 crc kubenswrapper[4841]: I0313 09:18:15.885749 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 13 09:18:15 crc kubenswrapper[4841]: I0313 09:18:15.967695 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.017327 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.092476 4841 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.092566 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.121374 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.121421 4841 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="79b8a5c311cd6792650b6ceffeb98e86d08bb151dd93d911aa1621f4eedc6b5b" exitCode=137 Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.121537 4841 scope.go:117] "RemoveContainer" containerID="79b8a5c311cd6792650b6ceffeb98e86d08bb151dd93d911aa1621f4eedc6b5b" Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.121551 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.141558 4841 scope.go:117] "RemoveContainer" containerID="79b8a5c311cd6792650b6ceffeb98e86d08bb151dd93d911aa1621f4eedc6b5b" Mar 13 09:18:16 crc kubenswrapper[4841]: E0313 09:18:16.142029 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79b8a5c311cd6792650b6ceffeb98e86d08bb151dd93d911aa1621f4eedc6b5b\": container with ID starting with 79b8a5c311cd6792650b6ceffeb98e86d08bb151dd93d911aa1621f4eedc6b5b not found: ID does not exist" containerID="79b8a5c311cd6792650b6ceffeb98e86d08bb151dd93d911aa1621f4eedc6b5b" Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.142081 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b8a5c311cd6792650b6ceffeb98e86d08bb151dd93d911aa1621f4eedc6b5b"} err="failed to get container status \"79b8a5c311cd6792650b6ceffeb98e86d08bb151dd93d911aa1621f4eedc6b5b\": rpc error: code = NotFound desc = could not find container \"79b8a5c311cd6792650b6ceffeb98e86d08bb151dd93d911aa1621f4eedc6b5b\": container with ID starting with 79b8a5c311cd6792650b6ceffeb98e86d08bb151dd93d911aa1621f4eedc6b5b not found: ID does not exist" Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.191190 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.191255 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.191343 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.191369 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.191402 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.191466 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.191466 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.191500 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.191524 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.192545 4841 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.192660 4841 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.192687 4841 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.192700 4841 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.201353 4841 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.229508 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.294173 4841 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.493154 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 13 09:18:16 crc kubenswrapper[4841]: I0313 09:18:16.751354 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 13 09:18:17 crc kubenswrapper[4841]: I0313 09:18:17.276426 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 13 09:18:18 crc kubenswrapper[4841]: I0313 09:18:18.004982 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 13 09:18:18 crc kubenswrapper[4841]: I0313 09:18:18.032993 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 13 09:18:44 crc kubenswrapper[4841]: I0313 09:18:44.308290 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 13 09:18:44 crc kubenswrapper[4841]: I0313 09:18:44.310657 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 09:18:44 crc kubenswrapper[4841]: I0313 09:18:44.312365 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 13 09:18:44 crc kubenswrapper[4841]: I0313 09:18:44.312451 4841 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="40ede05a26dcd96a59391601d58a93ba790b54382586ef1169b7c984ed754b36" exitCode=137 Mar 13 09:18:44 crc kubenswrapper[4841]: I0313 09:18:44.312502 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"40ede05a26dcd96a59391601d58a93ba790b54382586ef1169b7c984ed754b36"} Mar 13 09:18:44 crc kubenswrapper[4841]: I0313 09:18:44.312548 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e0613db8087119c915858dfc8f85c13bc8ecaf3e533b316870b9684beba16361"} Mar 13 09:18:44 crc kubenswrapper[4841]: I0313 09:18:44.312581 4841 scope.go:117] "RemoveContainer" containerID="835d958e1267f01e98aac57897a03c29b574dc64de5080416c896cd2f52430be" Mar 13 09:18:45 crc kubenswrapper[4841]: I0313 09:18:45.319082 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 13 09:18:45 crc kubenswrapper[4841]: I0313 09:18:45.320251 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 09:18:47 crc kubenswrapper[4841]: I0313 09:18:47.404006 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:18:53 crc kubenswrapper[4841]: I0313 09:18:53.989400 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:18:54 crc kubenswrapper[4841]: I0313 09:18:54.004998 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:18:54 crc kubenswrapper[4841]: I0313 09:18:54.378478 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 09:19:02 crc kubenswrapper[4841]: I0313 09:19:02.839583 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556558-4xbwp"] Mar 13 09:19:02 crc kubenswrapper[4841]: E0313 09:19:02.840035 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 09:19:02 crc kubenswrapper[4841]: I0313 09:19:02.840046 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 09:19:02 crc kubenswrapper[4841]: I0313 09:19:02.840135 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 09:19:02 crc 
kubenswrapper[4841]: I0313 09:19:02.840500 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556558-4xbwp" Mar 13 09:19:02 crc kubenswrapper[4841]: I0313 09:19:02.844841 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 09:19:02 crc kubenswrapper[4841]: I0313 09:19:02.845398 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 09:19:02 crc kubenswrapper[4841]: I0313 09:19:02.845585 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 09:19:02 crc kubenswrapper[4841]: I0313 09:19:02.850025 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556558-4xbwp"] Mar 13 09:19:02 crc kubenswrapper[4841]: I0313 09:19:02.941162 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkwl7\" (UniqueName: \"kubernetes.io/projected/5c0a4d41-6264-4da2-a95c-2d94044862a0-kube-api-access-lkwl7\") pod \"auto-csr-approver-29556558-4xbwp\" (UID: \"5c0a4d41-6264-4da2-a95c-2d94044862a0\") " pod="openshift-infra/auto-csr-approver-29556558-4xbwp" Mar 13 09:19:03 crc kubenswrapper[4841]: I0313 09:19:03.042891 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkwl7\" (UniqueName: \"kubernetes.io/projected/5c0a4d41-6264-4da2-a95c-2d94044862a0-kube-api-access-lkwl7\") pod \"auto-csr-approver-29556558-4xbwp\" (UID: \"5c0a4d41-6264-4da2-a95c-2d94044862a0\") " pod="openshift-infra/auto-csr-approver-29556558-4xbwp" Mar 13 09:19:03 crc kubenswrapper[4841]: I0313 09:19:03.084167 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkwl7\" (UniqueName: \"kubernetes.io/projected/5c0a4d41-6264-4da2-a95c-2d94044862a0-kube-api-access-lkwl7\") pod 
\"auto-csr-approver-29556558-4xbwp\" (UID: \"5c0a4d41-6264-4da2-a95c-2d94044862a0\") " pod="openshift-infra/auto-csr-approver-29556558-4xbwp" Mar 13 09:19:03 crc kubenswrapper[4841]: I0313 09:19:03.154513 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556558-4xbwp" Mar 13 09:19:03 crc kubenswrapper[4841]: I0313 09:19:03.565568 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556558-4xbwp"] Mar 13 09:19:03 crc kubenswrapper[4841]: W0313 09:19:03.568821 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c0a4d41_6264_4da2_a95c_2d94044862a0.slice/crio-4bd2b94d9e201fd6709fb5c0a22f965b9693316e82e0b03ac5122d9cef9c00d2 WatchSource:0}: Error finding container 4bd2b94d9e201fd6709fb5c0a22f965b9693316e82e0b03ac5122d9cef9c00d2: Status 404 returned error can't find the container with id 4bd2b94d9e201fd6709fb5c0a22f965b9693316e82e0b03ac5122d9cef9c00d2 Mar 13 09:19:04 crc kubenswrapper[4841]: I0313 09:19:04.407736 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:19:04 crc kubenswrapper[4841]: I0313 09:19:04.407808 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:19:04 crc kubenswrapper[4841]: I0313 09:19:04.431289 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556558-4xbwp" 
event={"ID":"5c0a4d41-6264-4da2-a95c-2d94044862a0","Type":"ContainerStarted","Data":"4bd2b94d9e201fd6709fb5c0a22f965b9693316e82e0b03ac5122d9cef9c00d2"} Mar 13 09:19:05 crc kubenswrapper[4841]: I0313 09:19:05.438404 4841 generic.go:334] "Generic (PLEG): container finished" podID="5c0a4d41-6264-4da2-a95c-2d94044862a0" containerID="6c0aa4158655998115dffb79275b474bfa6a52419a51c1461a8691c2c590ffc0" exitCode=0 Mar 13 09:19:05 crc kubenswrapper[4841]: I0313 09:19:05.438473 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556558-4xbwp" event={"ID":"5c0a4d41-6264-4da2-a95c-2d94044862a0","Type":"ContainerDied","Data":"6c0aa4158655998115dffb79275b474bfa6a52419a51c1461a8691c2c590ffc0"} Mar 13 09:19:06 crc kubenswrapper[4841]: I0313 09:19:06.710128 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556558-4xbwp" Mar 13 09:19:06 crc kubenswrapper[4841]: I0313 09:19:06.788556 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkwl7\" (UniqueName: \"kubernetes.io/projected/5c0a4d41-6264-4da2-a95c-2d94044862a0-kube-api-access-lkwl7\") pod \"5c0a4d41-6264-4da2-a95c-2d94044862a0\" (UID: \"5c0a4d41-6264-4da2-a95c-2d94044862a0\") " Mar 13 09:19:06 crc kubenswrapper[4841]: I0313 09:19:06.793563 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c0a4d41-6264-4da2-a95c-2d94044862a0-kube-api-access-lkwl7" (OuterVolumeSpecName: "kube-api-access-lkwl7") pod "5c0a4d41-6264-4da2-a95c-2d94044862a0" (UID: "5c0a4d41-6264-4da2-a95c-2d94044862a0"). InnerVolumeSpecName "kube-api-access-lkwl7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:19:06 crc kubenswrapper[4841]: I0313 09:19:06.890088 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkwl7\" (UniqueName: \"kubernetes.io/projected/5c0a4d41-6264-4da2-a95c-2d94044862a0-kube-api-access-lkwl7\") on node \"crc\" DevicePath \"\"" Mar 13 09:19:07 crc kubenswrapper[4841]: I0313 09:19:07.454892 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556558-4xbwp" Mar 13 09:19:07 crc kubenswrapper[4841]: I0313 09:19:07.455368 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556558-4xbwp" event={"ID":"5c0a4d41-6264-4da2-a95c-2d94044862a0","Type":"ContainerDied","Data":"4bd2b94d9e201fd6709fb5c0a22f965b9693316e82e0b03ac5122d9cef9c00d2"} Mar 13 09:19:07 crc kubenswrapper[4841]: I0313 09:19:07.455421 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bd2b94d9e201fd6709fb5c0a22f965b9693316e82e0b03ac5122d9cef9c00d2" Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.471513 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l6qx4"] Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.472453 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l6qx4" podUID="81521243-f7e1-4360-9b12-8047988a69dd" containerName="registry-server" containerID="cri-o://f8b2d47f9b4c37d297302ce64c0f0c86db444f63756040c4f6a870113a679b28" gracePeriod=30 Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.489631 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sbx6r"] Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.489904 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sbx6r" 
podUID="cbd9953a-618b-4cd2-806b-c01e07c40fc2" containerName="registry-server" containerID="cri-o://7081422dd20bdd7244829e876963f21903d1e9db1589b8ec5a050cbea54cf15f" gracePeriod=30 Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.498077 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6t6zf"] Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.498545 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" podUID="fad88931-0cb1-40fd-b256-f9cd1c93a7e6" containerName="marketplace-operator" containerID="cri-o://ca8d720134276e45f89eb1988d60c9dd00b24ad319c7c2a11b282c50d94c6a0a" gracePeriod=30 Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.507194 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4n6sl"] Mar 13 09:19:27 crc kubenswrapper[4841]: E0313 09:19:27.507466 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0a4d41-6264-4da2-a95c-2d94044862a0" containerName="oc" Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.507481 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0a4d41-6264-4da2-a95c-2d94044862a0" containerName="oc" Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.507577 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c0a4d41-6264-4da2-a95c-2d94044862a0" containerName="oc" Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.507961 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4n6sl" Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.513137 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4gtt4"] Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.513472 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4gtt4" podUID="6e22cd83-5a44-4048-a618-4c06f3550ace" containerName="registry-server" containerID="cri-o://f01c0a6f121ad398d9cb4ed9b110cd0db457df5afe4f0bef1db46c6f8b4a1159" gracePeriod=30 Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.520202 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4n6sl"] Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.530783 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w95m9"] Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.532212 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w95m9" podUID="67d2682a-a65f-42e2-875a-b4247bfff054" containerName="registry-server" containerID="cri-o://140aa11126c12015df8cca4639a6bd3d5f246e66664fbec13c6e3988d78ba0ed" gracePeriod=30 Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.594243 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqthg\" (UniqueName: \"kubernetes.io/projected/f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45-kube-api-access-gqthg\") pod \"marketplace-operator-79b997595-4n6sl\" (UID: \"f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45\") " pod="openshift-marketplace/marketplace-operator-79b997595-4n6sl" Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.594365 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4n6sl\" (UID: \"f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45\") " pod="openshift-marketplace/marketplace-operator-79b997595-4n6sl" Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.594409 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4n6sl\" (UID: \"f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45\") " pod="openshift-marketplace/marketplace-operator-79b997595-4n6sl" Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.695289 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqthg\" (UniqueName: \"kubernetes.io/projected/f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45-kube-api-access-gqthg\") pod \"marketplace-operator-79b997595-4n6sl\" (UID: \"f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45\") " pod="openshift-marketplace/marketplace-operator-79b997595-4n6sl" Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.695343 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4n6sl\" (UID: \"f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45\") " pod="openshift-marketplace/marketplace-operator-79b997595-4n6sl" Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.695372 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4n6sl\" (UID: \"f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-4n6sl" Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.696392 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4n6sl\" (UID: \"f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45\") " pod="openshift-marketplace/marketplace-operator-79b997595-4n6sl" Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.701450 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4n6sl\" (UID: \"f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45\") " pod="openshift-marketplace/marketplace-operator-79b997595-4n6sl" Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.713694 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqthg\" (UniqueName: \"kubernetes.io/projected/f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45-kube-api-access-gqthg\") pod \"marketplace-operator-79b997595-4n6sl\" (UID: \"f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45\") " pod="openshift-marketplace/marketplace-operator-79b997595-4n6sl" Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.831453 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4n6sl" Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.939081 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l6qx4" Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.943757 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sbx6r" Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.951918 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.960576 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4gtt4" Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.999080 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jg9k\" (UniqueName: \"kubernetes.io/projected/cbd9953a-618b-4cd2-806b-c01e07c40fc2-kube-api-access-7jg9k\") pod \"cbd9953a-618b-4cd2-806b-c01e07c40fc2\" (UID: \"cbd9953a-618b-4cd2-806b-c01e07c40fc2\") " Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.999161 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fad88931-0cb1-40fd-b256-f9cd1c93a7e6-marketplace-operator-metrics\") pod \"fad88931-0cb1-40fd-b256-f9cd1c93a7e6\" (UID: \"fad88931-0cb1-40fd-b256-f9cd1c93a7e6\") " Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.999195 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81521243-f7e1-4360-9b12-8047988a69dd-catalog-content\") pod \"81521243-f7e1-4360-9b12-8047988a69dd\" (UID: \"81521243-f7e1-4360-9b12-8047988a69dd\") " Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.999232 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs6tj\" (UniqueName: \"kubernetes.io/projected/6e22cd83-5a44-4048-a618-4c06f3550ace-kube-api-access-gs6tj\") pod \"6e22cd83-5a44-4048-a618-4c06f3550ace\" (UID: \"6e22cd83-5a44-4048-a618-4c06f3550ace\") " Mar 13 09:19:27 crc kubenswrapper[4841]: 
I0313 09:19:27.999281 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g26wp\" (UniqueName: \"kubernetes.io/projected/fad88931-0cb1-40fd-b256-f9cd1c93a7e6-kube-api-access-g26wp\") pod \"fad88931-0cb1-40fd-b256-f9cd1c93a7e6\" (UID: \"fad88931-0cb1-40fd-b256-f9cd1c93a7e6\") " Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.999324 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81521243-f7e1-4360-9b12-8047988a69dd-utilities\") pod \"81521243-f7e1-4360-9b12-8047988a69dd\" (UID: \"81521243-f7e1-4360-9b12-8047988a69dd\") " Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.999349 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms9xf\" (UniqueName: \"kubernetes.io/projected/81521243-f7e1-4360-9b12-8047988a69dd-kube-api-access-ms9xf\") pod \"81521243-f7e1-4360-9b12-8047988a69dd\" (UID: \"81521243-f7e1-4360-9b12-8047988a69dd\") " Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.999369 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e22cd83-5a44-4048-a618-4c06f3550ace-utilities\") pod \"6e22cd83-5a44-4048-a618-4c06f3550ace\" (UID: \"6e22cd83-5a44-4048-a618-4c06f3550ace\") " Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.999397 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fad88931-0cb1-40fd-b256-f9cd1c93a7e6-marketplace-trusted-ca\") pod \"fad88931-0cb1-40fd-b256-f9cd1c93a7e6\" (UID: \"fad88931-0cb1-40fd-b256-f9cd1c93a7e6\") " Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.999429 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/cbd9953a-618b-4cd2-806b-c01e07c40fc2-utilities\") pod \"cbd9953a-618b-4cd2-806b-c01e07c40fc2\" (UID: \"cbd9953a-618b-4cd2-806b-c01e07c40fc2\") " Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.999451 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbd9953a-618b-4cd2-806b-c01e07c40fc2-catalog-content\") pod \"cbd9953a-618b-4cd2-806b-c01e07c40fc2\" (UID: \"cbd9953a-618b-4cd2-806b-c01e07c40fc2\") " Mar 13 09:19:27 crc kubenswrapper[4841]: I0313 09:19:27.999475 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e22cd83-5a44-4048-a618-4c06f3550ace-catalog-content\") pod \"6e22cd83-5a44-4048-a618-4c06f3550ace\" (UID: \"6e22cd83-5a44-4048-a618-4c06f3550ace\") " Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.005639 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81521243-f7e1-4360-9b12-8047988a69dd-utilities" (OuterVolumeSpecName: "utilities") pod "81521243-f7e1-4360-9b12-8047988a69dd" (UID: "81521243-f7e1-4360-9b12-8047988a69dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.006931 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fad88931-0cb1-40fd-b256-f9cd1c93a7e6-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "fad88931-0cb1-40fd-b256-f9cd1c93a7e6" (UID: "fad88931-0cb1-40fd-b256-f9cd1c93a7e6"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.007868 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbd9953a-618b-4cd2-806b-c01e07c40fc2-utilities" (OuterVolumeSpecName: "utilities") pod "cbd9953a-618b-4cd2-806b-c01e07c40fc2" (UID: "cbd9953a-618b-4cd2-806b-c01e07c40fc2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.008741 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e22cd83-5a44-4048-a618-4c06f3550ace-utilities" (OuterVolumeSpecName: "utilities") pod "6e22cd83-5a44-4048-a618-4c06f3550ace" (UID: "6e22cd83-5a44-4048-a618-4c06f3550ace"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.012238 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e22cd83-5a44-4048-a618-4c06f3550ace-kube-api-access-gs6tj" (OuterVolumeSpecName: "kube-api-access-gs6tj") pod "6e22cd83-5a44-4048-a618-4c06f3550ace" (UID: "6e22cd83-5a44-4048-a618-4c06f3550ace"). InnerVolumeSpecName "kube-api-access-gs6tj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.022681 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fad88931-0cb1-40fd-b256-f9cd1c93a7e6-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "fad88931-0cb1-40fd-b256-f9cd1c93a7e6" (UID: "fad88931-0cb1-40fd-b256-f9cd1c93a7e6"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.026130 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fad88931-0cb1-40fd-b256-f9cd1c93a7e6-kube-api-access-g26wp" (OuterVolumeSpecName: "kube-api-access-g26wp") pod "fad88931-0cb1-40fd-b256-f9cd1c93a7e6" (UID: "fad88931-0cb1-40fd-b256-f9cd1c93a7e6"). InnerVolumeSpecName "kube-api-access-g26wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.033063 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w95m9" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.044449 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbd9953a-618b-4cd2-806b-c01e07c40fc2-kube-api-access-7jg9k" (OuterVolumeSpecName: "kube-api-access-7jg9k") pod "cbd9953a-618b-4cd2-806b-c01e07c40fc2" (UID: "cbd9953a-618b-4cd2-806b-c01e07c40fc2"). InnerVolumeSpecName "kube-api-access-7jg9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.051055 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e22cd83-5a44-4048-a618-4c06f3550ace-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e22cd83-5a44-4048-a618-4c06f3550ace" (UID: "6e22cd83-5a44-4048-a618-4c06f3550ace"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.058765 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81521243-f7e1-4360-9b12-8047988a69dd-kube-api-access-ms9xf" (OuterVolumeSpecName: "kube-api-access-ms9xf") pod "81521243-f7e1-4360-9b12-8047988a69dd" (UID: "81521243-f7e1-4360-9b12-8047988a69dd"). 
InnerVolumeSpecName "kube-api-access-ms9xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.100797 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8r79\" (UniqueName: \"kubernetes.io/projected/67d2682a-a65f-42e2-875a-b4247bfff054-kube-api-access-s8r79\") pod \"67d2682a-a65f-42e2-875a-b4247bfff054\" (UID: \"67d2682a-a65f-42e2-875a-b4247bfff054\") " Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.100834 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67d2682a-a65f-42e2-875a-b4247bfff054-utilities\") pod \"67d2682a-a65f-42e2-875a-b4247bfff054\" (UID: \"67d2682a-a65f-42e2-875a-b4247bfff054\") " Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.100922 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67d2682a-a65f-42e2-875a-b4247bfff054-catalog-content\") pod \"67d2682a-a65f-42e2-875a-b4247bfff054\" (UID: \"67d2682a-a65f-42e2-875a-b4247bfff054\") " Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.101127 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jg9k\" (UniqueName: \"kubernetes.io/projected/cbd9953a-618b-4cd2-806b-c01e07c40fc2-kube-api-access-7jg9k\") on node \"crc\" DevicePath \"\"" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.101139 4841 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fad88931-0cb1-40fd-b256-f9cd1c93a7e6-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.101148 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs6tj\" (UniqueName: \"kubernetes.io/projected/6e22cd83-5a44-4048-a618-4c06f3550ace-kube-api-access-gs6tj\") on 
node \"crc\" DevicePath \"\"" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.101157 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g26wp\" (UniqueName: \"kubernetes.io/projected/fad88931-0cb1-40fd-b256-f9cd1c93a7e6-kube-api-access-g26wp\") on node \"crc\" DevicePath \"\"" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.101166 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81521243-f7e1-4360-9b12-8047988a69dd-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.101175 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e22cd83-5a44-4048-a618-4c06f3550ace-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.101185 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms9xf\" (UniqueName: \"kubernetes.io/projected/81521243-f7e1-4360-9b12-8047988a69dd-kube-api-access-ms9xf\") on node \"crc\" DevicePath \"\"" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.101195 4841 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fad88931-0cb1-40fd-b256-f9cd1c93a7e6-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.101204 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbd9953a-618b-4cd2-806b-c01e07c40fc2-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.101211 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e22cd83-5a44-4048-a618-4c06f3550ace-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.101766 4841 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67d2682a-a65f-42e2-875a-b4247bfff054-utilities" (OuterVolumeSpecName: "utilities") pod "67d2682a-a65f-42e2-875a-b4247bfff054" (UID: "67d2682a-a65f-42e2-875a-b4247bfff054"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.103300 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67d2682a-a65f-42e2-875a-b4247bfff054-kube-api-access-s8r79" (OuterVolumeSpecName: "kube-api-access-s8r79") pod "67d2682a-a65f-42e2-875a-b4247bfff054" (UID: "67d2682a-a65f-42e2-875a-b4247bfff054"). InnerVolumeSpecName "kube-api-access-s8r79". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.117056 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81521243-f7e1-4360-9b12-8047988a69dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81521243-f7e1-4360-9b12-8047988a69dd" (UID: "81521243-f7e1-4360-9b12-8047988a69dd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.139287 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbd9953a-618b-4cd2-806b-c01e07c40fc2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cbd9953a-618b-4cd2-806b-c01e07c40fc2" (UID: "cbd9953a-618b-4cd2-806b-c01e07c40fc2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.201857 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67d2682a-a65f-42e2-875a-b4247bfff054-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.201878 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81521243-f7e1-4360-9b12-8047988a69dd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.201889 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbd9953a-618b-4cd2-806b-c01e07c40fc2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.201898 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8r79\" (UniqueName: \"kubernetes.io/projected/67d2682a-a65f-42e2-875a-b4247bfff054-kube-api-access-s8r79\") on node \"crc\" DevicePath \"\"" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.218045 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67d2682a-a65f-42e2-875a-b4247bfff054-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67d2682a-a65f-42e2-875a-b4247bfff054" (UID: "67d2682a-a65f-42e2-875a-b4247bfff054"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.303464 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67d2682a-a65f-42e2-875a-b4247bfff054-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.339340 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4n6sl"] Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.579227 4841 generic.go:334] "Generic (PLEG): container finished" podID="6e22cd83-5a44-4048-a618-4c06f3550ace" containerID="f01c0a6f121ad398d9cb4ed9b110cd0db457df5afe4f0bef1db46c6f8b4a1159" exitCode=0 Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.579297 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4gtt4" event={"ID":"6e22cd83-5a44-4048-a618-4c06f3550ace","Type":"ContainerDied","Data":"f01c0a6f121ad398d9cb4ed9b110cd0db457df5afe4f0bef1db46c6f8b4a1159"} Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.579614 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4gtt4" event={"ID":"6e22cd83-5a44-4048-a618-4c06f3550ace","Type":"ContainerDied","Data":"44c6a743851b1ecaefa580de453f322e764771723129c27b26136940e93c399b"} Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.579643 4841 scope.go:117] "RemoveContainer" containerID="f01c0a6f121ad398d9cb4ed9b110cd0db457df5afe4f0bef1db46c6f8b4a1159" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.579365 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4gtt4" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.584226 4841 generic.go:334] "Generic (PLEG): container finished" podID="67d2682a-a65f-42e2-875a-b4247bfff054" containerID="140aa11126c12015df8cca4639a6bd3d5f246e66664fbec13c6e3988d78ba0ed" exitCode=0 Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.584313 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w95m9" event={"ID":"67d2682a-a65f-42e2-875a-b4247bfff054","Type":"ContainerDied","Data":"140aa11126c12015df8cca4639a6bd3d5f246e66664fbec13c6e3988d78ba0ed"} Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.584340 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w95m9" event={"ID":"67d2682a-a65f-42e2-875a-b4247bfff054","Type":"ContainerDied","Data":"11acf8002510b51831b1226d859b327a6f9e09fc2ece211904045a6ef45c315d"} Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.584420 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w95m9" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.588896 4841 generic.go:334] "Generic (PLEG): container finished" podID="fad88931-0cb1-40fd-b256-f9cd1c93a7e6" containerID="ca8d720134276e45f89eb1988d60c9dd00b24ad319c7c2a11b282c50d94c6a0a" exitCode=0 Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.588941 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" event={"ID":"fad88931-0cb1-40fd-b256-f9cd1c93a7e6","Type":"ContainerDied","Data":"ca8d720134276e45f89eb1988d60c9dd00b24ad319c7c2a11b282c50d94c6a0a"} Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.588987 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" event={"ID":"fad88931-0cb1-40fd-b256-f9cd1c93a7e6","Type":"ContainerDied","Data":"31b074416c805d3a73651dad8947d6445d7cfff8604bc59e7f1ada047e6df69e"} Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.588953 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6t6zf" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.590818 4841 generic.go:334] "Generic (PLEG): container finished" podID="cbd9953a-618b-4cd2-806b-c01e07c40fc2" containerID="7081422dd20bdd7244829e876963f21903d1e9db1589b8ec5a050cbea54cf15f" exitCode=0 Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.590873 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sbx6r" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.590900 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbx6r" event={"ID":"cbd9953a-618b-4cd2-806b-c01e07c40fc2","Type":"ContainerDied","Data":"7081422dd20bdd7244829e876963f21903d1e9db1589b8ec5a050cbea54cf15f"} Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.590916 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbx6r" event={"ID":"cbd9953a-618b-4cd2-806b-c01e07c40fc2","Type":"ContainerDied","Data":"0e99cd7f3765101b92e057d0b1c37498848898e366aa210e26e4c66ca12cd683"} Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.595505 4841 scope.go:117] "RemoveContainer" containerID="874212a9cf144263de3415092f0d925e0a8f4275de354040703271034e73b068" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.597783 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4n6sl" event={"ID":"f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45","Type":"ContainerStarted","Data":"38b35096ee537db50abb4dc20829214ec9879b83d7c8422f5f142d85e5c41be9"} Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.597816 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4n6sl" event={"ID":"f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45","Type":"ContainerStarted","Data":"7f05e879cfb91121d615855595b72043cdbe28903ad5ec424c8673d5e1511786"} Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.598611 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4n6sl" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.600183 4841 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4n6sl container/marketplace-operator namespace/openshift-marketplace: Readiness probe 
status=failure output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" start-of-body= Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.600224 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4n6sl" podUID="f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.621008 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6qx4" event={"ID":"81521243-f7e1-4360-9b12-8047988a69dd","Type":"ContainerDied","Data":"f8b2d47f9b4c37d297302ce64c0f0c86db444f63756040c4f6a870113a679b28"} Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.620915 4841 generic.go:334] "Generic (PLEG): container finished" podID="81521243-f7e1-4360-9b12-8047988a69dd" containerID="f8b2d47f9b4c37d297302ce64c0f0c86db444f63756040c4f6a870113a679b28" exitCode=0 Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.621071 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6qx4" event={"ID":"81521243-f7e1-4360-9b12-8047988a69dd","Type":"ContainerDied","Data":"4623e2663d41561119cbb5d6845674d032df2a7ae7abc324b61b0b46cc3a946b"} Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.621135 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l6qx4" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.626099 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4n6sl" podStartSLOduration=1.626081305 podStartE2EDuration="1.626081305s" podCreationTimestamp="2026-03-13 09:19:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:19:28.616984518 +0000 UTC m=+451.346884729" watchObservedRunningTime="2026-03-13 09:19:28.626081305 +0000 UTC m=+451.355981496" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.630512 4841 scope.go:117] "RemoveContainer" containerID="2da51e28574ea04f227eafd23757b04738a3d44b6b2b97f090a3f0044b43f499" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.632238 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4gtt4"] Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.638717 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4gtt4"] Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.660433 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w95m9"] Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.663664 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w95m9"] Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.686646 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6t6zf"] Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.686676 4841 scope.go:117] "RemoveContainer" containerID="f01c0a6f121ad398d9cb4ed9b110cd0db457df5afe4f0bef1db46c6f8b4a1159" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.686689 4841 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6t6zf"] Mar 13 09:19:28 crc kubenswrapper[4841]: E0313 09:19:28.687443 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f01c0a6f121ad398d9cb4ed9b110cd0db457df5afe4f0bef1db46c6f8b4a1159\": container with ID starting with f01c0a6f121ad398d9cb4ed9b110cd0db457df5afe4f0bef1db46c6f8b4a1159 not found: ID does not exist" containerID="f01c0a6f121ad398d9cb4ed9b110cd0db457df5afe4f0bef1db46c6f8b4a1159" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.687472 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01c0a6f121ad398d9cb4ed9b110cd0db457df5afe4f0bef1db46c6f8b4a1159"} err="failed to get container status \"f01c0a6f121ad398d9cb4ed9b110cd0db457df5afe4f0bef1db46c6f8b4a1159\": rpc error: code = NotFound desc = could not find container \"f01c0a6f121ad398d9cb4ed9b110cd0db457df5afe4f0bef1db46c6f8b4a1159\": container with ID starting with f01c0a6f121ad398d9cb4ed9b110cd0db457df5afe4f0bef1db46c6f8b4a1159 not found: ID does not exist" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.687492 4841 scope.go:117] "RemoveContainer" containerID="874212a9cf144263de3415092f0d925e0a8f4275de354040703271034e73b068" Mar 13 09:19:28 crc kubenswrapper[4841]: E0313 09:19:28.687692 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"874212a9cf144263de3415092f0d925e0a8f4275de354040703271034e73b068\": container with ID starting with 874212a9cf144263de3415092f0d925e0a8f4275de354040703271034e73b068 not found: ID does not exist" containerID="874212a9cf144263de3415092f0d925e0a8f4275de354040703271034e73b068" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.687713 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"874212a9cf144263de3415092f0d925e0a8f4275de354040703271034e73b068"} err="failed to get container status \"874212a9cf144263de3415092f0d925e0a8f4275de354040703271034e73b068\": rpc error: code = NotFound desc = could not find container \"874212a9cf144263de3415092f0d925e0a8f4275de354040703271034e73b068\": container with ID starting with 874212a9cf144263de3415092f0d925e0a8f4275de354040703271034e73b068 not found: ID does not exist" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.687727 4841 scope.go:117] "RemoveContainer" containerID="2da51e28574ea04f227eafd23757b04738a3d44b6b2b97f090a3f0044b43f499" Mar 13 09:19:28 crc kubenswrapper[4841]: E0313 09:19:28.687871 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2da51e28574ea04f227eafd23757b04738a3d44b6b2b97f090a3f0044b43f499\": container with ID starting with 2da51e28574ea04f227eafd23757b04738a3d44b6b2b97f090a3f0044b43f499 not found: ID does not exist" containerID="2da51e28574ea04f227eafd23757b04738a3d44b6b2b97f090a3f0044b43f499" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.687889 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2da51e28574ea04f227eafd23757b04738a3d44b6b2b97f090a3f0044b43f499"} err="failed to get container status \"2da51e28574ea04f227eafd23757b04738a3d44b6b2b97f090a3f0044b43f499\": rpc error: code = NotFound desc = could not find container \"2da51e28574ea04f227eafd23757b04738a3d44b6b2b97f090a3f0044b43f499\": container with ID starting with 2da51e28574ea04f227eafd23757b04738a3d44b6b2b97f090a3f0044b43f499 not found: ID does not exist" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.687899 4841 scope.go:117] "RemoveContainer" containerID="140aa11126c12015df8cca4639a6bd3d5f246e66664fbec13c6e3988d78ba0ed" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.709404 4841 scope.go:117] "RemoveContainer" 
containerID="40233fdc0ab03899e16347d7da27f8e50c5ae6b4982ae08f42c76b1e98e73c26" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.713498 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sbx6r"] Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.736614 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sbx6r"] Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.740314 4841 scope.go:117] "RemoveContainer" containerID="6b81fe8aff61476cc189ebdf332e56f5d10bb7a0636ed98208d8d5e24c0c6279" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.745896 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l6qx4"] Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.754759 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l6qx4"] Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.755196 4841 scope.go:117] "RemoveContainer" containerID="140aa11126c12015df8cca4639a6bd3d5f246e66664fbec13c6e3988d78ba0ed" Mar 13 09:19:28 crc kubenswrapper[4841]: E0313 09:19:28.758439 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"140aa11126c12015df8cca4639a6bd3d5f246e66664fbec13c6e3988d78ba0ed\": container with ID starting with 140aa11126c12015df8cca4639a6bd3d5f246e66664fbec13c6e3988d78ba0ed not found: ID does not exist" containerID="140aa11126c12015df8cca4639a6bd3d5f246e66664fbec13c6e3988d78ba0ed" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.758481 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"140aa11126c12015df8cca4639a6bd3d5f246e66664fbec13c6e3988d78ba0ed"} err="failed to get container status \"140aa11126c12015df8cca4639a6bd3d5f246e66664fbec13c6e3988d78ba0ed\": rpc error: code = NotFound desc = could not find container 
\"140aa11126c12015df8cca4639a6bd3d5f246e66664fbec13c6e3988d78ba0ed\": container with ID starting with 140aa11126c12015df8cca4639a6bd3d5f246e66664fbec13c6e3988d78ba0ed not found: ID does not exist" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.758509 4841 scope.go:117] "RemoveContainer" containerID="40233fdc0ab03899e16347d7da27f8e50c5ae6b4982ae08f42c76b1e98e73c26" Mar 13 09:19:28 crc kubenswrapper[4841]: E0313 09:19:28.758759 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40233fdc0ab03899e16347d7da27f8e50c5ae6b4982ae08f42c76b1e98e73c26\": container with ID starting with 40233fdc0ab03899e16347d7da27f8e50c5ae6b4982ae08f42c76b1e98e73c26 not found: ID does not exist" containerID="40233fdc0ab03899e16347d7da27f8e50c5ae6b4982ae08f42c76b1e98e73c26" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.758775 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40233fdc0ab03899e16347d7da27f8e50c5ae6b4982ae08f42c76b1e98e73c26"} err="failed to get container status \"40233fdc0ab03899e16347d7da27f8e50c5ae6b4982ae08f42c76b1e98e73c26\": rpc error: code = NotFound desc = could not find container \"40233fdc0ab03899e16347d7da27f8e50c5ae6b4982ae08f42c76b1e98e73c26\": container with ID starting with 40233fdc0ab03899e16347d7da27f8e50c5ae6b4982ae08f42c76b1e98e73c26 not found: ID does not exist" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.758787 4841 scope.go:117] "RemoveContainer" containerID="6b81fe8aff61476cc189ebdf332e56f5d10bb7a0636ed98208d8d5e24c0c6279" Mar 13 09:19:28 crc kubenswrapper[4841]: E0313 09:19:28.759454 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b81fe8aff61476cc189ebdf332e56f5d10bb7a0636ed98208d8d5e24c0c6279\": container with ID starting with 6b81fe8aff61476cc189ebdf332e56f5d10bb7a0636ed98208d8d5e24c0c6279 not found: ID does not exist" 
containerID="6b81fe8aff61476cc189ebdf332e56f5d10bb7a0636ed98208d8d5e24c0c6279" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.759470 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b81fe8aff61476cc189ebdf332e56f5d10bb7a0636ed98208d8d5e24c0c6279"} err="failed to get container status \"6b81fe8aff61476cc189ebdf332e56f5d10bb7a0636ed98208d8d5e24c0c6279\": rpc error: code = NotFound desc = could not find container \"6b81fe8aff61476cc189ebdf332e56f5d10bb7a0636ed98208d8d5e24c0c6279\": container with ID starting with 6b81fe8aff61476cc189ebdf332e56f5d10bb7a0636ed98208d8d5e24c0c6279 not found: ID does not exist" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.759483 4841 scope.go:117] "RemoveContainer" containerID="ca8d720134276e45f89eb1988d60c9dd00b24ad319c7c2a11b282c50d94c6a0a" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.771337 4841 scope.go:117] "RemoveContainer" containerID="ca8d720134276e45f89eb1988d60c9dd00b24ad319c7c2a11b282c50d94c6a0a" Mar 13 09:19:28 crc kubenswrapper[4841]: E0313 09:19:28.771772 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca8d720134276e45f89eb1988d60c9dd00b24ad319c7c2a11b282c50d94c6a0a\": container with ID starting with ca8d720134276e45f89eb1988d60c9dd00b24ad319c7c2a11b282c50d94c6a0a not found: ID does not exist" containerID="ca8d720134276e45f89eb1988d60c9dd00b24ad319c7c2a11b282c50d94c6a0a" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.771810 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca8d720134276e45f89eb1988d60c9dd00b24ad319c7c2a11b282c50d94c6a0a"} err="failed to get container status \"ca8d720134276e45f89eb1988d60c9dd00b24ad319c7c2a11b282c50d94c6a0a\": rpc error: code = NotFound desc = could not find container \"ca8d720134276e45f89eb1988d60c9dd00b24ad319c7c2a11b282c50d94c6a0a\": container with ID starting with 
ca8d720134276e45f89eb1988d60c9dd00b24ad319c7c2a11b282c50d94c6a0a not found: ID does not exist" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.771835 4841 scope.go:117] "RemoveContainer" containerID="7081422dd20bdd7244829e876963f21903d1e9db1589b8ec5a050cbea54cf15f" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.784681 4841 scope.go:117] "RemoveContainer" containerID="578afedc43679c1cbb948b5e1ea531f5c5a160ac649e1bb7c71b933c9fb0febb" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.801181 4841 scope.go:117] "RemoveContainer" containerID="5b12af26a3525fd6b866a241e444437eee195694fa8814f0edfbc186b955bc96" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.815622 4841 scope.go:117] "RemoveContainer" containerID="7081422dd20bdd7244829e876963f21903d1e9db1589b8ec5a050cbea54cf15f" Mar 13 09:19:28 crc kubenswrapper[4841]: E0313 09:19:28.816021 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7081422dd20bdd7244829e876963f21903d1e9db1589b8ec5a050cbea54cf15f\": container with ID starting with 7081422dd20bdd7244829e876963f21903d1e9db1589b8ec5a050cbea54cf15f not found: ID does not exist" containerID="7081422dd20bdd7244829e876963f21903d1e9db1589b8ec5a050cbea54cf15f" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.816051 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7081422dd20bdd7244829e876963f21903d1e9db1589b8ec5a050cbea54cf15f"} err="failed to get container status \"7081422dd20bdd7244829e876963f21903d1e9db1589b8ec5a050cbea54cf15f\": rpc error: code = NotFound desc = could not find container \"7081422dd20bdd7244829e876963f21903d1e9db1589b8ec5a050cbea54cf15f\": container with ID starting with 7081422dd20bdd7244829e876963f21903d1e9db1589b8ec5a050cbea54cf15f not found: ID does not exist" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.816072 4841 scope.go:117] "RemoveContainer" 
containerID="578afedc43679c1cbb948b5e1ea531f5c5a160ac649e1bb7c71b933c9fb0febb" Mar 13 09:19:28 crc kubenswrapper[4841]: E0313 09:19:28.816432 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"578afedc43679c1cbb948b5e1ea531f5c5a160ac649e1bb7c71b933c9fb0febb\": container with ID starting with 578afedc43679c1cbb948b5e1ea531f5c5a160ac649e1bb7c71b933c9fb0febb not found: ID does not exist" containerID="578afedc43679c1cbb948b5e1ea531f5c5a160ac649e1bb7c71b933c9fb0febb" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.816452 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"578afedc43679c1cbb948b5e1ea531f5c5a160ac649e1bb7c71b933c9fb0febb"} err="failed to get container status \"578afedc43679c1cbb948b5e1ea531f5c5a160ac649e1bb7c71b933c9fb0febb\": rpc error: code = NotFound desc = could not find container \"578afedc43679c1cbb948b5e1ea531f5c5a160ac649e1bb7c71b933c9fb0febb\": container with ID starting with 578afedc43679c1cbb948b5e1ea531f5c5a160ac649e1bb7c71b933c9fb0febb not found: ID does not exist" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.816465 4841 scope.go:117] "RemoveContainer" containerID="5b12af26a3525fd6b866a241e444437eee195694fa8814f0edfbc186b955bc96" Mar 13 09:19:28 crc kubenswrapper[4841]: E0313 09:19:28.816759 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b12af26a3525fd6b866a241e444437eee195694fa8814f0edfbc186b955bc96\": container with ID starting with 5b12af26a3525fd6b866a241e444437eee195694fa8814f0edfbc186b955bc96 not found: ID does not exist" containerID="5b12af26a3525fd6b866a241e444437eee195694fa8814f0edfbc186b955bc96" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.816799 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5b12af26a3525fd6b866a241e444437eee195694fa8814f0edfbc186b955bc96"} err="failed to get container status \"5b12af26a3525fd6b866a241e444437eee195694fa8814f0edfbc186b955bc96\": rpc error: code = NotFound desc = could not find container \"5b12af26a3525fd6b866a241e444437eee195694fa8814f0edfbc186b955bc96\": container with ID starting with 5b12af26a3525fd6b866a241e444437eee195694fa8814f0edfbc186b955bc96 not found: ID does not exist" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.816824 4841 scope.go:117] "RemoveContainer" containerID="f8b2d47f9b4c37d297302ce64c0f0c86db444f63756040c4f6a870113a679b28" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.831920 4841 scope.go:117] "RemoveContainer" containerID="9ed338d23007d7b03a788eedfcb08c7bc25a687ef3b68048f7622d6e6576bdcb" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.849675 4841 scope.go:117] "RemoveContainer" containerID="b95c4404ab7c732079660543e94d5af77ba8c5a73e7403dc9c9004b0216ff4e1" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.865317 4841 scope.go:117] "RemoveContainer" containerID="f8b2d47f9b4c37d297302ce64c0f0c86db444f63756040c4f6a870113a679b28" Mar 13 09:19:28 crc kubenswrapper[4841]: E0313 09:19:28.865978 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8b2d47f9b4c37d297302ce64c0f0c86db444f63756040c4f6a870113a679b28\": container with ID starting with f8b2d47f9b4c37d297302ce64c0f0c86db444f63756040c4f6a870113a679b28 not found: ID does not exist" containerID="f8b2d47f9b4c37d297302ce64c0f0c86db444f63756040c4f6a870113a679b28" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.866029 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8b2d47f9b4c37d297302ce64c0f0c86db444f63756040c4f6a870113a679b28"} err="failed to get container status \"f8b2d47f9b4c37d297302ce64c0f0c86db444f63756040c4f6a870113a679b28\": rpc error: code = 
NotFound desc = could not find container \"f8b2d47f9b4c37d297302ce64c0f0c86db444f63756040c4f6a870113a679b28\": container with ID starting with f8b2d47f9b4c37d297302ce64c0f0c86db444f63756040c4f6a870113a679b28 not found: ID does not exist" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.866062 4841 scope.go:117] "RemoveContainer" containerID="9ed338d23007d7b03a788eedfcb08c7bc25a687ef3b68048f7622d6e6576bdcb" Mar 13 09:19:28 crc kubenswrapper[4841]: E0313 09:19:28.866444 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ed338d23007d7b03a788eedfcb08c7bc25a687ef3b68048f7622d6e6576bdcb\": container with ID starting with 9ed338d23007d7b03a788eedfcb08c7bc25a687ef3b68048f7622d6e6576bdcb not found: ID does not exist" containerID="9ed338d23007d7b03a788eedfcb08c7bc25a687ef3b68048f7622d6e6576bdcb" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.866471 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ed338d23007d7b03a788eedfcb08c7bc25a687ef3b68048f7622d6e6576bdcb"} err="failed to get container status \"9ed338d23007d7b03a788eedfcb08c7bc25a687ef3b68048f7622d6e6576bdcb\": rpc error: code = NotFound desc = could not find container \"9ed338d23007d7b03a788eedfcb08c7bc25a687ef3b68048f7622d6e6576bdcb\": container with ID starting with 9ed338d23007d7b03a788eedfcb08c7bc25a687ef3b68048f7622d6e6576bdcb not found: ID does not exist" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.866495 4841 scope.go:117] "RemoveContainer" containerID="b95c4404ab7c732079660543e94d5af77ba8c5a73e7403dc9c9004b0216ff4e1" Mar 13 09:19:28 crc kubenswrapper[4841]: E0313 09:19:28.867491 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b95c4404ab7c732079660543e94d5af77ba8c5a73e7403dc9c9004b0216ff4e1\": container with ID starting with 
b95c4404ab7c732079660543e94d5af77ba8c5a73e7403dc9c9004b0216ff4e1 not found: ID does not exist" containerID="b95c4404ab7c732079660543e94d5af77ba8c5a73e7403dc9c9004b0216ff4e1" Mar 13 09:19:28 crc kubenswrapper[4841]: I0313 09:19:28.867522 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b95c4404ab7c732079660543e94d5af77ba8c5a73e7403dc9c9004b0216ff4e1"} err="failed to get container status \"b95c4404ab7c732079660543e94d5af77ba8c5a73e7403dc9c9004b0216ff4e1\": rpc error: code = NotFound desc = could not find container \"b95c4404ab7c732079660543e94d5af77ba8c5a73e7403dc9c9004b0216ff4e1\": container with ID starting with b95c4404ab7c732079660543e94d5af77ba8c5a73e7403dc9c9004b0216ff4e1 not found: ID does not exist" Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.636296 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4n6sl" Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.979375 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-96j6p"] Mar 13 09:19:29 crc kubenswrapper[4841]: E0313 09:19:29.980206 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e22cd83-5a44-4048-a618-4c06f3550ace" containerName="extract-utilities" Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.980248 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e22cd83-5a44-4048-a618-4c06f3550ace" containerName="extract-utilities" Mar 13 09:19:29 crc kubenswrapper[4841]: E0313 09:19:29.980304 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d2682a-a65f-42e2-875a-b4247bfff054" containerName="extract-utilities" Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.980317 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d2682a-a65f-42e2-875a-b4247bfff054" containerName="extract-utilities" Mar 13 09:19:29 crc kubenswrapper[4841]: E0313 
09:19:29.980325 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81521243-f7e1-4360-9b12-8047988a69dd" containerName="registry-server" Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.980333 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="81521243-f7e1-4360-9b12-8047988a69dd" containerName="registry-server" Mar 13 09:19:29 crc kubenswrapper[4841]: E0313 09:19:29.980342 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d2682a-a65f-42e2-875a-b4247bfff054" containerName="registry-server" Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.980348 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d2682a-a65f-42e2-875a-b4247bfff054" containerName="registry-server" Mar 13 09:19:29 crc kubenswrapper[4841]: E0313 09:19:29.980385 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81521243-f7e1-4360-9b12-8047988a69dd" containerName="extract-content" Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.980393 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="81521243-f7e1-4360-9b12-8047988a69dd" containerName="extract-content" Mar 13 09:19:29 crc kubenswrapper[4841]: E0313 09:19:29.980400 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd9953a-618b-4cd2-806b-c01e07c40fc2" containerName="registry-server" Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.980405 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd9953a-618b-4cd2-806b-c01e07c40fc2" containerName="registry-server" Mar 13 09:19:29 crc kubenswrapper[4841]: E0313 09:19:29.980414 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd9953a-618b-4cd2-806b-c01e07c40fc2" containerName="extract-content" Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.980419 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd9953a-618b-4cd2-806b-c01e07c40fc2" containerName="extract-content" Mar 13 09:19:29 crc kubenswrapper[4841]: E0313 
09:19:29.980430 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81521243-f7e1-4360-9b12-8047988a69dd" containerName="extract-utilities" Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.980437 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="81521243-f7e1-4360-9b12-8047988a69dd" containerName="extract-utilities" Mar 13 09:19:29 crc kubenswrapper[4841]: E0313 09:19:29.980477 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d2682a-a65f-42e2-875a-b4247bfff054" containerName="extract-content" Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.980617 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d2682a-a65f-42e2-875a-b4247bfff054" containerName="extract-content" Mar 13 09:19:29 crc kubenswrapper[4841]: E0313 09:19:29.980637 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e22cd83-5a44-4048-a618-4c06f3550ace" containerName="extract-content" Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.980646 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e22cd83-5a44-4048-a618-4c06f3550ace" containerName="extract-content" Mar 13 09:19:29 crc kubenswrapper[4841]: E0313 09:19:29.980655 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad88931-0cb1-40fd-b256-f9cd1c93a7e6" containerName="marketplace-operator" Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.980662 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad88931-0cb1-40fd-b256-f9cd1c93a7e6" containerName="marketplace-operator" Mar 13 09:19:29 crc kubenswrapper[4841]: E0313 09:19:29.980700 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e22cd83-5a44-4048-a618-4c06f3550ace" containerName="registry-server" Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.980709 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e22cd83-5a44-4048-a618-4c06f3550ace" containerName="registry-server" Mar 13 09:19:29 crc kubenswrapper[4841]: E0313 
09:19:29.980717 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd9953a-618b-4cd2-806b-c01e07c40fc2" containerName="extract-utilities" Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.980725 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd9953a-618b-4cd2-806b-c01e07c40fc2" containerName="extract-utilities" Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.980856 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fad88931-0cb1-40fd-b256-f9cd1c93a7e6" containerName="marketplace-operator" Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.980868 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e22cd83-5a44-4048-a618-4c06f3550ace" containerName="registry-server" Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.980876 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="67d2682a-a65f-42e2-875a-b4247bfff054" containerName="registry-server" Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.980886 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="81521243-f7e1-4360-9b12-8047988a69dd" containerName="registry-server" Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.980896 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd9953a-618b-4cd2-806b-c01e07c40fc2" containerName="registry-server" Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.982450 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-96j6p" Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.985149 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-96j6p"] Mar 13 09:19:29 crc kubenswrapper[4841]: I0313 09:19:29.985906 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.002613 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67d2682a-a65f-42e2-875a-b4247bfff054" path="/var/lib/kubelet/pods/67d2682a-a65f-42e2-875a-b4247bfff054/volumes" Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.003737 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e22cd83-5a44-4048-a618-4c06f3550ace" path="/var/lib/kubelet/pods/6e22cd83-5a44-4048-a618-4c06f3550ace/volumes" Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.004470 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81521243-f7e1-4360-9b12-8047988a69dd" path="/var/lib/kubelet/pods/81521243-f7e1-4360-9b12-8047988a69dd/volumes" Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.006328 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbd9953a-618b-4cd2-806b-c01e07c40fc2" path="/var/lib/kubelet/pods/cbd9953a-618b-4cd2-806b-c01e07c40fc2/volumes" Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.007184 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fad88931-0cb1-40fd-b256-f9cd1c93a7e6" path="/var/lib/kubelet/pods/fad88931-0cb1-40fd-b256-f9cd1c93a7e6/volumes" Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.025121 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d8c242-fbc8-4c6a-93b1-146498533256-utilities\") pod \"certified-operators-96j6p\" (UID: 
\"44d8c242-fbc8-4c6a-93b1-146498533256\") " pod="openshift-marketplace/certified-operators-96j6p" Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.025189 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d8c242-fbc8-4c6a-93b1-146498533256-catalog-content\") pod \"certified-operators-96j6p\" (UID: \"44d8c242-fbc8-4c6a-93b1-146498533256\") " pod="openshift-marketplace/certified-operators-96j6p" Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.025251 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lgcc\" (UniqueName: \"kubernetes.io/projected/44d8c242-fbc8-4c6a-93b1-146498533256-kube-api-access-2lgcc\") pod \"certified-operators-96j6p\" (UID: \"44d8c242-fbc8-4c6a-93b1-146498533256\") " pod="openshift-marketplace/certified-operators-96j6p" Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.126024 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lgcc\" (UniqueName: \"kubernetes.io/projected/44d8c242-fbc8-4c6a-93b1-146498533256-kube-api-access-2lgcc\") pod \"certified-operators-96j6p\" (UID: \"44d8c242-fbc8-4c6a-93b1-146498533256\") " pod="openshift-marketplace/certified-operators-96j6p" Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.126086 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d8c242-fbc8-4c6a-93b1-146498533256-utilities\") pod \"certified-operators-96j6p\" (UID: \"44d8c242-fbc8-4c6a-93b1-146498533256\") " pod="openshift-marketplace/certified-operators-96j6p" Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.126120 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d8c242-fbc8-4c6a-93b1-146498533256-catalog-content\") pod 
\"certified-operators-96j6p\" (UID: \"44d8c242-fbc8-4c6a-93b1-146498533256\") " pod="openshift-marketplace/certified-operators-96j6p"
Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.126550 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d8c242-fbc8-4c6a-93b1-146498533256-catalog-content\") pod \"certified-operators-96j6p\" (UID: \"44d8c242-fbc8-4c6a-93b1-146498533256\") " pod="openshift-marketplace/certified-operators-96j6p"
Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.126581 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d8c242-fbc8-4c6a-93b1-146498533256-utilities\") pod \"certified-operators-96j6p\" (UID: \"44d8c242-fbc8-4c6a-93b1-146498533256\") " pod="openshift-marketplace/certified-operators-96j6p"
Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.143974 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lgcc\" (UniqueName: \"kubernetes.io/projected/44d8c242-fbc8-4c6a-93b1-146498533256-kube-api-access-2lgcc\") pod \"certified-operators-96j6p\" (UID: \"44d8c242-fbc8-4c6a-93b1-146498533256\") " pod="openshift-marketplace/certified-operators-96j6p"
Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.165074 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d6wpz"]
Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.166237 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d6wpz"
Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.169463 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.177400 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d6wpz"]
Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.226756 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/188a8141-ccd6-48e9-a3f0-4546fba14c1c-utilities\") pod \"community-operators-d6wpz\" (UID: \"188a8141-ccd6-48e9-a3f0-4546fba14c1c\") " pod="openshift-marketplace/community-operators-d6wpz"
Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.226816 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/188a8141-ccd6-48e9-a3f0-4546fba14c1c-catalog-content\") pod \"community-operators-d6wpz\" (UID: \"188a8141-ccd6-48e9-a3f0-4546fba14c1c\") " pod="openshift-marketplace/community-operators-d6wpz"
Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.226852 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59g5d\" (UniqueName: \"kubernetes.io/projected/188a8141-ccd6-48e9-a3f0-4546fba14c1c-kube-api-access-59g5d\") pod \"community-operators-d6wpz\" (UID: \"188a8141-ccd6-48e9-a3f0-4546fba14c1c\") " pod="openshift-marketplace/community-operators-d6wpz"
Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.312020 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-96j6p"
Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.328353 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59g5d\" (UniqueName: \"kubernetes.io/projected/188a8141-ccd6-48e9-a3f0-4546fba14c1c-kube-api-access-59g5d\") pod \"community-operators-d6wpz\" (UID: \"188a8141-ccd6-48e9-a3f0-4546fba14c1c\") " pod="openshift-marketplace/community-operators-d6wpz"
Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.328480 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/188a8141-ccd6-48e9-a3f0-4546fba14c1c-utilities\") pod \"community-operators-d6wpz\" (UID: \"188a8141-ccd6-48e9-a3f0-4546fba14c1c\") " pod="openshift-marketplace/community-operators-d6wpz"
Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.328544 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/188a8141-ccd6-48e9-a3f0-4546fba14c1c-catalog-content\") pod \"community-operators-d6wpz\" (UID: \"188a8141-ccd6-48e9-a3f0-4546fba14c1c\") " pod="openshift-marketplace/community-operators-d6wpz"
Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.329051 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/188a8141-ccd6-48e9-a3f0-4546fba14c1c-utilities\") pod \"community-operators-d6wpz\" (UID: \"188a8141-ccd6-48e9-a3f0-4546fba14c1c\") " pod="openshift-marketplace/community-operators-d6wpz"
Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.330280 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/188a8141-ccd6-48e9-a3f0-4546fba14c1c-catalog-content\") pod \"community-operators-d6wpz\" (UID: \"188a8141-ccd6-48e9-a3f0-4546fba14c1c\") " 
pod="openshift-marketplace/community-operators-d6wpz"
Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.356145 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59g5d\" (UniqueName: \"kubernetes.io/projected/188a8141-ccd6-48e9-a3f0-4546fba14c1c-kube-api-access-59g5d\") pod \"community-operators-d6wpz\" (UID: \"188a8141-ccd6-48e9-a3f0-4546fba14c1c\") " pod="openshift-marketplace/community-operators-d6wpz"
Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.530478 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d6wpz"
Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.692108 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d6wpz"]
Mar 13 09:19:30 crc kubenswrapper[4841]: I0313 09:19:30.727439 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-96j6p"]
Mar 13 09:19:30 crc kubenswrapper[4841]: W0313 09:19:30.732481 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44d8c242_fbc8_4c6a_93b1_146498533256.slice/crio-7f6434487ef399025565c9373b55bb1f868f935ce4f7e82b746034d2e2890ee8 WatchSource:0}: Error finding container 7f6434487ef399025565c9373b55bb1f868f935ce4f7e82b746034d2e2890ee8: Status 404 returned error can't find the container with id 7f6434487ef399025565c9373b55bb1f868f935ce4f7e82b746034d2e2890ee8
Mar 13 09:19:31 crc kubenswrapper[4841]: I0313 09:19:31.652857 4841 generic.go:334] "Generic (PLEG): container finished" podID="44d8c242-fbc8-4c6a-93b1-146498533256" containerID="363a090a06bd7351800c990584ddc5c896bd5b286aeabd250f4a0c6b85fa9fe3" exitCode=0
Mar 13 09:19:31 crc kubenswrapper[4841]: I0313 09:19:31.652956 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96j6p" 
event={"ID":"44d8c242-fbc8-4c6a-93b1-146498533256","Type":"ContainerDied","Data":"363a090a06bd7351800c990584ddc5c896bd5b286aeabd250f4a0c6b85fa9fe3"}
Mar 13 09:19:31 crc kubenswrapper[4841]: I0313 09:19:31.652993 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96j6p" event={"ID":"44d8c242-fbc8-4c6a-93b1-146498533256","Type":"ContainerStarted","Data":"7f6434487ef399025565c9373b55bb1f868f935ce4f7e82b746034d2e2890ee8"}
Mar 13 09:19:31 crc kubenswrapper[4841]: I0313 09:19:31.655596 4841 generic.go:334] "Generic (PLEG): container finished" podID="188a8141-ccd6-48e9-a3f0-4546fba14c1c" containerID="11619ecb1a182e5acc3fb7f30a62ae71a448091c192925113e3a3deb2842899f" exitCode=0
Mar 13 09:19:31 crc kubenswrapper[4841]: I0313 09:19:31.655660 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d6wpz" event={"ID":"188a8141-ccd6-48e9-a3f0-4546fba14c1c","Type":"ContainerDied","Data":"11619ecb1a182e5acc3fb7f30a62ae71a448091c192925113e3a3deb2842899f"}
Mar 13 09:19:31 crc kubenswrapper[4841]: I0313 09:19:31.655702 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d6wpz" event={"ID":"188a8141-ccd6-48e9-a3f0-4546fba14c1c","Type":"ContainerStarted","Data":"f600578a22c7cde8db21f2700f565432a02aee61b47bb1ee06859d4e7ea390e3"}
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.381373 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mgpqr"]
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.382613 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mgpqr"
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.383730 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mgpqr"]
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.384624 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.567845 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdqnm\" (UniqueName: \"kubernetes.io/projected/6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c-kube-api-access-qdqnm\") pod \"redhat-marketplace-mgpqr\" (UID: \"6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c\") " pod="openshift-marketplace/redhat-marketplace-mgpqr"
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.568200 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c-utilities\") pod \"redhat-marketplace-mgpqr\" (UID: \"6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c\") " pod="openshift-marketplace/redhat-marketplace-mgpqr"
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.568350 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c-catalog-content\") pod \"redhat-marketplace-mgpqr\" (UID: \"6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c\") " pod="openshift-marketplace/redhat-marketplace-mgpqr"
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.570535 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4pqqf"]
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.571656 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4pqqf"
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.574753 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.581172 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4pqqf"]
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.662244 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d6wpz" event={"ID":"188a8141-ccd6-48e9-a3f0-4546fba14c1c","Type":"ContainerStarted","Data":"c26bf592c2c3bff93d5095c72eae61faad2163c12dcf531067caf839b9b0a253"}
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.668923 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5099aa18-a4a1-40d1-b8c2-dc8a5a26e912-catalog-content\") pod \"redhat-operators-4pqqf\" (UID: \"5099aa18-a4a1-40d1-b8c2-dc8a5a26e912\") " pod="openshift-marketplace/redhat-operators-4pqqf"
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.668958 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmxvs\" (UniqueName: \"kubernetes.io/projected/5099aa18-a4a1-40d1-b8c2-dc8a5a26e912-kube-api-access-kmxvs\") pod \"redhat-operators-4pqqf\" (UID: \"5099aa18-a4a1-40d1-b8c2-dc8a5a26e912\") " pod="openshift-marketplace/redhat-operators-4pqqf"
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.668989 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c-utilities\") pod \"redhat-marketplace-mgpqr\" (UID: \"6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c\") " pod="openshift-marketplace/redhat-marketplace-mgpqr"
Mar 13 09:19:32 crc 
kubenswrapper[4841]: I0313 09:19:32.669163 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c-catalog-content\") pod \"redhat-marketplace-mgpqr\" (UID: \"6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c\") " pod="openshift-marketplace/redhat-marketplace-mgpqr"
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.669306 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5099aa18-a4a1-40d1-b8c2-dc8a5a26e912-utilities\") pod \"redhat-operators-4pqqf\" (UID: \"5099aa18-a4a1-40d1-b8c2-dc8a5a26e912\") " pod="openshift-marketplace/redhat-operators-4pqqf"
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.669483 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdqnm\" (UniqueName: \"kubernetes.io/projected/6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c-kube-api-access-qdqnm\") pod \"redhat-marketplace-mgpqr\" (UID: \"6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c\") " pod="openshift-marketplace/redhat-marketplace-mgpqr"
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.669509 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c-utilities\") pod \"redhat-marketplace-mgpqr\" (UID: \"6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c\") " pod="openshift-marketplace/redhat-marketplace-mgpqr"
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.669633 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c-catalog-content\") pod \"redhat-marketplace-mgpqr\" (UID: \"6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c\") " pod="openshift-marketplace/redhat-marketplace-mgpqr"
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 
09:19:32.689515 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdqnm\" (UniqueName: \"kubernetes.io/projected/6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c-kube-api-access-qdqnm\") pod \"redhat-marketplace-mgpqr\" (UID: \"6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c\") " pod="openshift-marketplace/redhat-marketplace-mgpqr"
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.712488 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mgpqr"
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.770163 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5099aa18-a4a1-40d1-b8c2-dc8a5a26e912-utilities\") pod \"redhat-operators-4pqqf\" (UID: \"5099aa18-a4a1-40d1-b8c2-dc8a5a26e912\") " pod="openshift-marketplace/redhat-operators-4pqqf"
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.770588 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5099aa18-a4a1-40d1-b8c2-dc8a5a26e912-catalog-content\") pod \"redhat-operators-4pqqf\" (UID: \"5099aa18-a4a1-40d1-b8c2-dc8a5a26e912\") " pod="openshift-marketplace/redhat-operators-4pqqf"
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.770616 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmxvs\" (UniqueName: \"kubernetes.io/projected/5099aa18-a4a1-40d1-b8c2-dc8a5a26e912-kube-api-access-kmxvs\") pod \"redhat-operators-4pqqf\" (UID: \"5099aa18-a4a1-40d1-b8c2-dc8a5a26e912\") " pod="openshift-marketplace/redhat-operators-4pqqf"
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.770721 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5099aa18-a4a1-40d1-b8c2-dc8a5a26e912-utilities\") pod \"redhat-operators-4pqqf\" (UID: 
\"5099aa18-a4a1-40d1-b8c2-dc8a5a26e912\") " pod="openshift-marketplace/redhat-operators-4pqqf"
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.771011 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5099aa18-a4a1-40d1-b8c2-dc8a5a26e912-catalog-content\") pod \"redhat-operators-4pqqf\" (UID: \"5099aa18-a4a1-40d1-b8c2-dc8a5a26e912\") " pod="openshift-marketplace/redhat-operators-4pqqf"
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.796932 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmxvs\" (UniqueName: \"kubernetes.io/projected/5099aa18-a4a1-40d1-b8c2-dc8a5a26e912-kube-api-access-kmxvs\") pod \"redhat-operators-4pqqf\" (UID: \"5099aa18-a4a1-40d1-b8c2-dc8a5a26e912\") " pod="openshift-marketplace/redhat-operators-4pqqf"
Mar 13 09:19:32 crc kubenswrapper[4841]: I0313 09:19:32.897875 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4pqqf"
Mar 13 09:19:33 crc kubenswrapper[4841]: I0313 09:19:33.056428 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4pqqf"]
Mar 13 09:19:33 crc kubenswrapper[4841]: W0313 09:19:33.065104 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5099aa18_a4a1_40d1_b8c2_dc8a5a26e912.slice/crio-4e3ec1c51fc493744e7da16582fb3da68a8ff1a85dc59c14039171923d8dfa27 WatchSource:0}: Error finding container 4e3ec1c51fc493744e7da16582fb3da68a8ff1a85dc59c14039171923d8dfa27: Status 404 returned error can't find the container with id 4e3ec1c51fc493744e7da16582fb3da68a8ff1a85dc59c14039171923d8dfa27
Mar 13 09:19:33 crc kubenswrapper[4841]: I0313 09:19:33.122158 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mgpqr"]
Mar 13 09:19:33 crc kubenswrapper[4841]: I0313 
09:19:33.669342 4841 generic.go:334] "Generic (PLEG): container finished" podID="188a8141-ccd6-48e9-a3f0-4546fba14c1c" containerID="c26bf592c2c3bff93d5095c72eae61faad2163c12dcf531067caf839b9b0a253" exitCode=0
Mar 13 09:19:33 crc kubenswrapper[4841]: I0313 09:19:33.669404 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d6wpz" event={"ID":"188a8141-ccd6-48e9-a3f0-4546fba14c1c","Type":"ContainerDied","Data":"c26bf592c2c3bff93d5095c72eae61faad2163c12dcf531067caf839b9b0a253"}
Mar 13 09:19:33 crc kubenswrapper[4841]: I0313 09:19:33.670677 4841 generic.go:334] "Generic (PLEG): container finished" podID="6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c" containerID="0de72478faa901383837ce66b025570e9e0512a878d3c3bf3811d5bbc76f347c" exitCode=0
Mar 13 09:19:33 crc kubenswrapper[4841]: I0313 09:19:33.670733 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mgpqr" event={"ID":"6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c","Type":"ContainerDied","Data":"0de72478faa901383837ce66b025570e9e0512a878d3c3bf3811d5bbc76f347c"}
Mar 13 09:19:33 crc kubenswrapper[4841]: I0313 09:19:33.670754 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mgpqr" event={"ID":"6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c","Type":"ContainerStarted","Data":"3774dc5e2df673bbe9b11ffc003801d7f56363760c564f52fa9eb8cda79d5f6b"}
Mar 13 09:19:33 crc kubenswrapper[4841]: I0313 09:19:33.673905 4841 generic.go:334] "Generic (PLEG): container finished" podID="5099aa18-a4a1-40d1-b8c2-dc8a5a26e912" containerID="db106ef948a4444a963a16fe2ce9d125dafe1254ba6b2bf7f4b0856c989d772a" exitCode=0
Mar 13 09:19:33 crc kubenswrapper[4841]: I0313 09:19:33.673977 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4pqqf" 
event={"ID":"5099aa18-a4a1-40d1-b8c2-dc8a5a26e912","Type":"ContainerDied","Data":"db106ef948a4444a963a16fe2ce9d125dafe1254ba6b2bf7f4b0856c989d772a"}
Mar 13 09:19:33 crc kubenswrapper[4841]: I0313 09:19:33.674010 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4pqqf" event={"ID":"5099aa18-a4a1-40d1-b8c2-dc8a5a26e912","Type":"ContainerStarted","Data":"4e3ec1c51fc493744e7da16582fb3da68a8ff1a85dc59c14039171923d8dfa27"}
Mar 13 09:19:33 crc kubenswrapper[4841]: I0313 09:19:33.677078 4841 generic.go:334] "Generic (PLEG): container finished" podID="44d8c242-fbc8-4c6a-93b1-146498533256" containerID="8ca7295a8341bf35ae903769196441f6d984275f79023b0388a44b5558271304" exitCode=0
Mar 13 09:19:33 crc kubenswrapper[4841]: I0313 09:19:33.677115 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96j6p" event={"ID":"44d8c242-fbc8-4c6a-93b1-146498533256","Type":"ContainerDied","Data":"8ca7295a8341bf35ae903769196441f6d984275f79023b0388a44b5558271304"}
Mar 13 09:19:34 crc kubenswrapper[4841]: I0313 09:19:34.406793 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 09:19:34 crc kubenswrapper[4841]: I0313 09:19:34.407067 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 09:19:34 crc kubenswrapper[4841]: I0313 09:19:34.683092 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d6wpz" 
event={"ID":"188a8141-ccd6-48e9-a3f0-4546fba14c1c","Type":"ContainerStarted","Data":"b983b40cafb18d7eaa6ac163273313ba550e3197abaca5c7cc7f14ced5504c7a"}
Mar 13 09:19:34 crc kubenswrapper[4841]: I0313 09:19:34.687966 4841 generic.go:334] "Generic (PLEG): container finished" podID="6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c" containerID="309a52c75b3e66c17f05b2f08bb52295020e8c0536fca78f292d301859f230fe" exitCode=0
Mar 13 09:19:34 crc kubenswrapper[4841]: I0313 09:19:34.688031 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mgpqr" event={"ID":"6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c","Type":"ContainerDied","Data":"309a52c75b3e66c17f05b2f08bb52295020e8c0536fca78f292d301859f230fe"}
Mar 13 09:19:34 crc kubenswrapper[4841]: I0313 09:19:34.689773 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96j6p" event={"ID":"44d8c242-fbc8-4c6a-93b1-146498533256","Type":"ContainerStarted","Data":"7abae731dc4f24dddd70efe70ec61ea3d009dff071350a8cf85a525e6c6cb5b3"}
Mar 13 09:19:34 crc kubenswrapper[4841]: I0313 09:19:34.716919 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d6wpz" podStartSLOduration=2.077966674 podStartE2EDuration="4.716897259s" podCreationTimestamp="2026-03-13 09:19:30 +0000 UTC" firstStartedPulling="2026-03-13 09:19:31.657783184 +0000 UTC m=+454.387683415" lastFinishedPulling="2026-03-13 09:19:34.296713809 +0000 UTC m=+457.026614000" observedRunningTime="2026-03-13 09:19:34.714526045 +0000 UTC m=+457.444426256" watchObservedRunningTime="2026-03-13 09:19:34.716897259 +0000 UTC m=+457.446797450"
Mar 13 09:19:34 crc kubenswrapper[4841]: I0313 09:19:34.758762 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-96j6p" podStartSLOduration=3.336778286 podStartE2EDuration="5.758747744s" podCreationTimestamp="2026-03-13 09:19:29 +0000 UTC" 
firstStartedPulling="2026-03-13 09:19:31.655449343 +0000 UTC m=+454.385349574" lastFinishedPulling="2026-03-13 09:19:34.077418801 +0000 UTC m=+456.807319032" observedRunningTime="2026-03-13 09:19:34.757563743 +0000 UTC m=+457.487463934" watchObservedRunningTime="2026-03-13 09:19:34.758747744 +0000 UTC m=+457.488647935"
Mar 13 09:19:35 crc kubenswrapper[4841]: I0313 09:19:35.698832 4841 generic.go:334] "Generic (PLEG): container finished" podID="5099aa18-a4a1-40d1-b8c2-dc8a5a26e912" containerID="32177baa5d0d61654bb375f14311826ee3dc66a3eeb18fa875544be4796ee904" exitCode=0
Mar 13 09:19:35 crc kubenswrapper[4841]: I0313 09:19:35.698884 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4pqqf" event={"ID":"5099aa18-a4a1-40d1-b8c2-dc8a5a26e912","Type":"ContainerDied","Data":"32177baa5d0d61654bb375f14311826ee3dc66a3eeb18fa875544be4796ee904"}
Mar 13 09:19:35 crc kubenswrapper[4841]: I0313 09:19:35.703677 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mgpqr" event={"ID":"6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c","Type":"ContainerStarted","Data":"55264fdbe0f2d130edcb403ea92dbb8043c11913eb29d2238e668139acbc7b3e"}
Mar 13 09:19:35 crc kubenswrapper[4841]: I0313 09:19:35.747183 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mgpqr" podStartSLOduration=2.287759481 podStartE2EDuration="3.747139178s" podCreationTimestamp="2026-03-13 09:19:32 +0000 UTC" firstStartedPulling="2026-03-13 09:19:33.672332233 +0000 UTC m=+456.402232424" lastFinishedPulling="2026-03-13 09:19:35.13171192 +0000 UTC m=+457.861612121" observedRunningTime="2026-03-13 09:19:35.74309082 +0000 UTC m=+458.472991051" watchObservedRunningTime="2026-03-13 09:19:35.747139178 +0000 UTC m=+458.477039389"
Mar 13 09:19:36 crc kubenswrapper[4841]: I0313 09:19:36.712580 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-4pqqf" event={"ID":"5099aa18-a4a1-40d1-b8c2-dc8a5a26e912","Type":"ContainerStarted","Data":"0b1882b2845f4e1ce4fe589b4746ca9abfe10681853827aab7043e88d213e951"}
Mar 13 09:19:36 crc kubenswrapper[4841]: I0313 09:19:36.734337 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4pqqf" podStartSLOduration=2.290855932 podStartE2EDuration="4.734320073s" podCreationTimestamp="2026-03-13 09:19:32 +0000 UTC" firstStartedPulling="2026-03-13 09:19:33.675550841 +0000 UTC m=+456.405451032" lastFinishedPulling="2026-03-13 09:19:36.119014982 +0000 UTC m=+458.848915173" observedRunningTime="2026-03-13 09:19:36.732047321 +0000 UTC m=+459.461947532" watchObservedRunningTime="2026-03-13 09:19:36.734320073 +0000 UTC m=+459.464220264"
Mar 13 09:19:40 crc kubenswrapper[4841]: I0313 09:19:40.313141 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-96j6p"
Mar 13 09:19:40 crc kubenswrapper[4841]: I0313 09:19:40.313469 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-96j6p"
Mar 13 09:19:40 crc kubenswrapper[4841]: I0313 09:19:40.389250 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-96j6p"
Mar 13 09:19:40 crc kubenswrapper[4841]: I0313 09:19:40.531960 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d6wpz"
Mar 13 09:19:40 crc kubenswrapper[4841]: I0313 09:19:40.532034 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d6wpz"
Mar 13 09:19:40 crc kubenswrapper[4841]: I0313 09:19:40.581067 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d6wpz"
Mar 13 09:19:40 crc 
kubenswrapper[4841]: I0313 09:19:40.799159 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-96j6p"
Mar 13 09:19:40 crc kubenswrapper[4841]: I0313 09:19:40.803956 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d6wpz"
Mar 13 09:19:42 crc kubenswrapper[4841]: I0313 09:19:42.713165 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mgpqr"
Mar 13 09:19:42 crc kubenswrapper[4841]: I0313 09:19:42.713517 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mgpqr"
Mar 13 09:19:42 crc kubenswrapper[4841]: I0313 09:19:42.765551 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mgpqr"
Mar 13 09:19:42 crc kubenswrapper[4841]: I0313 09:19:42.819524 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mgpqr"
Mar 13 09:19:42 crc kubenswrapper[4841]: I0313 09:19:42.898642 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4pqqf"
Mar 13 09:19:42 crc kubenswrapper[4841]: I0313 09:19:42.898714 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4pqqf"
Mar 13 09:19:43 crc kubenswrapper[4841]: I0313 09:19:43.935072 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4pqqf" podUID="5099aa18-a4a1-40d1-b8c2-dc8a5a26e912" containerName="registry-server" probeResult="failure" output=<
Mar 13 09:19:43 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s
Mar 13 09:19:43 crc kubenswrapper[4841]: >
Mar 13 09:19:52 crc kubenswrapper[4841]: I0313 09:19:52.958229 4841 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4pqqf"
Mar 13 09:19:53 crc kubenswrapper[4841]: I0313 09:19:53.004051 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4pqqf"
Mar 13 09:20:00 crc kubenswrapper[4841]: I0313 09:20:00.143259 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556560-ckrz8"]
Mar 13 09:20:00 crc kubenswrapper[4841]: I0313 09:20:00.145502 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556560-ckrz8"
Mar 13 09:20:00 crc kubenswrapper[4841]: I0313 09:20:00.148925 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 09:20:00 crc kubenswrapper[4841]: I0313 09:20:00.149006 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7"
Mar 13 09:20:00 crc kubenswrapper[4841]: I0313 09:20:00.149064 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 09:20:00 crc kubenswrapper[4841]: I0313 09:20:00.154853 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556560-ckrz8"]
Mar 13 09:20:00 crc kubenswrapper[4841]: I0313 09:20:00.217433 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz27q\" (UniqueName: \"kubernetes.io/projected/d192b4a7-053d-4330-8b96-7f96ed70ea05-kube-api-access-mz27q\") pod \"auto-csr-approver-29556560-ckrz8\" (UID: \"d192b4a7-053d-4330-8b96-7f96ed70ea05\") " pod="openshift-infra/auto-csr-approver-29556560-ckrz8"
Mar 13 09:20:00 crc kubenswrapper[4841]: I0313 09:20:00.319162 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz27q\" (UniqueName: 
\"kubernetes.io/projected/d192b4a7-053d-4330-8b96-7f96ed70ea05-kube-api-access-mz27q\") pod \"auto-csr-approver-29556560-ckrz8\" (UID: \"d192b4a7-053d-4330-8b96-7f96ed70ea05\") " pod="openshift-infra/auto-csr-approver-29556560-ckrz8" Mar 13 09:20:00 crc kubenswrapper[4841]: I0313 09:20:00.341741 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz27q\" (UniqueName: \"kubernetes.io/projected/d192b4a7-053d-4330-8b96-7f96ed70ea05-kube-api-access-mz27q\") pod \"auto-csr-approver-29556560-ckrz8\" (UID: \"d192b4a7-053d-4330-8b96-7f96ed70ea05\") " pod="openshift-infra/auto-csr-approver-29556560-ckrz8" Mar 13 09:20:00 crc kubenswrapper[4841]: I0313 09:20:00.509259 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556560-ckrz8" Mar 13 09:20:00 crc kubenswrapper[4841]: I0313 09:20:00.716134 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556560-ckrz8"] Mar 13 09:20:00 crc kubenswrapper[4841]: W0313 09:20:00.722395 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd192b4a7_053d_4330_8b96_7f96ed70ea05.slice/crio-8ef5c791a5bd5d1755a0bcbc776849fbfcc130c3f5ecea38cf049bf6367cbe64 WatchSource:0}: Error finding container 8ef5c791a5bd5d1755a0bcbc776849fbfcc130c3f5ecea38cf049bf6367cbe64: Status 404 returned error can't find the container with id 8ef5c791a5bd5d1755a0bcbc776849fbfcc130c3f5ecea38cf049bf6367cbe64 Mar 13 09:20:00 crc kubenswrapper[4841]: I0313 09:20:00.868552 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556560-ckrz8" event={"ID":"d192b4a7-053d-4330-8b96-7f96ed70ea05","Type":"ContainerStarted","Data":"8ef5c791a5bd5d1755a0bcbc776849fbfcc130c3f5ecea38cf049bf6367cbe64"} Mar 13 09:20:02 crc kubenswrapper[4841]: E0313 09:20:02.191223 4841 cadvisor_stats_provider.go:516] "Partial failure 
issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd192b4a7_053d_4330_8b96_7f96ed70ea05.slice/crio-3116b438849dc4c367e4eb6711fa0f8245afc382af2260ca5d8c441030e2d9a5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd192b4a7_053d_4330_8b96_7f96ed70ea05.slice/crio-conmon-3116b438849dc4c367e4eb6711fa0f8245afc382af2260ca5d8c441030e2d9a5.scope\": RecentStats: unable to find data in memory cache]" Mar 13 09:20:02 crc kubenswrapper[4841]: I0313 09:20:02.885190 4841 generic.go:334] "Generic (PLEG): container finished" podID="d192b4a7-053d-4330-8b96-7f96ed70ea05" containerID="3116b438849dc4c367e4eb6711fa0f8245afc382af2260ca5d8c441030e2d9a5" exitCode=0 Mar 13 09:20:02 crc kubenswrapper[4841]: I0313 09:20:02.885305 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556560-ckrz8" event={"ID":"d192b4a7-053d-4330-8b96-7f96ed70ea05","Type":"ContainerDied","Data":"3116b438849dc4c367e4eb6711fa0f8245afc382af2260ca5d8c441030e2d9a5"} Mar 13 09:20:04 crc kubenswrapper[4841]: I0313 09:20:04.382692 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556560-ckrz8" Mar 13 09:20:04 crc kubenswrapper[4841]: I0313 09:20:04.407094 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:20:04 crc kubenswrapper[4841]: I0313 09:20:04.407152 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:20:04 crc kubenswrapper[4841]: I0313 09:20:04.407197 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:20:04 crc kubenswrapper[4841]: I0313 09:20:04.407747 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d65f6d8d4c1b4e29b21c72e57a4e091f2bf10ce1c458bb3c65c2a8ccaf3e6167"} pod="openshift-machine-config-operator/machine-config-daemon-h227v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 09:20:04 crc kubenswrapper[4841]: I0313 09:20:04.407791 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" containerID="cri-o://d65f6d8d4c1b4e29b21c72e57a4e091f2bf10ce1c458bb3c65c2a8ccaf3e6167" gracePeriod=600 Mar 13 09:20:04 crc kubenswrapper[4841]: I0313 09:20:04.474017 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-mz27q\" (UniqueName: \"kubernetes.io/projected/d192b4a7-053d-4330-8b96-7f96ed70ea05-kube-api-access-mz27q\") pod \"d192b4a7-053d-4330-8b96-7f96ed70ea05\" (UID: \"d192b4a7-053d-4330-8b96-7f96ed70ea05\") " Mar 13 09:20:04 crc kubenswrapper[4841]: I0313 09:20:04.479258 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d192b4a7-053d-4330-8b96-7f96ed70ea05-kube-api-access-mz27q" (OuterVolumeSpecName: "kube-api-access-mz27q") pod "d192b4a7-053d-4330-8b96-7f96ed70ea05" (UID: "d192b4a7-053d-4330-8b96-7f96ed70ea05"). InnerVolumeSpecName "kube-api-access-mz27q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:20:04 crc kubenswrapper[4841]: I0313 09:20:04.574963 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz27q\" (UniqueName: \"kubernetes.io/projected/d192b4a7-053d-4330-8b96-7f96ed70ea05-kube-api-access-mz27q\") on node \"crc\" DevicePath \"\"" Mar 13 09:20:04 crc kubenswrapper[4841]: I0313 09:20:04.894700 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556560-ckrz8" event={"ID":"d192b4a7-053d-4330-8b96-7f96ed70ea05","Type":"ContainerDied","Data":"8ef5c791a5bd5d1755a0bcbc776849fbfcc130c3f5ecea38cf049bf6367cbe64"} Mar 13 09:20:04 crc kubenswrapper[4841]: I0313 09:20:04.894931 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ef5c791a5bd5d1755a0bcbc776849fbfcc130c3f5ecea38cf049bf6367cbe64" Mar 13 09:20:04 crc kubenswrapper[4841]: I0313 09:20:04.894716 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556560-ckrz8" Mar 13 09:20:04 crc kubenswrapper[4841]: I0313 09:20:04.897109 4841 generic.go:334] "Generic (PLEG): container finished" podID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerID="d65f6d8d4c1b4e29b21c72e57a4e091f2bf10ce1c458bb3c65c2a8ccaf3e6167" exitCode=0 Mar 13 09:20:04 crc kubenswrapper[4841]: I0313 09:20:04.897160 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerDied","Data":"d65f6d8d4c1b4e29b21c72e57a4e091f2bf10ce1c458bb3c65c2a8ccaf3e6167"} Mar 13 09:20:04 crc kubenswrapper[4841]: I0313 09:20:04.897198 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerStarted","Data":"379cc8a3a48d5aedfd454910bea9e163183fcf7244dc7566a308379fd2d7c084"} Mar 13 09:20:04 crc kubenswrapper[4841]: I0313 09:20:04.897221 4841 scope.go:117] "RemoveContainer" containerID="9a1f9151f552176e4557fbe7ae4dd3ddf3334120c1c0e6086773ec840d42b653" Mar 13 09:20:05 crc kubenswrapper[4841]: I0313 09:20:05.432123 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556554-2dgrd"] Mar 13 09:20:05 crc kubenswrapper[4841]: I0313 09:20:05.436001 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556554-2dgrd"] Mar 13 09:20:06 crc kubenswrapper[4841]: I0313 09:20:06.002597 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="344d8ef8-1693-4dcc-b09d-dc9fc7c041cc" path="/var/lib/kubelet/pods/344d8ef8-1693-4dcc-b09d-dc9fc7c041cc/volumes" Mar 13 09:22:00 crc kubenswrapper[4841]: I0313 09:22:00.150013 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556562-265fg"] Mar 13 09:22:00 crc kubenswrapper[4841]: 
E0313 09:22:00.151234 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d192b4a7-053d-4330-8b96-7f96ed70ea05" containerName="oc" Mar 13 09:22:00 crc kubenswrapper[4841]: I0313 09:22:00.151257 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d192b4a7-053d-4330-8b96-7f96ed70ea05" containerName="oc" Mar 13 09:22:00 crc kubenswrapper[4841]: I0313 09:22:00.151479 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d192b4a7-053d-4330-8b96-7f96ed70ea05" containerName="oc" Mar 13 09:22:00 crc kubenswrapper[4841]: I0313 09:22:00.152475 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556562-265fg" Mar 13 09:22:00 crc kubenswrapper[4841]: I0313 09:22:00.160085 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 09:22:00 crc kubenswrapper[4841]: I0313 09:22:00.161040 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 09:22:00 crc kubenswrapper[4841]: I0313 09:22:00.161557 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 09:22:00 crc kubenswrapper[4841]: I0313 09:22:00.165467 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556562-265fg"] Mar 13 09:22:00 crc kubenswrapper[4841]: I0313 09:22:00.301442 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm9gj\" (UniqueName: \"kubernetes.io/projected/62838214-0b31-49b3-bac8-7c0c8ce58141-kube-api-access-mm9gj\") pod \"auto-csr-approver-29556562-265fg\" (UID: \"62838214-0b31-49b3-bac8-7c0c8ce58141\") " pod="openshift-infra/auto-csr-approver-29556562-265fg" Mar 13 09:22:00 crc kubenswrapper[4841]: I0313 09:22:00.402543 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mm9gj\" (UniqueName: \"kubernetes.io/projected/62838214-0b31-49b3-bac8-7c0c8ce58141-kube-api-access-mm9gj\") pod \"auto-csr-approver-29556562-265fg\" (UID: \"62838214-0b31-49b3-bac8-7c0c8ce58141\") " pod="openshift-infra/auto-csr-approver-29556562-265fg" Mar 13 09:22:00 crc kubenswrapper[4841]: I0313 09:22:00.425474 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm9gj\" (UniqueName: \"kubernetes.io/projected/62838214-0b31-49b3-bac8-7c0c8ce58141-kube-api-access-mm9gj\") pod \"auto-csr-approver-29556562-265fg\" (UID: \"62838214-0b31-49b3-bac8-7c0c8ce58141\") " pod="openshift-infra/auto-csr-approver-29556562-265fg" Mar 13 09:22:00 crc kubenswrapper[4841]: I0313 09:22:00.473800 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556562-265fg" Mar 13 09:22:00 crc kubenswrapper[4841]: I0313 09:22:00.732911 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556562-265fg"] Mar 13 09:22:00 crc kubenswrapper[4841]: I0313 09:22:00.742227 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 09:22:01 crc kubenswrapper[4841]: I0313 09:22:01.707172 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556562-265fg" event={"ID":"62838214-0b31-49b3-bac8-7c0c8ce58141","Type":"ContainerStarted","Data":"f80ec08ac84a11e8a480316f0052188363d64e897c6fe74f2feea4f41469c079"} Mar 13 09:22:02 crc kubenswrapper[4841]: I0313 09:22:02.717495 4841 generic.go:334] "Generic (PLEG): container finished" podID="62838214-0b31-49b3-bac8-7c0c8ce58141" containerID="74280dc591c2a72619e3059ba0d7af8ad907b66aa9b472cf70475f24aee947db" exitCode=0 Mar 13 09:22:02 crc kubenswrapper[4841]: I0313 09:22:02.717606 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556562-265fg" 
event={"ID":"62838214-0b31-49b3-bac8-7c0c8ce58141","Type":"ContainerDied","Data":"74280dc591c2a72619e3059ba0d7af8ad907b66aa9b472cf70475f24aee947db"} Mar 13 09:22:03 crc kubenswrapper[4841]: I0313 09:22:03.983791 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556562-265fg" Mar 13 09:22:04 crc kubenswrapper[4841]: I0313 09:22:04.149045 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm9gj\" (UniqueName: \"kubernetes.io/projected/62838214-0b31-49b3-bac8-7c0c8ce58141-kube-api-access-mm9gj\") pod \"62838214-0b31-49b3-bac8-7c0c8ce58141\" (UID: \"62838214-0b31-49b3-bac8-7c0c8ce58141\") " Mar 13 09:22:04 crc kubenswrapper[4841]: I0313 09:22:04.157750 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62838214-0b31-49b3-bac8-7c0c8ce58141-kube-api-access-mm9gj" (OuterVolumeSpecName: "kube-api-access-mm9gj") pod "62838214-0b31-49b3-bac8-7c0c8ce58141" (UID: "62838214-0b31-49b3-bac8-7c0c8ce58141"). InnerVolumeSpecName "kube-api-access-mm9gj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:22:04 crc kubenswrapper[4841]: I0313 09:22:04.252171 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm9gj\" (UniqueName: \"kubernetes.io/projected/62838214-0b31-49b3-bac8-7c0c8ce58141-kube-api-access-mm9gj\") on node \"crc\" DevicePath \"\"" Mar 13 09:22:04 crc kubenswrapper[4841]: I0313 09:22:04.407302 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:22:04 crc kubenswrapper[4841]: I0313 09:22:04.407386 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:22:04 crc kubenswrapper[4841]: I0313 09:22:04.731601 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556562-265fg" event={"ID":"62838214-0b31-49b3-bac8-7c0c8ce58141","Type":"ContainerDied","Data":"f80ec08ac84a11e8a480316f0052188363d64e897c6fe74f2feea4f41469c079"} Mar 13 09:22:04 crc kubenswrapper[4841]: I0313 09:22:04.731874 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f80ec08ac84a11e8a480316f0052188363d64e897c6fe74f2feea4f41469c079" Mar 13 09:22:04 crc kubenswrapper[4841]: I0313 09:22:04.731687 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556562-265fg" Mar 13 09:22:05 crc kubenswrapper[4841]: I0313 09:22:05.054128 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556556-c9nft"] Mar 13 09:22:05 crc kubenswrapper[4841]: I0313 09:22:05.061439 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556556-c9nft"] Mar 13 09:22:06 crc kubenswrapper[4841]: I0313 09:22:06.005786 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5961bba-4ec3-4b4e-b5a2-73aa1024326e" path="/var/lib/kubelet/pods/f5961bba-4ec3-4b4e-b5a2-73aa1024326e/volumes" Mar 13 09:22:34 crc kubenswrapper[4841]: I0313 09:22:34.407746 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:22:34 crc kubenswrapper[4841]: I0313 09:22:34.408593 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:22:58 crc kubenswrapper[4841]: I0313 09:22:58.311661 4841 scope.go:117] "RemoveContainer" containerID="a4578955e23a6cbfb66acd450dda9a49c3a4c8ee399f72496ce245dadaa5980d" Mar 13 09:22:58 crc kubenswrapper[4841]: I0313 09:22:58.363972 4841 scope.go:117] "RemoveContainer" containerID="dc35df52e1205baf6d56760de75613f95cc272c8ed10cb53c388ebe487625b8a" Mar 13 09:23:04 crc kubenswrapper[4841]: I0313 09:23:04.407368 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:23:04 crc kubenswrapper[4841]: I0313 09:23:04.407875 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:23:04 crc kubenswrapper[4841]: I0313 09:23:04.407957 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:23:04 crc kubenswrapper[4841]: I0313 09:23:04.409019 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"379cc8a3a48d5aedfd454910bea9e163183fcf7244dc7566a308379fd2d7c084"} pod="openshift-machine-config-operator/machine-config-daemon-h227v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 09:23:04 crc kubenswrapper[4841]: I0313 09:23:04.409154 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" containerID="cri-o://379cc8a3a48d5aedfd454910bea9e163183fcf7244dc7566a308379fd2d7c084" gracePeriod=600 Mar 13 09:23:05 crc kubenswrapper[4841]: I0313 09:23:05.154174 4841 generic.go:334] "Generic (PLEG): container finished" podID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerID="379cc8a3a48d5aedfd454910bea9e163183fcf7244dc7566a308379fd2d7c084" exitCode=0 Mar 13 09:23:05 crc kubenswrapper[4841]: I0313 09:23:05.154242 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerDied","Data":"379cc8a3a48d5aedfd454910bea9e163183fcf7244dc7566a308379fd2d7c084"} Mar 13 09:23:05 crc kubenswrapper[4841]: I0313 09:23:05.154537 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerStarted","Data":"5ed3bccb1da12fcd7dcfabd48b6eee04f275c5f16e821a0e4d8dce433f764913"} Mar 13 09:23:05 crc kubenswrapper[4841]: I0313 09:23:05.154566 4841 scope.go:117] "RemoveContainer" containerID="d65f6d8d4c1b4e29b21c72e57a4e091f2bf10ce1c458bb3c65c2a8ccaf3e6167" Mar 13 09:23:58 crc kubenswrapper[4841]: I0313 09:23:58.426545 4841 scope.go:117] "RemoveContainer" containerID="2115c835145a35f0cb379270a35a3ee84e44aa6aea00ee7bd8b867721adc54a7" Mar 13 09:24:00 crc kubenswrapper[4841]: I0313 09:24:00.147185 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556564-szmt2"] Mar 13 09:24:00 crc kubenswrapper[4841]: E0313 09:24:00.147473 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62838214-0b31-49b3-bac8-7c0c8ce58141" containerName="oc" Mar 13 09:24:00 crc kubenswrapper[4841]: I0313 09:24:00.147546 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="62838214-0b31-49b3-bac8-7c0c8ce58141" containerName="oc" Mar 13 09:24:00 crc kubenswrapper[4841]: I0313 09:24:00.147717 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="62838214-0b31-49b3-bac8-7c0c8ce58141" containerName="oc" Mar 13 09:24:00 crc kubenswrapper[4841]: I0313 09:24:00.148159 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556564-szmt2" Mar 13 09:24:00 crc kubenswrapper[4841]: I0313 09:24:00.150948 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 09:24:00 crc kubenswrapper[4841]: I0313 09:24:00.151205 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 09:24:00 crc kubenswrapper[4841]: I0313 09:24:00.151392 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 09:24:00 crc kubenswrapper[4841]: I0313 09:24:00.160357 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556564-szmt2"] Mar 13 09:24:00 crc kubenswrapper[4841]: I0313 09:24:00.342233 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvqgj\" (UniqueName: \"kubernetes.io/projected/8c7f9fc4-2398-4146-8d0a-056715def92f-kube-api-access-zvqgj\") pod \"auto-csr-approver-29556564-szmt2\" (UID: \"8c7f9fc4-2398-4146-8d0a-056715def92f\") " pod="openshift-infra/auto-csr-approver-29556564-szmt2" Mar 13 09:24:00 crc kubenswrapper[4841]: I0313 09:24:00.443210 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvqgj\" (UniqueName: \"kubernetes.io/projected/8c7f9fc4-2398-4146-8d0a-056715def92f-kube-api-access-zvqgj\") pod \"auto-csr-approver-29556564-szmt2\" (UID: \"8c7f9fc4-2398-4146-8d0a-056715def92f\") " pod="openshift-infra/auto-csr-approver-29556564-szmt2" Mar 13 09:24:00 crc kubenswrapper[4841]: I0313 09:24:00.477980 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvqgj\" (UniqueName: \"kubernetes.io/projected/8c7f9fc4-2398-4146-8d0a-056715def92f-kube-api-access-zvqgj\") pod \"auto-csr-approver-29556564-szmt2\" (UID: \"8c7f9fc4-2398-4146-8d0a-056715def92f\") " 
pod="openshift-infra/auto-csr-approver-29556564-szmt2" Mar 13 09:24:00 crc kubenswrapper[4841]: I0313 09:24:00.769727 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556564-szmt2" Mar 13 09:24:00 crc kubenswrapper[4841]: I0313 09:24:00.954619 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556564-szmt2"] Mar 13 09:24:01 crc kubenswrapper[4841]: I0313 09:24:01.561953 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556564-szmt2" event={"ID":"8c7f9fc4-2398-4146-8d0a-056715def92f","Type":"ContainerStarted","Data":"9d524c739d711db83503ef344283797e7073c30cae01cf768aacac09445adf93"} Mar 13 09:24:02 crc kubenswrapper[4841]: I0313 09:24:02.569557 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556564-szmt2" event={"ID":"8c7f9fc4-2398-4146-8d0a-056715def92f","Type":"ContainerStarted","Data":"79aa8c15393be9c9864827208121f40731484becd9d763c2e0cd89268204c3d3"} Mar 13 09:24:02 crc kubenswrapper[4841]: I0313 09:24:02.582097 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556564-szmt2" podStartSLOduration=1.400124908 podStartE2EDuration="2.582074307s" podCreationTimestamp="2026-03-13 09:24:00 +0000 UTC" firstStartedPulling="2026-03-13 09:24:00.964113821 +0000 UTC m=+723.694014022" lastFinishedPulling="2026-03-13 09:24:02.14606323 +0000 UTC m=+724.875963421" observedRunningTime="2026-03-13 09:24:02.581830909 +0000 UTC m=+725.311731120" watchObservedRunningTime="2026-03-13 09:24:02.582074307 +0000 UTC m=+725.311974488" Mar 13 09:24:03 crc kubenswrapper[4841]: I0313 09:24:03.579696 4841 generic.go:334] "Generic (PLEG): container finished" podID="8c7f9fc4-2398-4146-8d0a-056715def92f" containerID="79aa8c15393be9c9864827208121f40731484becd9d763c2e0cd89268204c3d3" exitCode=0 Mar 13 09:24:03 crc 
kubenswrapper[4841]: I0313 09:24:03.579813 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556564-szmt2" event={"ID":"8c7f9fc4-2398-4146-8d0a-056715def92f","Type":"ContainerDied","Data":"79aa8c15393be9c9864827208121f40731484becd9d763c2e0cd89268204c3d3"} Mar 13 09:24:04 crc kubenswrapper[4841]: I0313 09:24:04.910441 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556564-szmt2" Mar 13 09:24:05 crc kubenswrapper[4841]: I0313 09:24:05.020918 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvqgj\" (UniqueName: \"kubernetes.io/projected/8c7f9fc4-2398-4146-8d0a-056715def92f-kube-api-access-zvqgj\") pod \"8c7f9fc4-2398-4146-8d0a-056715def92f\" (UID: \"8c7f9fc4-2398-4146-8d0a-056715def92f\") " Mar 13 09:24:05 crc kubenswrapper[4841]: I0313 09:24:05.028882 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c7f9fc4-2398-4146-8d0a-056715def92f-kube-api-access-zvqgj" (OuterVolumeSpecName: "kube-api-access-zvqgj") pod "8c7f9fc4-2398-4146-8d0a-056715def92f" (UID: "8c7f9fc4-2398-4146-8d0a-056715def92f"). InnerVolumeSpecName "kube-api-access-zvqgj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:24:05 crc kubenswrapper[4841]: I0313 09:24:05.123716 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvqgj\" (UniqueName: \"kubernetes.io/projected/8c7f9fc4-2398-4146-8d0a-056715def92f-kube-api-access-zvqgj\") on node \"crc\" DevicePath \"\"" Mar 13 09:24:05 crc kubenswrapper[4841]: I0313 09:24:05.600571 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556564-szmt2" event={"ID":"8c7f9fc4-2398-4146-8d0a-056715def92f","Type":"ContainerDied","Data":"9d524c739d711db83503ef344283797e7073c30cae01cf768aacac09445adf93"} Mar 13 09:24:05 crc kubenswrapper[4841]: I0313 09:24:05.600901 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d524c739d711db83503ef344283797e7073c30cae01cf768aacac09445adf93" Mar 13 09:24:05 crc kubenswrapper[4841]: I0313 09:24:05.600604 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556564-szmt2" Mar 13 09:24:05 crc kubenswrapper[4841]: I0313 09:24:05.635162 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556558-4xbwp"] Mar 13 09:24:05 crc kubenswrapper[4841]: I0313 09:24:05.637911 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556558-4xbwp"] Mar 13 09:24:06 crc kubenswrapper[4841]: I0313 09:24:06.008942 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c0a4d41-6264-4da2-a95c-2d94044862a0" path="/var/lib/kubelet/pods/5c0a4d41-6264-4da2-a95c-2d94044862a0/volumes" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.379286 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7j9c9"] Mar 13 09:24:40 crc kubenswrapper[4841]: E0313 09:24:40.380054 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8c7f9fc4-2398-4146-8d0a-056715def92f" containerName="oc" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.380069 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7f9fc4-2398-4146-8d0a-056715def92f" containerName="oc" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.380192 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c7f9fc4-2398-4146-8d0a-056715def92f" containerName="oc" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.380685 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.409452 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7j9c9"] Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.521746 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c3252be-9fc4-4ae4-b253-4d714cf61ab7-trusted-ca\") pod \"image-registry-66df7c8f76-7j9c9\" (UID: \"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.521797 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c3252be-9fc4-4ae4-b253-4d714cf61ab7-bound-sa-token\") pod \"image-registry-66df7c8f76-7j9c9\" (UID: \"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.521894 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7c3252be-9fc4-4ae4-b253-4d714cf61ab7-registry-tls\") pod \"image-registry-66df7c8f76-7j9c9\" (UID: 
\"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.521942 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7j9c9\" (UID: \"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.521960 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7c3252be-9fc4-4ae4-b253-4d714cf61ab7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7j9c9\" (UID: \"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.522035 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7c3252be-9fc4-4ae4-b253-4d714cf61ab7-registry-certificates\") pod \"image-registry-66df7c8f76-7j9c9\" (UID: \"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.522068 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7c3252be-9fc4-4ae4-b253-4d714cf61ab7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7j9c9\" (UID: \"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.522098 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqlcn\" (UniqueName: \"kubernetes.io/projected/7c3252be-9fc4-4ae4-b253-4d714cf61ab7-kube-api-access-rqlcn\") pod \"image-registry-66df7c8f76-7j9c9\" (UID: \"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.564779 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7j9c9\" (UID: \"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.622743 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7c3252be-9fc4-4ae4-b253-4d714cf61ab7-registry-tls\") pod \"image-registry-66df7c8f76-7j9c9\" (UID: \"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.622788 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7c3252be-9fc4-4ae4-b253-4d714cf61ab7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7j9c9\" (UID: \"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.622820 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7c3252be-9fc4-4ae4-b253-4d714cf61ab7-registry-certificates\") pod \"image-registry-66df7c8f76-7j9c9\" (UID: \"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.622839 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7c3252be-9fc4-4ae4-b253-4d714cf61ab7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7j9c9\" (UID: \"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.622862 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqlcn\" (UniqueName: \"kubernetes.io/projected/7c3252be-9fc4-4ae4-b253-4d714cf61ab7-kube-api-access-rqlcn\") pod \"image-registry-66df7c8f76-7j9c9\" (UID: \"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.622895 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c3252be-9fc4-4ae4-b253-4d714cf61ab7-trusted-ca\") pod \"image-registry-66df7c8f76-7j9c9\" (UID: \"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.622913 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c3252be-9fc4-4ae4-b253-4d714cf61ab7-bound-sa-token\") pod \"image-registry-66df7c8f76-7j9c9\" (UID: \"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.623386 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7c3252be-9fc4-4ae4-b253-4d714cf61ab7-ca-trust-extracted\") pod 
\"image-registry-66df7c8f76-7j9c9\" (UID: \"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.624359 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c3252be-9fc4-4ae4-b253-4d714cf61ab7-trusted-ca\") pod \"image-registry-66df7c8f76-7j9c9\" (UID: \"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.624956 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7c3252be-9fc4-4ae4-b253-4d714cf61ab7-registry-certificates\") pod \"image-registry-66df7c8f76-7j9c9\" (UID: \"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.631932 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7c3252be-9fc4-4ae4-b253-4d714cf61ab7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7j9c9\" (UID: \"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.632135 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7c3252be-9fc4-4ae4-b253-4d714cf61ab7-registry-tls\") pod \"image-registry-66df7c8f76-7j9c9\" (UID: \"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.640588 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/7c3252be-9fc4-4ae4-b253-4d714cf61ab7-bound-sa-token\") pod \"image-registry-66df7c8f76-7j9c9\" (UID: \"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.644495 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqlcn\" (UniqueName: \"kubernetes.io/projected/7c3252be-9fc4-4ae4-b253-4d714cf61ab7-kube-api-access-rqlcn\") pod \"image-registry-66df7c8f76-7j9c9\" (UID: \"7c3252be-9fc4-4ae4-b253-4d714cf61ab7\") " pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.698348 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.897921 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-dqzmg"] Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.898894 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-dqzmg" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.902707 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.902811 4841 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-4b6jt" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.902727 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.913372 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-crfbh"] Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.914155 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-crfbh" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.915871 4841 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-84rrt" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.921361 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-dqzmg"] Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.929581 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb2kc\" (UniqueName: \"kubernetes.io/projected/b71327b9-8538-404b-b37d-cfb16da13ce4-kube-api-access-cb2kc\") pod \"cert-manager-cainjector-cf98fcc89-dqzmg\" (UID: \"b71327b9-8538-404b-b37d-cfb16da13ce4\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-dqzmg" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.929708 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dfh8\" (UniqueName: 
\"kubernetes.io/projected/0f26e065-6f9d-4f61-a645-ea11d7f0eb85-kube-api-access-4dfh8\") pod \"cert-manager-858654f9db-crfbh\" (UID: \"0f26e065-6f9d-4f61-a645-ea11d7f0eb85\") " pod="cert-manager/cert-manager-858654f9db-crfbh" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.932260 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7j9c9"] Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.937482 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vqq7j"] Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.938351 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-vqq7j" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.940406 4841 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-7tz9w" Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.958908 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-crfbh"] Mar 13 09:24:40 crc kubenswrapper[4841]: I0313 09:24:40.966821 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vqq7j"] Mar 13 09:24:41 crc kubenswrapper[4841]: I0313 09:24:41.030635 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb2kc\" (UniqueName: \"kubernetes.io/projected/b71327b9-8538-404b-b37d-cfb16da13ce4-kube-api-access-cb2kc\") pod \"cert-manager-cainjector-cf98fcc89-dqzmg\" (UID: \"b71327b9-8538-404b-b37d-cfb16da13ce4\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-dqzmg" Mar 13 09:24:41 crc kubenswrapper[4841]: I0313 09:24:41.030825 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dfh8\" (UniqueName: 
\"kubernetes.io/projected/0f26e065-6f9d-4f61-a645-ea11d7f0eb85-kube-api-access-4dfh8\") pod \"cert-manager-858654f9db-crfbh\" (UID: \"0f26e065-6f9d-4f61-a645-ea11d7f0eb85\") " pod="cert-manager/cert-manager-858654f9db-crfbh" Mar 13 09:24:41 crc kubenswrapper[4841]: I0313 09:24:41.030889 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsk9m\" (UniqueName: \"kubernetes.io/projected/318e486a-97f3-45fb-84b7-816009810d33-kube-api-access-xsk9m\") pod \"cert-manager-webhook-687f57d79b-vqq7j\" (UID: \"318e486a-97f3-45fb-84b7-816009810d33\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vqq7j" Mar 13 09:24:41 crc kubenswrapper[4841]: I0313 09:24:41.048999 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb2kc\" (UniqueName: \"kubernetes.io/projected/b71327b9-8538-404b-b37d-cfb16da13ce4-kube-api-access-cb2kc\") pod \"cert-manager-cainjector-cf98fcc89-dqzmg\" (UID: \"b71327b9-8538-404b-b37d-cfb16da13ce4\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-dqzmg" Mar 13 09:24:41 crc kubenswrapper[4841]: I0313 09:24:41.049055 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dfh8\" (UniqueName: \"kubernetes.io/projected/0f26e065-6f9d-4f61-a645-ea11d7f0eb85-kube-api-access-4dfh8\") pod \"cert-manager-858654f9db-crfbh\" (UID: \"0f26e065-6f9d-4f61-a645-ea11d7f0eb85\") " pod="cert-manager/cert-manager-858654f9db-crfbh" Mar 13 09:24:41 crc kubenswrapper[4841]: I0313 09:24:41.132338 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsk9m\" (UniqueName: \"kubernetes.io/projected/318e486a-97f3-45fb-84b7-816009810d33-kube-api-access-xsk9m\") pod \"cert-manager-webhook-687f57d79b-vqq7j\" (UID: \"318e486a-97f3-45fb-84b7-816009810d33\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vqq7j" Mar 13 09:24:41 crc kubenswrapper[4841]: I0313 09:24:41.149979 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsk9m\" (UniqueName: \"kubernetes.io/projected/318e486a-97f3-45fb-84b7-816009810d33-kube-api-access-xsk9m\") pod \"cert-manager-webhook-687f57d79b-vqq7j\" (UID: \"318e486a-97f3-45fb-84b7-816009810d33\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vqq7j" Mar 13 09:24:41 crc kubenswrapper[4841]: I0313 09:24:41.225686 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-dqzmg" Mar 13 09:24:41 crc kubenswrapper[4841]: I0313 09:24:41.233095 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-crfbh" Mar 13 09:24:41 crc kubenswrapper[4841]: I0313 09:24:41.257261 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-vqq7j" Mar 13 09:24:41 crc kubenswrapper[4841]: I0313 09:24:41.461623 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-dqzmg"] Mar 13 09:24:41 crc kubenswrapper[4841]: W0313 09:24:41.475791 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb71327b9_8538_404b_b37d_cfb16da13ce4.slice/crio-d9cad2c13a318c87017e2ef9878a8a9e8aeb592582be292d7b1609076fc95eca WatchSource:0}: Error finding container d9cad2c13a318c87017e2ef9878a8a9e8aeb592582be292d7b1609076fc95eca: Status 404 returned error can't find the container with id d9cad2c13a318c87017e2ef9878a8a9e8aeb592582be292d7b1609076fc95eca Mar 13 09:24:41 crc kubenswrapper[4841]: I0313 09:24:41.495123 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-crfbh"] Mar 13 09:24:41 crc kubenswrapper[4841]: I0313 09:24:41.535840 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vqq7j"] Mar 13 
09:24:41 crc kubenswrapper[4841]: W0313 09:24:41.538103 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod318e486a_97f3_45fb_84b7_816009810d33.slice/crio-3896bfd3bdac738eadad47446dd5b974d184c67764f874d2892caff545171294 WatchSource:0}: Error finding container 3896bfd3bdac738eadad47446dd5b974d184c67764f874d2892caff545171294: Status 404 returned error can't find the container with id 3896bfd3bdac738eadad47446dd5b974d184c67764f874d2892caff545171294 Mar 13 09:24:41 crc kubenswrapper[4841]: I0313 09:24:41.850594 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" event={"ID":"7c3252be-9fc4-4ae4-b253-4d714cf61ab7","Type":"ContainerStarted","Data":"91309441c87dbfb9ddf2fd966be539cac036ca0cba349cf3da2c1657ea481e43"} Mar 13 09:24:41 crc kubenswrapper[4841]: I0313 09:24:41.850637 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" event={"ID":"7c3252be-9fc4-4ae4-b253-4d714cf61ab7","Type":"ContainerStarted","Data":"1aadd614a1108296e15756f57c52a15eb3e0227cabba7f1a97ebc0750bd499b2"} Mar 13 09:24:41 crc kubenswrapper[4841]: I0313 09:24:41.852157 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-dqzmg" event={"ID":"b71327b9-8538-404b-b37d-cfb16da13ce4","Type":"ContainerStarted","Data":"d9cad2c13a318c87017e2ef9878a8a9e8aeb592582be292d7b1609076fc95eca"} Mar 13 09:24:41 crc kubenswrapper[4841]: I0313 09:24:41.853584 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-vqq7j" event={"ID":"318e486a-97f3-45fb-84b7-816009810d33","Type":"ContainerStarted","Data":"3896bfd3bdac738eadad47446dd5b974d184c67764f874d2892caff545171294"} Mar 13 09:24:41 crc kubenswrapper[4841]: I0313 09:24:41.854982 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-858654f9db-crfbh" event={"ID":"0f26e065-6f9d-4f61-a645-ea11d7f0eb85","Type":"ContainerStarted","Data":"37a9c5168b39e4cc4f651ed7de2417df3c88b05e6e227d6a8b653b9142fadb65"} Mar 13 09:24:41 crc kubenswrapper[4841]: I0313 09:24:41.870152 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" podStartSLOduration=1.870133362 podStartE2EDuration="1.870133362s" podCreationTimestamp="2026-03-13 09:24:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:24:41.867296903 +0000 UTC m=+764.597197084" watchObservedRunningTime="2026-03-13 09:24:41.870133362 +0000 UTC m=+764.600033573" Mar 13 09:24:42 crc kubenswrapper[4841]: I0313 09:24:42.860618 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:24:44 crc kubenswrapper[4841]: I0313 09:24:44.873288 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-crfbh" event={"ID":"0f26e065-6f9d-4f61-a645-ea11d7f0eb85","Type":"ContainerStarted","Data":"e9620b70e6d04bd203a1627838bdb31826cf3eb5896d6b874223fa16fa523bbb"} Mar 13 09:24:44 crc kubenswrapper[4841]: I0313 09:24:44.874700 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-dqzmg" event={"ID":"b71327b9-8538-404b-b37d-cfb16da13ce4","Type":"ContainerStarted","Data":"8174d23d1fcd74258d1f725bb0063369871c0234a040a207352093b1728691db"} Mar 13 09:24:44 crc kubenswrapper[4841]: I0313 09:24:44.876176 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-vqq7j" event={"ID":"318e486a-97f3-45fb-84b7-816009810d33","Type":"ContainerStarted","Data":"8a052048aa313d2bf16c467aa566e1f506b0b15a7e8105b5f6e82bb82490a4eb"} Mar 13 09:24:44 crc kubenswrapper[4841]: 
I0313 09:24:44.876318 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-vqq7j" Mar 13 09:24:44 crc kubenswrapper[4841]: I0313 09:24:44.941952 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-crfbh" podStartSLOduration=1.970075455 podStartE2EDuration="4.941931681s" podCreationTimestamp="2026-03-13 09:24:40 +0000 UTC" firstStartedPulling="2026-03-13 09:24:41.509589981 +0000 UTC m=+764.239490162" lastFinishedPulling="2026-03-13 09:24:44.481446187 +0000 UTC m=+767.211346388" observedRunningTime="2026-03-13 09:24:44.917310779 +0000 UTC m=+767.647210970" watchObservedRunningTime="2026-03-13 09:24:44.941931681 +0000 UTC m=+767.671831872" Mar 13 09:24:44 crc kubenswrapper[4841]: I0313 09:24:44.943447 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-dqzmg" podStartSLOduration=1.952120863 podStartE2EDuration="4.943438708s" podCreationTimestamp="2026-03-13 09:24:40 +0000 UTC" firstStartedPulling="2026-03-13 09:24:41.477352291 +0000 UTC m=+764.207252482" lastFinishedPulling="2026-03-13 09:24:44.468670116 +0000 UTC m=+767.198570327" observedRunningTime="2026-03-13 09:24:44.939192115 +0000 UTC m=+767.669092316" watchObservedRunningTime="2026-03-13 09:24:44.943438708 +0000 UTC m=+767.673338899" Mar 13 09:24:44 crc kubenswrapper[4841]: I0313 09:24:44.960096 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-vqq7j" podStartSLOduration=2.021342743 podStartE2EDuration="4.96007798s" podCreationTimestamp="2026-03-13 09:24:40 +0000 UTC" firstStartedPulling="2026-03-13 09:24:41.540070396 +0000 UTC m=+764.269970587" lastFinishedPulling="2026-03-13 09:24:44.478805633 +0000 UTC m=+767.208705824" observedRunningTime="2026-03-13 09:24:44.955577698 +0000 UTC m=+767.685477899" watchObservedRunningTime="2026-03-13 
09:24:44.96007798 +0000 UTC m=+767.689978171" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.070210 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j5szf"] Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.070936 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovn-controller" containerID="cri-o://7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f" gracePeriod=30 Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.071002 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="nbdb" containerID="cri-o://5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c" gracePeriod=30 Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.071055 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovn-acl-logging" containerID="cri-o://5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14" gracePeriod=30 Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.071125 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="kube-rbac-proxy-node" containerID="cri-o://eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2" gracePeriod=30 Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.071198 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="northd" 
containerID="cri-o://96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82" gracePeriod=30 Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.071256 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6" gracePeriod=30 Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.071173 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="sbdb" containerID="cri-o://b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407" gracePeriod=30 Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.140450 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovnkube-controller" containerID="cri-o://7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a" gracePeriod=30 Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.261069 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-vqq7j" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.428592 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5szf_db500a1d-2be8-49c1-9c9e-af7623d16b15/ovnkube-controller/3.log" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.432680 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5szf_db500a1d-2be8-49c1-9c9e-af7623d16b15/ovn-acl-logging/0.log" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.433490 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5szf_db500a1d-2be8-49c1-9c9e-af7623d16b15/ovn-controller/0.log" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.434401 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.508360 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-89ml9"] Mar 13 09:24:51 crc kubenswrapper[4841]: E0313 09:24:51.508772 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="nbdb" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.508810 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="nbdb" Mar 13 09:24:51 crc kubenswrapper[4841]: E0313 09:24:51.509257 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovnkube-controller" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.509310 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovnkube-controller" Mar 13 09:24:51 crc kubenswrapper[4841]: E0313 09:24:51.509328 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="kube-rbac-proxy-ovn-metrics" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.509342 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="kube-rbac-proxy-ovn-metrics" Mar 13 09:24:51 crc kubenswrapper[4841]: E0313 09:24:51.509362 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovnkube-controller" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.509374 4841 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovnkube-controller" Mar 13 09:24:51 crc kubenswrapper[4841]: E0313 09:24:51.509390 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovnkube-controller" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.509402 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovnkube-controller" Mar 13 09:24:51 crc kubenswrapper[4841]: E0313 09:24:51.509415 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="kube-rbac-proxy-node" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.509427 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="kube-rbac-proxy-node" Mar 13 09:24:51 crc kubenswrapper[4841]: E0313 09:24:51.509446 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovn-controller" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.509458 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovn-controller" Mar 13 09:24:51 crc kubenswrapper[4841]: E0313 09:24:51.509514 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="sbdb" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.509528 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="sbdb" Mar 13 09:24:51 crc kubenswrapper[4841]: E0313 09:24:51.509547 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovn-acl-logging" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.509559 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovn-acl-logging"
Mar 13 09:24:51 crc kubenswrapper[4841]: E0313 09:24:51.509578 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="northd"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.509590 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="northd"
Mar 13 09:24:51 crc kubenswrapper[4841]: E0313 09:24:51.509605 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="kubecfg-setup"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.509617 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="kubecfg-setup"
Mar 13 09:24:51 crc kubenswrapper[4841]: E0313 09:24:51.509634 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovnkube-controller"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.509646 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovnkube-controller"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.510023 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovnkube-controller"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.510046 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovnkube-controller"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.510061 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovn-controller"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.510081 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="northd"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.510104 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="kube-rbac-proxy-node"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.510117 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovn-acl-logging"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.510133 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="sbdb"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.510153 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="kube-rbac-proxy-ovn-metrics"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.510167 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="nbdb"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.510188 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovnkube-controller"
Mar 13 09:24:51 crc kubenswrapper[4841]: E0313 09:24:51.510398 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovnkube-controller"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.510415 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovnkube-controller"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.511645 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovnkube-controller"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.511687 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerName="ovnkube-controller"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.515153 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.523205 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-var-lib-cni-networks-ovn-kubernetes\") pod \"db500a1d-2be8-49c1-9c9e-af7623d16b15\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") "
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.523357 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "db500a1d-2be8-49c1-9c9e-af7623d16b15" (UID: "db500a1d-2be8-49c1-9c9e-af7623d16b15"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.523407 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db500a1d-2be8-49c1-9c9e-af7623d16b15-env-overrides\") pod \"db500a1d-2be8-49c1-9c9e-af7623d16b15\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") "
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.523459 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-run-netns\") pod \"db500a1d-2be8-49c1-9c9e-af7623d16b15\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") "
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.523492 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-kubelet\") pod \"db500a1d-2be8-49c1-9c9e-af7623d16b15\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") "
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.523526 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-run-systemd\") pod \"db500a1d-2be8-49c1-9c9e-af7623d16b15\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") "
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.523547 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "db500a1d-2be8-49c1-9c9e-af7623d16b15" (UID: "db500a1d-2be8-49c1-9c9e-af7623d16b15"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.523548 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "db500a1d-2be8-49c1-9c9e-af7623d16b15" (UID: "db500a1d-2be8-49c1-9c9e-af7623d16b15"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.523573 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-slash\") pod \"db500a1d-2be8-49c1-9c9e-af7623d16b15\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") "
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.523618 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-slash" (OuterVolumeSpecName: "host-slash") pod "db500a1d-2be8-49c1-9c9e-af7623d16b15" (UID: "db500a1d-2be8-49c1-9c9e-af7623d16b15"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.523698 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-run-ovn-kubernetes\") pod \"db500a1d-2be8-49c1-9c9e-af7623d16b15\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") "
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.523751 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db500a1d-2be8-49c1-9c9e-af7623d16b15-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "db500a1d-2be8-49c1-9c9e-af7623d16b15" (UID: "db500a1d-2be8-49c1-9c9e-af7623d16b15"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.523764 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvg6k\" (UniqueName: \"kubernetes.io/projected/db500a1d-2be8-49c1-9c9e-af7623d16b15-kube-api-access-qvg6k\") pod \"db500a1d-2be8-49c1-9c9e-af7623d16b15\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") "
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.523795 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "db500a1d-2be8-49c1-9c9e-af7623d16b15" (UID: "db500a1d-2be8-49c1-9c9e-af7623d16b15"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.523818 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db500a1d-2be8-49c1-9c9e-af7623d16b15-ovnkube-script-lib\") pod \"db500a1d-2be8-49c1-9c9e-af7623d16b15\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") "
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.523877 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-etc-openvswitch\") pod \"db500a1d-2be8-49c1-9c9e-af7623d16b15\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") "
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.523911 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-node-log\") pod \"db500a1d-2be8-49c1-9c9e-af7623d16b15\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") "
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.523953 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-systemd-units\") pod \"db500a1d-2be8-49c1-9c9e-af7623d16b15\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") "
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.523973 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "db500a1d-2be8-49c1-9c9e-af7623d16b15" (UID: "db500a1d-2be8-49c1-9c9e-af7623d16b15"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.524005 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-node-log" (OuterVolumeSpecName: "node-log") pod "db500a1d-2be8-49c1-9c9e-af7623d16b15" (UID: "db500a1d-2be8-49c1-9c9e-af7623d16b15"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.524023 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-log-socket\") pod \"db500a1d-2be8-49c1-9c9e-af7623d16b15\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") "
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.524099 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db500a1d-2be8-49c1-9c9e-af7623d16b15-ovnkube-config\") pod \"db500a1d-2be8-49c1-9c9e-af7623d16b15\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") "
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.524007 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "db500a1d-2be8-49c1-9c9e-af7623d16b15" (UID: "db500a1d-2be8-49c1-9c9e-af7623d16b15"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.524054 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-log-socket" (OuterVolumeSpecName: "log-socket") pod "db500a1d-2be8-49c1-9c9e-af7623d16b15" (UID: "db500a1d-2be8-49c1-9c9e-af7623d16b15"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.524169 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-var-lib-openvswitch\") pod \"db500a1d-2be8-49c1-9c9e-af7623d16b15\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") "
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.524247 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-cni-bin\") pod \"db500a1d-2be8-49c1-9c9e-af7623d16b15\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") "
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.524278 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "db500a1d-2be8-49c1-9c9e-af7623d16b15" (UID: "db500a1d-2be8-49c1-9c9e-af7623d16b15"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.524311 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-cni-netd\") pod \"db500a1d-2be8-49c1-9c9e-af7623d16b15\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") "
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.524363 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "db500a1d-2be8-49c1-9c9e-af7623d16b15" (UID: "db500a1d-2be8-49c1-9c9e-af7623d16b15"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.524369 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-run-ovn\") pod \"db500a1d-2be8-49c1-9c9e-af7623d16b15\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") "
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.524379 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "db500a1d-2be8-49c1-9c9e-af7623d16b15" (UID: "db500a1d-2be8-49c1-9c9e-af7623d16b15"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.524457 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "db500a1d-2be8-49c1-9c9e-af7623d16b15" (UID: "db500a1d-2be8-49c1-9c9e-af7623d16b15"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.524473 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db500a1d-2be8-49c1-9c9e-af7623d16b15-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "db500a1d-2be8-49c1-9c9e-af7623d16b15" (UID: "db500a1d-2be8-49c1-9c9e-af7623d16b15"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.524505 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "db500a1d-2be8-49c1-9c9e-af7623d16b15" (UID: "db500a1d-2be8-49c1-9c9e-af7623d16b15"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.524490 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-run-openvswitch\") pod \"db500a1d-2be8-49c1-9c9e-af7623d16b15\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") "
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.524575 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db500a1d-2be8-49c1-9c9e-af7623d16b15-ovn-node-metrics-cert\") pod \"db500a1d-2be8-49c1-9c9e-af7623d16b15\" (UID: \"db500a1d-2be8-49c1-9c9e-af7623d16b15\") "
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.524596 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db500a1d-2be8-49c1-9c9e-af7623d16b15-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "db500a1d-2be8-49c1-9c9e-af7623d16b15" (UID: "db500a1d-2be8-49c1-9c9e-af7623d16b15"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.524893 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f355970b-b58f-40a2-82c0-739f1d9520bd-ovnkube-config\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.524986 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-run-systemd\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.525031 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-host-run-ovn-kubernetes\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.525154 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-log-socket\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.525236 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.525417 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txsrr\" (UniqueName: \"kubernetes.io/projected/f355970b-b58f-40a2-82c0-739f1d9520bd-kube-api-access-txsrr\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.525472 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f355970b-b58f-40a2-82c0-739f1d9520bd-ovnkube-script-lib\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.525527 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-systemd-units\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.525579 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-node-log\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.525625 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-host-cni-netd\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.525692 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-host-slash\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.525760 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f355970b-b58f-40a2-82c0-739f1d9520bd-env-overrides\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.525834 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-host-run-netns\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.525903 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-run-ovn\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.525921 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f355970b-b58f-40a2-82c0-739f1d9520bd-ovn-node-metrics-cert\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.525946 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-host-cni-bin\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.525966 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-var-lib-openvswitch\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.525985 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-run-openvswitch\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.526012 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-host-kubelet\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.526035 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-etc-openvswitch\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.526183 4841 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-cni-bin\") on node \"crc\" DevicePath \"\""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.526218 4841 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-cni-netd\") on node \"crc\" DevicePath \"\""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.526244 4841 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.526261 4841 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-run-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.526572 4841 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.526593 4841 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db500a1d-2be8-49c1-9c9e-af7623d16b15-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.526611 4841 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-run-netns\") on node \"crc\" DevicePath \"\""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.526629 4841 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-kubelet\") on node \"crc\" DevicePath \"\""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.526645 4841 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-slash\") on node \"crc\" DevicePath \"\""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.526662 4841 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.526680 4841 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db500a1d-2be8-49c1-9c9e-af7623d16b15-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.526700 4841 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.526719 4841 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-node-log\") on node \"crc\" DevicePath \"\""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.526741 4841 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-systemd-units\") on node \"crc\" DevicePath \"\""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.526764 4841 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-log-socket\") on node \"crc\" DevicePath \"\""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.526837 4841 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db500a1d-2be8-49c1-9c9e-af7623d16b15-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.526860 4841 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.533509 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db500a1d-2be8-49c1-9c9e-af7623d16b15-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "db500a1d-2be8-49c1-9c9e-af7623d16b15" (UID: "db500a1d-2be8-49c1-9c9e-af7623d16b15"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.535962 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db500a1d-2be8-49c1-9c9e-af7623d16b15-kube-api-access-qvg6k" (OuterVolumeSpecName: "kube-api-access-qvg6k") pod "db500a1d-2be8-49c1-9c9e-af7623d16b15" (UID: "db500a1d-2be8-49c1-9c9e-af7623d16b15"). InnerVolumeSpecName "kube-api-access-qvg6k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.540327 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "db500a1d-2be8-49c1-9c9e-af7623d16b15" (UID: "db500a1d-2be8-49c1-9c9e-af7623d16b15"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.628513 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f355970b-b58f-40a2-82c0-739f1d9520bd-env-overrides\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.628602 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-host-run-netns\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.628655 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-run-ovn\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.628693 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f355970b-b58f-40a2-82c0-739f1d9520bd-ovn-node-metrics-cert\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.628750 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-host-cni-bin\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.628772 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-host-run-netns\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.628772 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-run-ovn\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.628781 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-var-lib-openvswitch\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.628927 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-var-lib-openvswitch\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.628956 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-run-openvswitch\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.628997 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-host-cni-bin\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629132 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-run-openvswitch\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629220 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-host-kubelet\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629321 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-etc-openvswitch\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629395 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-run-systemd\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629429 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f355970b-b58f-40a2-82c0-739f1d9520bd-ovnkube-config\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629467 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-host-run-ovn-kubernetes\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629490 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-etc-openvswitch\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629514 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-log-socket\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629539 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/f355970b-b58f-40a2-82c0-739f1d9520bd-env-overrides\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629536 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-host-kubelet\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629604 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629615 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-host-run-ovn-kubernetes\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629560 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629651 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" 
(UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-run-systemd\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629667 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-log-socket\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629673 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txsrr\" (UniqueName: \"kubernetes.io/projected/f355970b-b58f-40a2-82c0-739f1d9520bd-kube-api-access-txsrr\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629713 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f355970b-b58f-40a2-82c0-739f1d9520bd-ovnkube-script-lib\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629754 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-systemd-units\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629791 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-node-log\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629821 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-host-cni-netd\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629865 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-host-slash\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629942 4841 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/db500a1d-2be8-49c1-9c9e-af7623d16b15-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629965 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvg6k\" (UniqueName: \"kubernetes.io/projected/db500a1d-2be8-49c1-9c9e-af7623d16b15-kube-api-access-qvg6k\") on node \"crc\" DevicePath \"\"" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.629986 4841 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db500a1d-2be8-49c1-9c9e-af7623d16b15-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.630028 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-host-slash\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.630186 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f355970b-b58f-40a2-82c0-739f1d9520bd-ovnkube-config\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.630210 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-node-log\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.630300 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-systemd-units\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.630344 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f355970b-b58f-40a2-82c0-739f1d9520bd-host-cni-netd\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.631646 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f355970b-b58f-40a2-82c0-739f1d9520bd-ovnkube-script-lib\") pod \"ovnkube-node-89ml9\" (UID: 
\"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.632144 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f355970b-b58f-40a2-82c0-739f1d9520bd-ovn-node-metrics-cert\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.652198 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txsrr\" (UniqueName: \"kubernetes.io/projected/f355970b-b58f-40a2-82c0-739f1d9520bd-kube-api-access-txsrr\") pod \"ovnkube-node-89ml9\" (UID: \"f355970b-b58f-40a2-82c0-739f1d9520bd\") " pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.845340 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.924466 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5szf_db500a1d-2be8-49c1-9c9e-af7623d16b15/ovnkube-controller/3.log" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.926858 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5szf_db500a1d-2be8-49c1-9c9e-af7623d16b15/ovn-acl-logging/0.log" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.927509 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j5szf_db500a1d-2be8-49c1-9c9e-af7623d16b15/ovn-controller/0.log" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.927913 4841 generic.go:334] "Generic (PLEG): container finished" podID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerID="7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a" 
exitCode=0 Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.927957 4841 generic.go:334] "Generic (PLEG): container finished" podID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerID="b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407" exitCode=0 Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.927979 4841 generic.go:334] "Generic (PLEG): container finished" podID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerID="5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c" exitCode=0 Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.927998 4841 generic.go:334] "Generic (PLEG): container finished" podID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerID="96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82" exitCode=0 Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928022 4841 generic.go:334] "Generic (PLEG): container finished" podID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerID="520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6" exitCode=0 Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928044 4841 generic.go:334] "Generic (PLEG): container finished" podID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerID="eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2" exitCode=0 Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928064 4841 generic.go:334] "Generic (PLEG): container finished" podID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerID="5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14" exitCode=143 Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928086 4841 generic.go:334] "Generic (PLEG): container finished" podID="db500a1d-2be8-49c1-9c9e-af7623d16b15" containerID="7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f" exitCode=143 Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928197 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" 
event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerDied","Data":"7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928247 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerDied","Data":"b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928325 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerDied","Data":"5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928356 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerDied","Data":"96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928381 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerDied","Data":"520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928412 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerDied","Data":"eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928436 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf"} Mar 13 09:24:51 
crc kubenswrapper[4841]: I0313 09:24:51.928457 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928472 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928486 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928501 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928515 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928529 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928542 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928556 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc"} Mar 13 09:24:51 crc 
kubenswrapper[4841]: I0313 09:24:51.928577 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerDied","Data":"5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928600 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928618 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928627 4841 scope.go:117] "RemoveContainer" containerID="7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928632 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928646 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928661 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928675 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6"} Mar 13 09:24:51 crc 
kubenswrapper[4841]: I0313 09:24:51.928689 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928706 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928719 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928734 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928604 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928755 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerDied","Data":"7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928871 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928889 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928903 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928918 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928952 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928968 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928984 4841 pod_container_deletor.go:114] "Failed to 
issue the request to remove container" containerID={"Type":"cri-o","ID":"eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.928998 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.929012 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.929027 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.929052 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j5szf" event={"ID":"db500a1d-2be8-49c1-9c9e-af7623d16b15","Type":"ContainerDied","Data":"745bee14f619dd22807f1c648eeb21f404e12b046757a303a2a093bec2b8def5"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.929076 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.929095 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.929109 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 
09:24:51.929123 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.929136 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.929149 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.929163 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.929177 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.929190 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.929203 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.929853 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" 
event={"ID":"f355970b-b58f-40a2-82c0-739f1d9520bd","Type":"ContainerStarted","Data":"52f2a753db66e4a8add08491cba2c53925517c835d8f05ac5320542b935d42c3"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.934385 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qkpgl_5978189d-b3a2-408c-b09e-c2b3de0a91b0/kube-multus/2.log" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.935415 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qkpgl_5978189d-b3a2-408c-b09e-c2b3de0a91b0/kube-multus/1.log" Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.935477 4841 generic.go:334] "Generic (PLEG): container finished" podID="5978189d-b3a2-408c-b09e-c2b3de0a91b0" containerID="ad9a321f2ded40f220538e5e218457fdd4cf438c48ad250ba23f7419284d7970" exitCode=2 Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.935513 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qkpgl" event={"ID":"5978189d-b3a2-408c-b09e-c2b3de0a91b0","Type":"ContainerDied","Data":"ad9a321f2ded40f220538e5e218457fdd4cf438c48ad250ba23f7419284d7970"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.935552 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1c399741919b7f6d151285e7dc51f48764336f17ec82afe8c58cb97bcbf9e4d9"} Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.936036 4841 scope.go:117] "RemoveContainer" containerID="ad9a321f2ded40f220538e5e218457fdd4cf438c48ad250ba23f7419284d7970" Mar 13 09:24:51 crc kubenswrapper[4841]: E0313 09:24:51.936197 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-qkpgl_openshift-multus(5978189d-b3a2-408c-b09e-c2b3de0a91b0)\"" pod="openshift-multus/multus-qkpgl" podUID="5978189d-b3a2-408c-b09e-c2b3de0a91b0" Mar 13 09:24:51 crc 
kubenswrapper[4841]: I0313 09:24:51.956483 4841 scope.go:117] "RemoveContainer" containerID="46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf"
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.973548 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j5szf"]
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.977726 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j5szf"]
Mar 13 09:24:51 crc kubenswrapper[4841]: I0313 09:24:51.991819 4841 scope.go:117] "RemoveContainer" containerID="b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.001914 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db500a1d-2be8-49c1-9c9e-af7623d16b15" path="/var/lib/kubelet/pods/db500a1d-2be8-49c1-9c9e-af7623d16b15/volumes"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.007546 4841 scope.go:117] "RemoveContainer" containerID="5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.077438 4841 scope.go:117] "RemoveContainer" containerID="96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.089140 4841 scope.go:117] "RemoveContainer" containerID="520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.100146 4841 scope.go:117] "RemoveContainer" containerID="eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.118492 4841 scope.go:117] "RemoveContainer" containerID="5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.135827 4841 scope.go:117] "RemoveContainer" containerID="7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.150038 4841 scope.go:117] "RemoveContainer" containerID="8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.169895 4841 scope.go:117] "RemoveContainer" containerID="7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a"
Mar 13 09:24:52 crc kubenswrapper[4841]: E0313 09:24:52.170375 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a\": container with ID starting with 7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a not found: ID does not exist" containerID="7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.170413 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a"} err="failed to get container status \"7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a\": rpc error: code = NotFound desc = could not find container \"7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a\": container with ID starting with 7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.170432 4841 scope.go:117] "RemoveContainer" containerID="46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf"
Mar 13 09:24:52 crc kubenswrapper[4841]: E0313 09:24:52.171557 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf\": container with ID starting with 46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf not found: ID does not exist" containerID="46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.171748 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf"} err="failed to get container status \"46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf\": rpc error: code = NotFound desc = could not find container \"46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf\": container with ID starting with 46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.171893 4841 scope.go:117] "RemoveContainer" containerID="b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407"
Mar 13 09:24:52 crc kubenswrapper[4841]: E0313 09:24:52.172508 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\": container with ID starting with b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407 not found: ID does not exist" containerID="b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.172541 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407"} err="failed to get container status \"b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\": rpc error: code = NotFound desc = could not find container \"b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\": container with ID starting with b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407 not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.172560 4841 scope.go:117] "RemoveContainer" containerID="5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c"
Mar 13 09:24:52 crc kubenswrapper[4841]: E0313 09:24:52.172884 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\": container with ID starting with 5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c not found: ID does not exist" containerID="5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.172908 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c"} err="failed to get container status \"5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\": rpc error: code = NotFound desc = could not find container \"5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\": container with ID starting with 5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.172943 4841 scope.go:117] "RemoveContainer" containerID="96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82"
Mar 13 09:24:52 crc kubenswrapper[4841]: E0313 09:24:52.173408 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\": container with ID starting with 96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82 not found: ID does not exist" containerID="96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.173431 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82"} err="failed to get container status \"96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\": rpc error: code = NotFound desc = could not find container \"96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\": container with ID starting with 96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82 not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.173445 4841 scope.go:117] "RemoveContainer" containerID="520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6"
Mar 13 09:24:52 crc kubenswrapper[4841]: E0313 09:24:52.173707 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\": container with ID starting with 520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6 not found: ID does not exist" containerID="520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.173727 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6"} err="failed to get container status \"520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\": rpc error: code = NotFound desc = could not find container \"520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\": container with ID starting with 520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6 not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.173738 4841 scope.go:117] "RemoveContainer" containerID="eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2"
Mar 13 09:24:52 crc kubenswrapper[4841]: E0313 09:24:52.174040 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\": container with ID starting with eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2 not found: ID does not exist" containerID="eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.174060 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2"} err="failed to get container status \"eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\": rpc error: code = NotFound desc = could not find container \"eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\": container with ID starting with eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2 not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.174071 4841 scope.go:117] "RemoveContainer" containerID="5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14"
Mar 13 09:24:52 crc kubenswrapper[4841]: E0313 09:24:52.174355 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\": container with ID starting with 5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14 not found: ID does not exist" containerID="5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.174381 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14"} err="failed to get container status \"5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\": rpc error: code = NotFound desc = could not find container \"5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\": container with ID starting with 5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14 not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.174399 4841 scope.go:117] "RemoveContainer" containerID="7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f"
Mar 13 09:24:52 crc kubenswrapper[4841]: E0313 09:24:52.174626 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\": container with ID starting with 7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f not found: ID does not exist" containerID="7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.174670 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f"} err="failed to get container status \"7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\": rpc error: code = NotFound desc = could not find container \"7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\": container with ID starting with 7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.174690 4841 scope.go:117] "RemoveContainer" containerID="8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc"
Mar 13 09:24:52 crc kubenswrapper[4841]: E0313 09:24:52.174929 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\": container with ID starting with 8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc not found: ID does not exist" containerID="8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.174952 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc"} err="failed to get container status \"8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\": rpc error: code = NotFound desc = could not find container \"8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\": container with ID starting with 8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.174965 4841 scope.go:117] "RemoveContainer" containerID="7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.175194 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a"} err="failed to get container status \"7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a\": rpc error: code = NotFound desc = could not find container \"7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a\": container with ID starting with 7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.175211 4841 scope.go:117] "RemoveContainer" containerID="46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.175550 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf"} err="failed to get container status \"46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf\": rpc error: code = NotFound desc = could not find container \"46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf\": container with ID starting with 46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.175570 4841 scope.go:117] "RemoveContainer" containerID="b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.175831 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407"} err="failed to get container status \"b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\": rpc error: code = NotFound desc = could not find container \"b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\": container with ID starting with b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407 not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.175846 4841 scope.go:117] "RemoveContainer" containerID="5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.176056 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c"} err="failed to get container status \"5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\": rpc error: code = NotFound desc = could not find container \"5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\": container with ID starting with 5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.176081 4841 scope.go:117] "RemoveContainer" containerID="96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.176314 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82"} err="failed to get container status \"96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\": rpc error: code = NotFound desc = could not find container \"96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\": container with ID starting with 96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82 not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.176352 4841 scope.go:117] "RemoveContainer" containerID="520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.176632 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6"} err="failed to get container status \"520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\": rpc error: code = NotFound desc = could not find container \"520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\": container with ID starting with 520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6 not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.176652 4841 scope.go:117] "RemoveContainer" containerID="eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.176958 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2"} err="failed to get container status \"eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\": rpc error: code = NotFound desc = could not find container \"eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\": container with ID starting with eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2 not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.176977 4841 scope.go:117] "RemoveContainer" containerID="5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.177178 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14"} err="failed to get container status \"5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\": rpc error: code = NotFound desc = could not find container \"5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\": container with ID starting with 5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14 not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.177194 4841 scope.go:117] "RemoveContainer" containerID="7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.177379 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f"} err="failed to get container status \"7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\": rpc error: code = NotFound desc = could not find container \"7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\": container with ID starting with 7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.177394 4841 scope.go:117] "RemoveContainer" containerID="8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.177729 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc"} err="failed to get container status \"8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\": rpc error: code = NotFound desc = could not find container \"8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\": container with ID starting with 8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.177751 4841 scope.go:117] "RemoveContainer" containerID="7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.177992 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a"} err="failed to get container status \"7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a\": rpc error: code = NotFound desc = could not find container \"7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a\": container with ID starting with 7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.178011 4841 scope.go:117] "RemoveContainer" containerID="46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.178213 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf"} err="failed to get container status \"46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf\": rpc error: code = NotFound desc = could not find container \"46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf\": container with ID starting with 46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.178237 4841 scope.go:117] "RemoveContainer" containerID="b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.178528 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407"} err="failed to get container status \"b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\": rpc error: code = NotFound desc = could not find container \"b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\": container with ID starting with b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407 not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.178546 4841 scope.go:117] "RemoveContainer" containerID="5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.178775 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c"} err="failed to get container status \"5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\": rpc error: code = NotFound desc = could not find container \"5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\": container with ID starting with 5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.178795 4841 scope.go:117] "RemoveContainer" containerID="96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.179019 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82"} err="failed to get container status \"96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\": rpc error: code = NotFound desc = could not find container \"96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\": container with ID starting with 96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82 not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.179038 4841 scope.go:117] "RemoveContainer" containerID="520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.179524 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6"} err="failed to get container status \"520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\": rpc error: code = NotFound desc = could not find container \"520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\": container with ID starting with 520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6 not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.179547 4841 scope.go:117] "RemoveContainer" containerID="eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.179738 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2"} err="failed to get container status \"eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\": rpc error: code = NotFound desc = could not find container \"eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\": container with ID starting with eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2 not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.179756 4841 scope.go:117] "RemoveContainer" containerID="5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.179945 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14"} err="failed to get container status \"5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\": rpc error: code = NotFound desc = could not find container \"5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\": container with ID starting with 5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14 not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.179963 4841 scope.go:117] "RemoveContainer" containerID="7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.180229 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f"} err="failed to get container status \"7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\": rpc error: code = NotFound desc = could not find container \"7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\": container with ID starting with 7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.180455 4841 scope.go:117] "RemoveContainer" containerID="8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.180663 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc"} err="failed to get container status \"8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\": rpc error: code = NotFound desc = could not find container \"8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\": container with ID starting with 8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.180684 4841 scope.go:117] "RemoveContainer" containerID="7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.180865 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a"} err="failed to get container status \"7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a\": rpc error: code = NotFound desc = could not find container \"7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a\": container with ID starting with 7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.180892 4841 scope.go:117] "RemoveContainer" containerID="46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.181173 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf"} err="failed to get container status \"46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf\": rpc error: code = NotFound desc = could not find container \"46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf\": container with ID starting with 46fe477b16c01261e872a4a2dafb8e5a71b148747bd35e73fc090c33d1566fcf not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.181197 4841 scope.go:117] "RemoveContainer" containerID="b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.181618 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407"} err="failed to get container status \"b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\": rpc error: code = NotFound desc = could not find container \"b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407\": container with ID starting with b8d4cf141faa180e057780d017ce618813aec93712eb43a68835c679dcaae407 not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.181643 4841 scope.go:117] "RemoveContainer" containerID="5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.181961 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c"} err="failed to get container status \"5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\": rpc error: code = NotFound desc = could not find container \"5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c\": container with ID starting with 5713211e193c47abe119fd89d52d91fc4ce54f59c43d0174350f1d936d5b932c not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.181985 4841 scope.go:117] "RemoveContainer" containerID="96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.182186 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82"} err="failed to get container status \"96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\": rpc error: code = NotFound desc = could not find container \"96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82\": container with ID starting with 96ce1210e3cf7e93b7470e0cf6ab8a1c8731d141c8a3a074bdf3225f11cf5a82 not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.182207 4841 scope.go:117] "RemoveContainer" containerID="520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.182515 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6"} err="failed to get container status \"520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\": rpc error: code = NotFound desc = could not find container \"520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6\": container with ID starting with 520b472f337f3dc1c4dcde3ba8b1c1ceb4c5d39559535e8521f080fcc289e5c6 not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.182537 4841 scope.go:117] "RemoveContainer" containerID="eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.182774 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2"} err="failed to get container status \"eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\": rpc error: code = NotFound desc = could not find container \"eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2\": container with ID starting with eacd89ace215a4c4db95a3930bf5a04bc32ad2c6e8a1a7c4d15e10d388c075a2 not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.182794 4841 scope.go:117] "RemoveContainer" containerID="5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.183125 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14"} err="failed to get container status \"5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\": rpc error: code = NotFound desc = could not find container \"5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14\": container with ID starting with 5765027619a4afa69c63af8e6e2672470a59efe374ee41a75c52b1b8053d6a14 not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.183145 4841 scope.go:117] "RemoveContainer" containerID="7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.183466 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f"} err="failed to get container status \"7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\": rpc error: code = NotFound desc = could not find container \"7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f\": container with ID starting with 7090ec8a1ca231f076a9f8379ba3c6f9b6ef581d1191ed5041c803462fbbcd8f not found: ID does not exist"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.183490 4841 scope.go:117] "RemoveContainer" containerID="8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc"
Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.183835 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc"} err="failed to get container status \"8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\": rpc error: code = NotFound desc = could not find container \"8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc\": container with ID starting with 8cef13d10be853c8ba6d254a5fb5e1fa341a225f41f3e1d999e179f6d0c8e6dc not found: ID does not
exist" Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.183852 4841 scope.go:117] "RemoveContainer" containerID="7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a" Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.184090 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a"} err="failed to get container status \"7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a\": rpc error: code = NotFound desc = could not find container \"7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a\": container with ID starting with 7cc5501b4d594ded53f13158baca0ce63c0e5a0214148fb68e87bf150af7729a not found: ID does not exist" Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.947019 4841 generic.go:334] "Generic (PLEG): container finished" podID="f355970b-b58f-40a2-82c0-739f1d9520bd" containerID="a5ea369c3c4cf4e7ec42d9095423fccf95b03ca357f35ccddfc76cc5a14d6f9c" exitCode=0 Mar 13 09:24:52 crc kubenswrapper[4841]: I0313 09:24:52.947282 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" event={"ID":"f355970b-b58f-40a2-82c0-739f1d9520bd","Type":"ContainerDied","Data":"a5ea369c3c4cf4e7ec42d9095423fccf95b03ca357f35ccddfc76cc5a14d6f9c"} Mar 13 09:24:53 crc kubenswrapper[4841]: I0313 09:24:53.960391 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" event={"ID":"f355970b-b58f-40a2-82c0-739f1d9520bd","Type":"ContainerStarted","Data":"b266b462b67880a5e44c61604eb23c9f86e0fbcede0d36da81dc9e98035fb3be"} Mar 13 09:24:53 crc kubenswrapper[4841]: I0313 09:24:53.960712 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" event={"ID":"f355970b-b58f-40a2-82c0-739f1d9520bd","Type":"ContainerStarted","Data":"5abfd95a6a7efb049fad8d5248fafe1f440d53cd6a75ea48802b3a38803e83be"} Mar 13 
09:24:53 crc kubenswrapper[4841]: I0313 09:24:53.960727 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" event={"ID":"f355970b-b58f-40a2-82c0-739f1d9520bd","Type":"ContainerStarted","Data":"c50659f58d20cfa4540c69ed243fab9e6333c292d505cb1fdcbdf9b8b835409f"} Mar 13 09:24:53 crc kubenswrapper[4841]: I0313 09:24:53.960738 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" event={"ID":"f355970b-b58f-40a2-82c0-739f1d9520bd","Type":"ContainerStarted","Data":"7e84a72d124a607ba7629ecbf670cf7ab60e0b3cce030d77b8fe71ae8c7cafb3"} Mar 13 09:24:53 crc kubenswrapper[4841]: I0313 09:24:53.960750 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" event={"ID":"f355970b-b58f-40a2-82c0-739f1d9520bd","Type":"ContainerStarted","Data":"2e723d8da9e243e9c792593316a5302993c25676ac8ae13a8de3e5c0dbea1977"} Mar 13 09:24:53 crc kubenswrapper[4841]: I0313 09:24:53.960763 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" event={"ID":"f355970b-b58f-40a2-82c0-739f1d9520bd","Type":"ContainerStarted","Data":"572145f1f6ad0b08bbf443bcc5ecdc7c9eb74155dcdcd475b7e6b4419afeb687"} Mar 13 09:24:55 crc kubenswrapper[4841]: I0313 09:24:55.978122 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" event={"ID":"f355970b-b58f-40a2-82c0-739f1d9520bd","Type":"ContainerStarted","Data":"6a35caa7f788b181e3c849e9e93ff42e204418eac2a45a91a1c5d5b7c803201c"} Mar 13 09:24:58 crc kubenswrapper[4841]: I0313 09:24:58.464416 4841 scope.go:117] "RemoveContainer" containerID="1c399741919b7f6d151285e7dc51f48764336f17ec82afe8c58cb97bcbf9e4d9" Mar 13 09:24:59 crc kubenswrapper[4841]: I0313 09:24:59.000683 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" 
event={"ID":"f355970b-b58f-40a2-82c0-739f1d9520bd","Type":"ContainerStarted","Data":"d4d79edf99f35d9b482cf2b2988fa88347c709d93238a55826178d8860af7b90"} Mar 13 09:24:59 crc kubenswrapper[4841]: I0313 09:24:59.001111 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:59 crc kubenswrapper[4841]: I0313 09:24:59.002782 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qkpgl_5978189d-b3a2-408c-b09e-c2b3de0a91b0/kube-multus/2.log" Mar 13 09:24:59 crc kubenswrapper[4841]: I0313 09:24:59.034795 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:24:59 crc kubenswrapper[4841]: I0313 09:24:59.041171 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" podStartSLOduration=8.041153023 podStartE2EDuration="8.041153023s" podCreationTimestamp="2026-03-13 09:24:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:24:59.037609161 +0000 UTC m=+781.767509362" watchObservedRunningTime="2026-03-13 09:24:59.041153023 +0000 UTC m=+781.771053224" Mar 13 09:25:00 crc kubenswrapper[4841]: I0313 09:25:00.009229 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:25:00 crc kubenswrapper[4841]: I0313 09:25:00.009299 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:25:00 crc kubenswrapper[4841]: I0313 09:25:00.049159 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:25:00 crc kubenswrapper[4841]: I0313 09:25:00.707520 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-66df7c8f76-7j9c9" Mar 13 09:25:00 crc kubenswrapper[4841]: I0313 09:25:00.753842 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mpnlt"] Mar 13 09:25:04 crc kubenswrapper[4841]: I0313 09:25:04.407940 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:25:04 crc kubenswrapper[4841]: I0313 09:25:04.408301 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:25:05 crc kubenswrapper[4841]: I0313 09:25:05.995090 4841 scope.go:117] "RemoveContainer" containerID="ad9a321f2ded40f220538e5e218457fdd4cf438c48ad250ba23f7419284d7970" Mar 13 09:25:05 crc kubenswrapper[4841]: E0313 09:25:05.995810 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-qkpgl_openshift-multus(5978189d-b3a2-408c-b09e-c2b3de0a91b0)\"" pod="openshift-multus/multus-qkpgl" podUID="5978189d-b3a2-408c-b09e-c2b3de0a91b0" Mar 13 09:25:19 crc kubenswrapper[4841]: I0313 09:25:19.995825 4841 scope.go:117] "RemoveContainer" containerID="ad9a321f2ded40f220538e5e218457fdd4cf438c48ad250ba23f7419284d7970" Mar 13 09:25:20 crc kubenswrapper[4841]: I0313 09:25:20.371678 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qkpgl_5978189d-b3a2-408c-b09e-c2b3de0a91b0/kube-multus/2.log" Mar 13 09:25:20 crc 
kubenswrapper[4841]: I0313 09:25:20.371735 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qkpgl" event={"ID":"5978189d-b3a2-408c-b09e-c2b3de0a91b0","Type":"ContainerStarted","Data":"06552d9c8357526e71b69fde56f88ad284922830246adf9bf5c76f93c540052e"} Mar 13 09:25:21 crc kubenswrapper[4841]: I0313 09:25:21.881320 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-89ml9" Mar 13 09:25:25 crc kubenswrapper[4841]: I0313 09:25:25.768702 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc"] Mar 13 09:25:25 crc kubenswrapper[4841]: I0313 09:25:25.770994 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc" Mar 13 09:25:25 crc kubenswrapper[4841]: I0313 09:25:25.772815 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 13 09:25:25 crc kubenswrapper[4841]: I0313 09:25:25.779764 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc"] Mar 13 09:25:25 crc kubenswrapper[4841]: I0313 09:25:25.796675 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" podUID="7a2c93c5-5f1b-41d3-92eb-9f91fcc15176" containerName="registry" containerID="cri-o://c4dbedede92f74070b178c3e976c30521923b17f704e27fdb772fd4ca321008f" gracePeriod=30 Mar 13 09:25:25 crc kubenswrapper[4841]: I0313 09:25:25.881158 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5dfr\" (UniqueName: \"kubernetes.io/projected/b11048a2-12d1-437e-80b5-05e10ccc4b50-kube-api-access-t5dfr\") pod 
\"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc\" (UID: \"b11048a2-12d1-437e-80b5-05e10ccc4b50\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc" Mar 13 09:25:25 crc kubenswrapper[4841]: I0313 09:25:25.881216 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b11048a2-12d1-437e-80b5-05e10ccc4b50-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc\" (UID: \"b11048a2-12d1-437e-80b5-05e10ccc4b50\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc" Mar 13 09:25:25 crc kubenswrapper[4841]: I0313 09:25:25.881759 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b11048a2-12d1-437e-80b5-05e10ccc4b50-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc\" (UID: \"b11048a2-12d1-437e-80b5-05e10ccc4b50\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc" Mar 13 09:25:25 crc kubenswrapper[4841]: I0313 09:25:25.983681 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b11048a2-12d1-437e-80b5-05e10ccc4b50-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc\" (UID: \"b11048a2-12d1-437e-80b5-05e10ccc4b50\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc" Mar 13 09:25:25 crc kubenswrapper[4841]: I0313 09:25:25.983815 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5dfr\" (UniqueName: \"kubernetes.io/projected/b11048a2-12d1-437e-80b5-05e10ccc4b50-kube-api-access-t5dfr\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc\" (UID: \"b11048a2-12d1-437e-80b5-05e10ccc4b50\") " 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc" Mar 13 09:25:25 crc kubenswrapper[4841]: I0313 09:25:25.983860 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b11048a2-12d1-437e-80b5-05e10ccc4b50-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc\" (UID: \"b11048a2-12d1-437e-80b5-05e10ccc4b50\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc" Mar 13 09:25:25 crc kubenswrapper[4841]: I0313 09:25:25.984859 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b11048a2-12d1-437e-80b5-05e10ccc4b50-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc\" (UID: \"b11048a2-12d1-437e-80b5-05e10ccc4b50\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc" Mar 13 09:25:25 crc kubenswrapper[4841]: I0313 09:25:25.984919 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b11048a2-12d1-437e-80b5-05e10ccc4b50-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc\" (UID: \"b11048a2-12d1-437e-80b5-05e10ccc4b50\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc" Mar 13 09:25:26 crc kubenswrapper[4841]: I0313 09:25:26.011020 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5dfr\" (UniqueName: \"kubernetes.io/projected/b11048a2-12d1-437e-80b5-05e10ccc4b50-kube-api-access-t5dfr\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc\" (UID: \"b11048a2-12d1-437e-80b5-05e10ccc4b50\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc" Mar 13 09:25:26 crc kubenswrapper[4841]: I0313 09:25:26.084842 4841 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc" Mar 13 09:25:26 crc kubenswrapper[4841]: I0313 09:25:26.382468 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc"] Mar 13 09:25:26 crc kubenswrapper[4841]: I0313 09:25:26.410660 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc" event={"ID":"b11048a2-12d1-437e-80b5-05e10ccc4b50","Type":"ContainerStarted","Data":"a225f67456573e3ea2a05057f46679d1dd2459c955852af28c4d8a89d14c4dfc"} Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.355901 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.421704 4841 generic.go:334] "Generic (PLEG): container finished" podID="7a2c93c5-5f1b-41d3-92eb-9f91fcc15176" containerID="c4dbedede92f74070b178c3e976c30521923b17f704e27fdb772fd4ca321008f" exitCode=0 Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.421806 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.421829 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" event={"ID":"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176","Type":"ContainerDied","Data":"c4dbedede92f74070b178c3e976c30521923b17f704e27fdb772fd4ca321008f"} Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.421891 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mpnlt" event={"ID":"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176","Type":"ContainerDied","Data":"05541228cfd207e31b667146d299479dadeb0a7c642586fa78b061cf65d84e7f"} Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.421913 4841 scope.go:117] "RemoveContainer" containerID="c4dbedede92f74070b178c3e976c30521923b17f704e27fdb772fd4ca321008f" Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.425076 4841 generic.go:334] "Generic (PLEG): container finished" podID="b11048a2-12d1-437e-80b5-05e10ccc4b50" containerID="d0475c36de301ea82937161dc7de6c33c6886f0b76853fda49770b37ee9c28bf" exitCode=0 Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.425123 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc" event={"ID":"b11048a2-12d1-437e-80b5-05e10ccc4b50","Type":"ContainerDied","Data":"d0475c36de301ea82937161dc7de6c33c6886f0b76853fda49770b37ee9c28bf"} Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.451041 4841 scope.go:117] "RemoveContainer" containerID="c4dbedede92f74070b178c3e976c30521923b17f704e27fdb772fd4ca321008f" Mar 13 09:25:27 crc kubenswrapper[4841]: E0313 09:25:27.451604 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4dbedede92f74070b178c3e976c30521923b17f704e27fdb772fd4ca321008f\": container with ID 
starting with c4dbedede92f74070b178c3e976c30521923b17f704e27fdb772fd4ca321008f not found: ID does not exist" containerID="c4dbedede92f74070b178c3e976c30521923b17f704e27fdb772fd4ca321008f" Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.451650 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4dbedede92f74070b178c3e976c30521923b17f704e27fdb772fd4ca321008f"} err="failed to get container status \"c4dbedede92f74070b178c3e976c30521923b17f704e27fdb772fd4ca321008f\": rpc error: code = NotFound desc = could not find container \"c4dbedede92f74070b178c3e976c30521923b17f704e27fdb772fd4ca321008f\": container with ID starting with c4dbedede92f74070b178c3e976c30521923b17f704e27fdb772fd4ca321008f not found: ID does not exist" Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.502101 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2dnx\" (UniqueName: \"kubernetes.io/projected/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-kube-api-access-s2dnx\") pod \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.502178 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-ca-trust-extracted\") pod \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.502225 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-registry-tls\") pod \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.502287 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-installation-pull-secrets\") pod \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.502329 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-bound-sa-token\") pod \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.502613 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.502652 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-registry-certificates\") pod \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.502767 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-trusted-ca\") pod \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\" (UID: \"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176\") " Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.503641 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176" (UID: 
"7a2c93c5-5f1b-41d3-92eb-9f91fcc15176"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.503745 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.509062 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.509717 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.509717 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.510135 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-kube-api-access-s2dnx" (OuterVolumeSpecName: "kube-api-access-s2dnx") pod "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176"). InnerVolumeSpecName "kube-api-access-s2dnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.518482 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.526414 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176" (UID: "7a2c93c5-5f1b-41d3-92eb-9f91fcc15176"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.604597 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.604639 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2dnx\" (UniqueName: \"kubernetes.io/projected/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-kube-api-access-s2dnx\") on node \"crc\" DevicePath \"\"" Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.604651 4841 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.604663 4841 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.604674 4841 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.604686 4841 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.604699 4841 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 13 09:25:27 crc 
kubenswrapper[4841]: I0313 09:25:27.757730 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mpnlt"] Mar 13 09:25:27 crc kubenswrapper[4841]: I0313 09:25:27.763104 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mpnlt"] Mar 13 09:25:28 crc kubenswrapper[4841]: I0313 09:25:28.006739 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a2c93c5-5f1b-41d3-92eb-9f91fcc15176" path="/var/lib/kubelet/pods/7a2c93c5-5f1b-41d3-92eb-9f91fcc15176/volumes" Mar 13 09:25:29 crc kubenswrapper[4841]: I0313 09:25:29.443097 4841 generic.go:334] "Generic (PLEG): container finished" podID="b11048a2-12d1-437e-80b5-05e10ccc4b50" containerID="a35657b39a7cc56f1688dd6feae4803975a0194b2456fb1ac3c1ff1b0e1e0c73" exitCode=0 Mar 13 09:25:29 crc kubenswrapper[4841]: I0313 09:25:29.443151 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc" event={"ID":"b11048a2-12d1-437e-80b5-05e10ccc4b50","Type":"ContainerDied","Data":"a35657b39a7cc56f1688dd6feae4803975a0194b2456fb1ac3c1ff1b0e1e0c73"} Mar 13 09:25:30 crc kubenswrapper[4841]: I0313 09:25:30.455425 4841 generic.go:334] "Generic (PLEG): container finished" podID="b11048a2-12d1-437e-80b5-05e10ccc4b50" containerID="3ec158ca5389bf330e9a36a13809055fb028041a7b7f0636af817c76c78ac13a" exitCode=0 Mar 13 09:25:30 crc kubenswrapper[4841]: I0313 09:25:30.455491 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc" event={"ID":"b11048a2-12d1-437e-80b5-05e10ccc4b50","Type":"ContainerDied","Data":"3ec158ca5389bf330e9a36a13809055fb028041a7b7f0636af817c76c78ac13a"} Mar 13 09:25:31 crc kubenswrapper[4841]: I0313 09:25:31.754386 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc" Mar 13 09:25:31 crc kubenswrapper[4841]: I0313 09:25:31.864252 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5dfr\" (UniqueName: \"kubernetes.io/projected/b11048a2-12d1-437e-80b5-05e10ccc4b50-kube-api-access-t5dfr\") pod \"b11048a2-12d1-437e-80b5-05e10ccc4b50\" (UID: \"b11048a2-12d1-437e-80b5-05e10ccc4b50\") " Mar 13 09:25:31 crc kubenswrapper[4841]: I0313 09:25:31.864452 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b11048a2-12d1-437e-80b5-05e10ccc4b50-bundle\") pod \"b11048a2-12d1-437e-80b5-05e10ccc4b50\" (UID: \"b11048a2-12d1-437e-80b5-05e10ccc4b50\") " Mar 13 09:25:31 crc kubenswrapper[4841]: I0313 09:25:31.864493 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b11048a2-12d1-437e-80b5-05e10ccc4b50-util\") pod \"b11048a2-12d1-437e-80b5-05e10ccc4b50\" (UID: \"b11048a2-12d1-437e-80b5-05e10ccc4b50\") " Mar 13 09:25:31 crc kubenswrapper[4841]: I0313 09:25:31.865618 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11048a2-12d1-437e-80b5-05e10ccc4b50-bundle" (OuterVolumeSpecName: "bundle") pod "b11048a2-12d1-437e-80b5-05e10ccc4b50" (UID: "b11048a2-12d1-437e-80b5-05e10ccc4b50"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:25:31 crc kubenswrapper[4841]: I0313 09:25:31.872114 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11048a2-12d1-437e-80b5-05e10ccc4b50-kube-api-access-t5dfr" (OuterVolumeSpecName: "kube-api-access-t5dfr") pod "b11048a2-12d1-437e-80b5-05e10ccc4b50" (UID: "b11048a2-12d1-437e-80b5-05e10ccc4b50"). InnerVolumeSpecName "kube-api-access-t5dfr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:25:31 crc kubenswrapper[4841]: I0313 09:25:31.886529 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11048a2-12d1-437e-80b5-05e10ccc4b50-util" (OuterVolumeSpecName: "util") pod "b11048a2-12d1-437e-80b5-05e10ccc4b50" (UID: "b11048a2-12d1-437e-80b5-05e10ccc4b50"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:25:31 crc kubenswrapper[4841]: I0313 09:25:31.966354 4841 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b11048a2-12d1-437e-80b5-05e10ccc4b50-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:25:31 crc kubenswrapper[4841]: I0313 09:25:31.966410 4841 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b11048a2-12d1-437e-80b5-05e10ccc4b50-util\") on node \"crc\" DevicePath \"\"" Mar 13 09:25:31 crc kubenswrapper[4841]: I0313 09:25:31.966430 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5dfr\" (UniqueName: \"kubernetes.io/projected/b11048a2-12d1-437e-80b5-05e10ccc4b50-kube-api-access-t5dfr\") on node \"crc\" DevicePath \"\"" Mar 13 09:25:32 crc kubenswrapper[4841]: I0313 09:25:32.474344 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc" event={"ID":"b11048a2-12d1-437e-80b5-05e10ccc4b50","Type":"ContainerDied","Data":"a225f67456573e3ea2a05057f46679d1dd2459c955852af28c4d8a89d14c4dfc"} Mar 13 09:25:32 crc kubenswrapper[4841]: I0313 09:25:32.474407 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a225f67456573e3ea2a05057f46679d1dd2459c955852af28c4d8a89d14c4dfc" Mar 13 09:25:32 crc kubenswrapper[4841]: I0313 09:25:32.474428 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc" Mar 13 09:25:34 crc kubenswrapper[4841]: I0313 09:25:34.407638 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:25:34 crc kubenswrapper[4841]: I0313 09:25:34.407691 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:25:34 crc kubenswrapper[4841]: I0313 09:25:34.856909 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-w54ng"] Mar 13 09:25:34 crc kubenswrapper[4841]: E0313 09:25:34.861823 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a2c93c5-5f1b-41d3-92eb-9f91fcc15176" containerName="registry" Mar 13 09:25:34 crc kubenswrapper[4841]: I0313 09:25:34.861848 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a2c93c5-5f1b-41d3-92eb-9f91fcc15176" containerName="registry" Mar 13 09:25:34 crc kubenswrapper[4841]: E0313 09:25:34.861867 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b11048a2-12d1-437e-80b5-05e10ccc4b50" containerName="util" Mar 13 09:25:34 crc kubenswrapper[4841]: I0313 09:25:34.861876 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b11048a2-12d1-437e-80b5-05e10ccc4b50" containerName="util" Mar 13 09:25:34 crc kubenswrapper[4841]: E0313 09:25:34.861890 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b11048a2-12d1-437e-80b5-05e10ccc4b50" containerName="pull" Mar 13 
09:25:34 crc kubenswrapper[4841]: I0313 09:25:34.861898 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b11048a2-12d1-437e-80b5-05e10ccc4b50" containerName="pull" Mar 13 09:25:34 crc kubenswrapper[4841]: E0313 09:25:34.861914 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b11048a2-12d1-437e-80b5-05e10ccc4b50" containerName="extract" Mar 13 09:25:34 crc kubenswrapper[4841]: I0313 09:25:34.861922 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b11048a2-12d1-437e-80b5-05e10ccc4b50" containerName="extract" Mar 13 09:25:34 crc kubenswrapper[4841]: I0313 09:25:34.862059 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a2c93c5-5f1b-41d3-92eb-9f91fcc15176" containerName="registry" Mar 13 09:25:34 crc kubenswrapper[4841]: I0313 09:25:34.862075 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b11048a2-12d1-437e-80b5-05e10ccc4b50" containerName="extract" Mar 13 09:25:34 crc kubenswrapper[4841]: I0313 09:25:34.862622 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-w54ng" Mar 13 09:25:34 crc kubenswrapper[4841]: I0313 09:25:34.865682 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-w54ng"] Mar 13 09:25:34 crc kubenswrapper[4841]: I0313 09:25:34.868073 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 13 09:25:34 crc kubenswrapper[4841]: I0313 09:25:34.868141 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 13 09:25:34 crc kubenswrapper[4841]: I0313 09:25:34.868527 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-gr5v5" Mar 13 09:25:34 crc kubenswrapper[4841]: I0313 09:25:34.950710 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln8zw\" (UniqueName: \"kubernetes.io/projected/1422b359-9a6d-430e-8cb6-5cf498e32422-kube-api-access-ln8zw\") pod \"nmstate-operator-796d4cfff4-w54ng\" (UID: \"1422b359-9a6d-430e-8cb6-5cf498e32422\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-w54ng" Mar 13 09:25:35 crc kubenswrapper[4841]: I0313 09:25:35.052190 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln8zw\" (UniqueName: \"kubernetes.io/projected/1422b359-9a6d-430e-8cb6-5cf498e32422-kube-api-access-ln8zw\") pod \"nmstate-operator-796d4cfff4-w54ng\" (UID: \"1422b359-9a6d-430e-8cb6-5cf498e32422\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-w54ng" Mar 13 09:25:35 crc kubenswrapper[4841]: I0313 09:25:35.078845 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln8zw\" (UniqueName: \"kubernetes.io/projected/1422b359-9a6d-430e-8cb6-5cf498e32422-kube-api-access-ln8zw\") pod \"nmstate-operator-796d4cfff4-w54ng\" (UID: 
\"1422b359-9a6d-430e-8cb6-5cf498e32422\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-w54ng" Mar 13 09:25:35 crc kubenswrapper[4841]: I0313 09:25:35.180138 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-w54ng" Mar 13 09:25:35 crc kubenswrapper[4841]: I0313 09:25:35.612848 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-w54ng"] Mar 13 09:25:36 crc kubenswrapper[4841]: I0313 09:25:36.503549 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-w54ng" event={"ID":"1422b359-9a6d-430e-8cb6-5cf498e32422","Type":"ContainerStarted","Data":"aed067f0bf83d56a416c64b2bfe936253fd19136723d83395973cb185bdd5919"} Mar 13 09:25:38 crc kubenswrapper[4841]: I0313 09:25:38.526929 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-w54ng" event={"ID":"1422b359-9a6d-430e-8cb6-5cf498e32422","Type":"ContainerStarted","Data":"899d4405e39bf9cf307a941069142c65adf33002db18fd98813d39c07382122a"} Mar 13 09:25:38 crc kubenswrapper[4841]: I0313 09:25:38.552155 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-w54ng" podStartSLOduration=2.412990822 podStartE2EDuration="4.552125956s" podCreationTimestamp="2026-03-13 09:25:34 +0000 UTC" firstStartedPulling="2026-03-13 09:25:35.620567023 +0000 UTC m=+818.350467224" lastFinishedPulling="2026-03-13 09:25:37.759702167 +0000 UTC m=+820.489602358" observedRunningTime="2026-03-13 09:25:38.549014009 +0000 UTC m=+821.278914260" watchObservedRunningTime="2026-03-13 09:25:38.552125956 +0000 UTC m=+821.282026167" Mar 13 09:25:45 crc kubenswrapper[4841]: I0313 09:25:45.809007 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-c2r92"] Mar 13 09:25:45 crc kubenswrapper[4841]: I0313 
09:25:45.810463 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c2r92" Mar 13 09:25:45 crc kubenswrapper[4841]: I0313 09:25:45.813089 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-rpbcf" Mar 13 09:25:45 crc kubenswrapper[4841]: I0313 09:25:45.837919 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-c2r92"] Mar 13 09:25:45 crc kubenswrapper[4841]: I0313 09:25:45.843556 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-nhjzn"] Mar 13 09:25:45 crc kubenswrapper[4841]: I0313 09:25:45.844460 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-nhjzn" Mar 13 09:25:45 crc kubenswrapper[4841]: I0313 09:25:45.846434 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 13 09:25:45 crc kubenswrapper[4841]: I0313 09:25:45.851578 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-c69x9"] Mar 13 09:25:45 crc kubenswrapper[4841]: I0313 09:25:45.852496 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-c69x9" Mar 13 09:25:45 crc kubenswrapper[4841]: I0313 09:25:45.855082 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-nhjzn"] Mar 13 09:25:45 crc kubenswrapper[4841]: I0313 09:25:45.943831 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-4tq4s"] Mar 13 09:25:45 crc kubenswrapper[4841]: I0313 09:25:45.944482 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4tq4s" Mar 13 09:25:45 crc kubenswrapper[4841]: I0313 09:25:45.946625 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 13 09:25:45 crc kubenswrapper[4841]: I0313 09:25:45.946701 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 13 09:25:45 crc kubenswrapper[4841]: I0313 09:25:45.946752 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-7qnj9" Mar 13 09:25:45 crc kubenswrapper[4841]: I0313 09:25:45.956942 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-4tq4s"] Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.001194 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2-dbus-socket\") pod \"nmstate-handler-c69x9\" (UID: \"c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2\") " pod="openshift-nmstate/nmstate-handler-c69x9" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.001294 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk945\" (UniqueName: \"kubernetes.io/projected/c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2-kube-api-access-tk945\") pod \"nmstate-handler-c69x9\" (UID: \"c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2\") " pod="openshift-nmstate/nmstate-handler-c69x9" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.001408 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1080bc76-f294-4c2b-8a4b-165d657a4057-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-nhjzn\" (UID: \"1080bc76-f294-4c2b-8a4b-165d657a4057\") " 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-nhjzn" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.001450 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2-ovs-socket\") pod \"nmstate-handler-c69x9\" (UID: \"c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2\") " pod="openshift-nmstate/nmstate-handler-c69x9" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.001485 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s87c\" (UniqueName: \"kubernetes.io/projected/4b560358-f566-41b2-a5da-89b9b3c173f3-kube-api-access-8s87c\") pod \"nmstate-metrics-9b8c8685d-c2r92\" (UID: \"4b560358-f566-41b2-a5da-89b9b3c173f3\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c2r92" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.001502 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfwvm\" (UniqueName: \"kubernetes.io/projected/1080bc76-f294-4c2b-8a4b-165d657a4057-kube-api-access-kfwvm\") pod \"nmstate-webhook-5f558f5558-nhjzn\" (UID: \"1080bc76-f294-4c2b-8a4b-165d657a4057\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-nhjzn" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.001518 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2-nmstate-lock\") pod \"nmstate-handler-c69x9\" (UID: \"c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2\") " pod="openshift-nmstate/nmstate-handler-c69x9" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.103156 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlrcx\" (UniqueName: 
\"kubernetes.io/projected/80edf45c-fbb9-4761-995a-010a15e0b1dc-kube-api-access-jlrcx\") pod \"nmstate-console-plugin-86f58fcf4-4tq4s\" (UID: \"80edf45c-fbb9-4761-995a-010a15e0b1dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4tq4s" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.103249 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1080bc76-f294-4c2b-8a4b-165d657a4057-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-nhjzn\" (UID: \"1080bc76-f294-4c2b-8a4b-165d657a4057\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-nhjzn" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.103467 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2-ovs-socket\") pod \"nmstate-handler-c69x9\" (UID: \"c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2\") " pod="openshift-nmstate/nmstate-handler-c69x9" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.103504 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2-nmstate-lock\") pod \"nmstate-handler-c69x9\" (UID: \"c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2\") " pod="openshift-nmstate/nmstate-handler-c69x9" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.103529 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s87c\" (UniqueName: \"kubernetes.io/projected/4b560358-f566-41b2-a5da-89b9b3c173f3-kube-api-access-8s87c\") pod \"nmstate-metrics-9b8c8685d-c2r92\" (UID: \"4b560358-f566-41b2-a5da-89b9b3c173f3\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c2r92" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.103552 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfwvm\" (UniqueName: 
\"kubernetes.io/projected/1080bc76-f294-4c2b-8a4b-165d657a4057-kube-api-access-kfwvm\") pod \"nmstate-webhook-5f558f5558-nhjzn\" (UID: \"1080bc76-f294-4c2b-8a4b-165d657a4057\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-nhjzn" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.103573 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2-nmstate-lock\") pod \"nmstate-handler-c69x9\" (UID: \"c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2\") " pod="openshift-nmstate/nmstate-handler-c69x9" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.103581 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2-ovs-socket\") pod \"nmstate-handler-c69x9\" (UID: \"c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2\") " pod="openshift-nmstate/nmstate-handler-c69x9" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.103580 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/80edf45c-fbb9-4761-995a-010a15e0b1dc-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-4tq4s\" (UID: \"80edf45c-fbb9-4761-995a-010a15e0b1dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4tq4s" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.103712 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2-dbus-socket\") pod \"nmstate-handler-c69x9\" (UID: \"c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2\") " pod="openshift-nmstate/nmstate-handler-c69x9" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.103776 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/80edf45c-fbb9-4761-995a-010a15e0b1dc-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-4tq4s\" (UID: \"80edf45c-fbb9-4761-995a-010a15e0b1dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4tq4s" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.103871 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk945\" (UniqueName: \"kubernetes.io/projected/c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2-kube-api-access-tk945\") pod \"nmstate-handler-c69x9\" (UID: \"c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2\") " pod="openshift-nmstate/nmstate-handler-c69x9" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.104012 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2-dbus-socket\") pod \"nmstate-handler-c69x9\" (UID: \"c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2\") " pod="openshift-nmstate/nmstate-handler-c69x9" Mar 13 09:25:46 crc kubenswrapper[4841]: E0313 09:25:46.104282 4841 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 13 09:25:46 crc kubenswrapper[4841]: E0313 09:25:46.104357 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1080bc76-f294-4c2b-8a4b-165d657a4057-tls-key-pair podName:1080bc76-f294-4c2b-8a4b-165d657a4057 nodeName:}" failed. No retries permitted until 2026-03-13 09:25:46.604334418 +0000 UTC m=+829.334234689 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/1080bc76-f294-4c2b-8a4b-165d657a4057-tls-key-pair") pod "nmstate-webhook-5f558f5558-nhjzn" (UID: "1080bc76-f294-4c2b-8a4b-165d657a4057") : secret "openshift-nmstate-webhook" not found Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.110952 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-579bccdb4f-j5mt5"] Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.111775 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.119895 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-579bccdb4f-j5mt5"] Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.140419 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s87c\" (UniqueName: \"kubernetes.io/projected/4b560358-f566-41b2-a5da-89b9b3c173f3-kube-api-access-8s87c\") pod \"nmstate-metrics-9b8c8685d-c2r92\" (UID: \"4b560358-f566-41b2-a5da-89b9b3c173f3\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c2r92" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.143978 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk945\" (UniqueName: \"kubernetes.io/projected/c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2-kube-api-access-tk945\") pod \"nmstate-handler-c69x9\" (UID: \"c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2\") " pod="openshift-nmstate/nmstate-handler-c69x9" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.145051 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfwvm\" (UniqueName: \"kubernetes.io/projected/1080bc76-f294-4c2b-8a4b-165d657a4057-kube-api-access-kfwvm\") pod \"nmstate-webhook-5f558f5558-nhjzn\" (UID: \"1080bc76-f294-4c2b-8a4b-165d657a4057\") " 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-nhjzn" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.173998 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-c69x9" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.204887 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09d85512-6aae-4e76-a09a-240ff92930a4-service-ca\") pod \"console-579bccdb4f-j5mt5\" (UID: \"09d85512-6aae-4e76-a09a-240ff92930a4\") " pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.204932 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/80edf45c-fbb9-4761-995a-010a15e0b1dc-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-4tq4s\" (UID: \"80edf45c-fbb9-4761-995a-010a15e0b1dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4tq4s" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.204961 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09d85512-6aae-4e76-a09a-240ff92930a4-trusted-ca-bundle\") pod \"console-579bccdb4f-j5mt5\" (UID: \"09d85512-6aae-4e76-a09a-240ff92930a4\") " pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.205004 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09d85512-6aae-4e76-a09a-240ff92930a4-console-serving-cert\") pod \"console-579bccdb4f-j5mt5\" (UID: \"09d85512-6aae-4e76-a09a-240ff92930a4\") " pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.205029 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/80edf45c-fbb9-4761-995a-010a15e0b1dc-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-4tq4s\" (UID: \"80edf45c-fbb9-4761-995a-010a15e0b1dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4tq4s" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.205053 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09d85512-6aae-4e76-a09a-240ff92930a4-console-oauth-config\") pod \"console-579bccdb4f-j5mt5\" (UID: \"09d85512-6aae-4e76-a09a-240ff92930a4\") " pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.205079 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09d85512-6aae-4e76-a09a-240ff92930a4-oauth-serving-cert\") pod \"console-579bccdb4f-j5mt5\" (UID: \"09d85512-6aae-4e76-a09a-240ff92930a4\") " pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.205106 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09d85512-6aae-4e76-a09a-240ff92930a4-console-config\") pod \"console-579bccdb4f-j5mt5\" (UID: \"09d85512-6aae-4e76-a09a-240ff92930a4\") " pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.205139 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkqd6\" (UniqueName: \"kubernetes.io/projected/09d85512-6aae-4e76-a09a-240ff92930a4-kube-api-access-zkqd6\") pod \"console-579bccdb4f-j5mt5\" (UID: \"09d85512-6aae-4e76-a09a-240ff92930a4\") " pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc 
kubenswrapper[4841]: I0313 09:25:46.205177 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlrcx\" (UniqueName: \"kubernetes.io/projected/80edf45c-fbb9-4761-995a-010a15e0b1dc-kube-api-access-jlrcx\") pod \"nmstate-console-plugin-86f58fcf4-4tq4s\" (UID: \"80edf45c-fbb9-4761-995a-010a15e0b1dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4tq4s" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.206213 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/80edf45c-fbb9-4761-995a-010a15e0b1dc-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-4tq4s\" (UID: \"80edf45c-fbb9-4761-995a-010a15e0b1dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4tq4s" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.209748 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/80edf45c-fbb9-4761-995a-010a15e0b1dc-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-4tq4s\" (UID: \"80edf45c-fbb9-4761-995a-010a15e0b1dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4tq4s" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.220323 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlrcx\" (UniqueName: \"kubernetes.io/projected/80edf45c-fbb9-4761-995a-010a15e0b1dc-kube-api-access-jlrcx\") pod \"nmstate-console-plugin-86f58fcf4-4tq4s\" (UID: \"80edf45c-fbb9-4761-995a-010a15e0b1dc\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4tq4s" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.260179 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4tq4s" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.305575 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09d85512-6aae-4e76-a09a-240ff92930a4-trusted-ca-bundle\") pod \"console-579bccdb4f-j5mt5\" (UID: \"09d85512-6aae-4e76-a09a-240ff92930a4\") " pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.305626 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09d85512-6aae-4e76-a09a-240ff92930a4-console-serving-cert\") pod \"console-579bccdb4f-j5mt5\" (UID: \"09d85512-6aae-4e76-a09a-240ff92930a4\") " pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.305641 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09d85512-6aae-4e76-a09a-240ff92930a4-console-oauth-config\") pod \"console-579bccdb4f-j5mt5\" (UID: \"09d85512-6aae-4e76-a09a-240ff92930a4\") " pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.305662 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09d85512-6aae-4e76-a09a-240ff92930a4-oauth-serving-cert\") pod \"console-579bccdb4f-j5mt5\" (UID: \"09d85512-6aae-4e76-a09a-240ff92930a4\") " pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.305696 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09d85512-6aae-4e76-a09a-240ff92930a4-console-config\") pod \"console-579bccdb4f-j5mt5\" (UID: 
\"09d85512-6aae-4e76-a09a-240ff92930a4\") " pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.305714 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkqd6\" (UniqueName: \"kubernetes.io/projected/09d85512-6aae-4e76-a09a-240ff92930a4-kube-api-access-zkqd6\") pod \"console-579bccdb4f-j5mt5\" (UID: \"09d85512-6aae-4e76-a09a-240ff92930a4\") " pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.305764 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09d85512-6aae-4e76-a09a-240ff92930a4-service-ca\") pod \"console-579bccdb4f-j5mt5\" (UID: \"09d85512-6aae-4e76-a09a-240ff92930a4\") " pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.306440 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09d85512-6aae-4e76-a09a-240ff92930a4-service-ca\") pod \"console-579bccdb4f-j5mt5\" (UID: \"09d85512-6aae-4e76-a09a-240ff92930a4\") " pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.306644 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09d85512-6aae-4e76-a09a-240ff92930a4-trusted-ca-bundle\") pod \"console-579bccdb4f-j5mt5\" (UID: \"09d85512-6aae-4e76-a09a-240ff92930a4\") " pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.307202 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09d85512-6aae-4e76-a09a-240ff92930a4-oauth-serving-cert\") pod \"console-579bccdb4f-j5mt5\" (UID: \"09d85512-6aae-4e76-a09a-240ff92930a4\") " 
pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.307813 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09d85512-6aae-4e76-a09a-240ff92930a4-console-config\") pod \"console-579bccdb4f-j5mt5\" (UID: \"09d85512-6aae-4e76-a09a-240ff92930a4\") " pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.310619 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09d85512-6aae-4e76-a09a-240ff92930a4-console-oauth-config\") pod \"console-579bccdb4f-j5mt5\" (UID: \"09d85512-6aae-4e76-a09a-240ff92930a4\") " pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.310737 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09d85512-6aae-4e76-a09a-240ff92930a4-console-serving-cert\") pod \"console-579bccdb4f-j5mt5\" (UID: \"09d85512-6aae-4e76-a09a-240ff92930a4\") " pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.367206 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkqd6\" (UniqueName: \"kubernetes.io/projected/09d85512-6aae-4e76-a09a-240ff92930a4-kube-api-access-zkqd6\") pod \"console-579bccdb4f-j5mt5\" (UID: \"09d85512-6aae-4e76-a09a-240ff92930a4\") " pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.425906 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c2r92" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.440391 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.586159 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-c69x9" event={"ID":"c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2","Type":"ContainerStarted","Data":"cbac9fe53814e098ecf2c6edb64d35c70a3f72fbb14e51fcb0250e2cc1fd6e0e"} Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.608023 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1080bc76-f294-4c2b-8a4b-165d657a4057-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-nhjzn\" (UID: \"1080bc76-f294-4c2b-8a4b-165d657a4057\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-nhjzn" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.612625 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1080bc76-f294-4c2b-8a4b-165d657a4057-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-nhjzn\" (UID: \"1080bc76-f294-4c2b-8a4b-165d657a4057\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-nhjzn" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.651071 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-579bccdb4f-j5mt5"] Mar 13 09:25:46 crc kubenswrapper[4841]: W0313 09:25:46.656342 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09d85512_6aae_4e76_a09a_240ff92930a4.slice/crio-30e60664a04cbafdd785e4cc136354710e2bb7fac036354898a5d57e17f859c8 WatchSource:0}: Error finding container 30e60664a04cbafdd785e4cc136354710e2bb7fac036354898a5d57e17f859c8: Status 404 returned error can't find the container with id 30e60664a04cbafdd785e4cc136354710e2bb7fac036354898a5d57e17f859c8 Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.680178 4841 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-4tq4s"] Mar 13 09:25:46 crc kubenswrapper[4841]: W0313 09:25:46.687281 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80edf45c_fbb9_4761_995a_010a15e0b1dc.slice/crio-29549a0f3ce36dc8b221caca9622c42a01d48618373f94c127f26eb51cc35e56 WatchSource:0}: Error finding container 29549a0f3ce36dc8b221caca9622c42a01d48618373f94c127f26eb51cc35e56: Status 404 returned error can't find the container with id 29549a0f3ce36dc8b221caca9622c42a01d48618373f94c127f26eb51cc35e56 Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.758832 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-nhjzn" Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.840135 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-c2r92"] Mar 13 09:25:46 crc kubenswrapper[4841]: W0313 09:25:46.851698 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b560358_f566_41b2_a5da_89b9b3c173f3.slice/crio-4172e1a7513ea76600eab9e1802eba08ed42b78595787af43f47e02b81731d31 WatchSource:0}: Error finding container 4172e1a7513ea76600eab9e1802eba08ed42b78595787af43f47e02b81731d31: Status 404 returned error can't find the container with id 4172e1a7513ea76600eab9e1802eba08ed42b78595787af43f47e02b81731d31 Mar 13 09:25:46 crc kubenswrapper[4841]: I0313 09:25:46.978611 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-nhjzn"] Mar 13 09:25:47 crc kubenswrapper[4841]: I0313 09:25:47.596643 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c2r92" 
event={"ID":"4b560358-f566-41b2-a5da-89b9b3c173f3","Type":"ContainerStarted","Data":"4172e1a7513ea76600eab9e1802eba08ed42b78595787af43f47e02b81731d31"} Mar 13 09:25:47 crc kubenswrapper[4841]: I0313 09:25:47.598134 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-nhjzn" event={"ID":"1080bc76-f294-4c2b-8a4b-165d657a4057","Type":"ContainerStarted","Data":"15ea35d73907a448c78f515ed722589394e982bc28de61dbf35a15b0c902c9fb"} Mar 13 09:25:47 crc kubenswrapper[4841]: I0313 09:25:47.600024 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-579bccdb4f-j5mt5" event={"ID":"09d85512-6aae-4e76-a09a-240ff92930a4","Type":"ContainerStarted","Data":"57830f89f57bd13d68279e93c111628fb285a1a95da812303bd6f6539ac0645f"} Mar 13 09:25:47 crc kubenswrapper[4841]: I0313 09:25:47.600077 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-579bccdb4f-j5mt5" event={"ID":"09d85512-6aae-4e76-a09a-240ff92930a4","Type":"ContainerStarted","Data":"30e60664a04cbafdd785e4cc136354710e2bb7fac036354898a5d57e17f859c8"} Mar 13 09:25:47 crc kubenswrapper[4841]: I0313 09:25:47.603217 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4tq4s" event={"ID":"80edf45c-fbb9-4761-995a-010a15e0b1dc","Type":"ContainerStarted","Data":"29549a0f3ce36dc8b221caca9622c42a01d48618373f94c127f26eb51cc35e56"} Mar 13 09:25:47 crc kubenswrapper[4841]: I0313 09:25:47.633491 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-579bccdb4f-j5mt5" podStartSLOduration=1.633461351 podStartE2EDuration="1.633461351s" podCreationTimestamp="2026-03-13 09:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:25:47.630431186 +0000 UTC m=+830.360331417" watchObservedRunningTime="2026-03-13 09:25:47.633461351 +0000 
UTC m=+830.363361582" Mar 13 09:25:49 crc kubenswrapper[4841]: I0313 09:25:49.617077 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-nhjzn" event={"ID":"1080bc76-f294-4c2b-8a4b-165d657a4057","Type":"ContainerStarted","Data":"244fc6f35d27e5071897e31956e49e1a201dd17ed327c5dd26e843247a922c27"} Mar 13 09:25:49 crc kubenswrapper[4841]: I0313 09:25:49.617494 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-nhjzn" Mar 13 09:25:49 crc kubenswrapper[4841]: I0313 09:25:49.619936 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-c69x9" event={"ID":"c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2","Type":"ContainerStarted","Data":"4387183bac6bb2dc8fba35c0987fec381f4906dd7def1accc2c2489e6b38cd5c"} Mar 13 09:25:49 crc kubenswrapper[4841]: I0313 09:25:49.620061 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-c69x9" Mar 13 09:25:49 crc kubenswrapper[4841]: I0313 09:25:49.623204 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4tq4s" event={"ID":"80edf45c-fbb9-4761-995a-010a15e0b1dc","Type":"ContainerStarted","Data":"2c0bec8cc8b45d5350103e9d004e7efcf88924a6beb65f1fa83ff8ea61e44ac0"} Mar 13 09:25:49 crc kubenswrapper[4841]: I0313 09:25:49.625074 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c2r92" event={"ID":"4b560358-f566-41b2-a5da-89b9b3c173f3","Type":"ContainerStarted","Data":"77f6722a3d8cf8f4278fd61fd8bd18e2198b1d65b0d54aabe34366ab63ac26ec"} Mar 13 09:25:49 crc kubenswrapper[4841]: I0313 09:25:49.657853 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-4tq4s" podStartSLOduration=2.07836018 podStartE2EDuration="4.657827207s" podCreationTimestamp="2026-03-13 09:25:45 
+0000 UTC" firstStartedPulling="2026-03-13 09:25:46.692541336 +0000 UTC m=+829.422441527" lastFinishedPulling="2026-03-13 09:25:49.272008373 +0000 UTC m=+832.001908554" observedRunningTime="2026-03-13 09:25:49.654980208 +0000 UTC m=+832.384880449" watchObservedRunningTime="2026-03-13 09:25:49.657827207 +0000 UTC m=+832.387727428" Mar 13 09:25:49 crc kubenswrapper[4841]: I0313 09:25:49.658855 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-nhjzn" podStartSLOduration=2.353784453 podStartE2EDuration="4.658844968s" podCreationTimestamp="2026-03-13 09:25:45 +0000 UTC" firstStartedPulling="2026-03-13 09:25:46.986242453 +0000 UTC m=+829.716142644" lastFinishedPulling="2026-03-13 09:25:49.291302978 +0000 UTC m=+832.021203159" observedRunningTime="2026-03-13 09:25:49.640601997 +0000 UTC m=+832.370502218" watchObservedRunningTime="2026-03-13 09:25:49.658844968 +0000 UTC m=+832.388745189" Mar 13 09:25:49 crc kubenswrapper[4841]: I0313 09:25:49.687341 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-c69x9" podStartSLOduration=1.6087885320000002 podStartE2EDuration="4.687312751s" podCreationTimestamp="2026-03-13 09:25:45 +0000 UTC" firstStartedPulling="2026-03-13 09:25:46.195252999 +0000 UTC m=+828.925153190" lastFinishedPulling="2026-03-13 09:25:49.273777208 +0000 UTC m=+832.003677409" observedRunningTime="2026-03-13 09:25:49.681669104 +0000 UTC m=+832.411569315" watchObservedRunningTime="2026-03-13 09:25:49.687312751 +0000 UTC m=+832.417212982" Mar 13 09:25:52 crc kubenswrapper[4841]: I0313 09:25:52.650197 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c2r92" event={"ID":"4b560358-f566-41b2-a5da-89b9b3c173f3","Type":"ContainerStarted","Data":"641a09f8066d9110cfcd6aca84d07399559fed182c07891c71995b1ce29e8f39"} Mar 13 09:25:52 crc kubenswrapper[4841]: I0313 09:25:52.679423 4841 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c2r92" podStartSLOduration=3.006819964 podStartE2EDuration="7.679346499s" podCreationTimestamp="2026-03-13 09:25:45 +0000 UTC" firstStartedPulling="2026-03-13 09:25:46.854051679 +0000 UTC m=+829.583951890" lastFinishedPulling="2026-03-13 09:25:51.526578234 +0000 UTC m=+834.256478425" observedRunningTime="2026-03-13 09:25:52.670950316 +0000 UTC m=+835.400850527" watchObservedRunningTime="2026-03-13 09:25:52.679346499 +0000 UTC m=+835.409246730" Mar 13 09:25:56 crc kubenswrapper[4841]: I0313 09:25:56.210372 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-c69x9" Mar 13 09:25:56 crc kubenswrapper[4841]: I0313 09:25:56.440544 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:56 crc kubenswrapper[4841]: I0313 09:25:56.440613 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:56 crc kubenswrapper[4841]: I0313 09:25:56.448622 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:56 crc kubenswrapper[4841]: I0313 09:25:56.695850 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-579bccdb4f-j5mt5" Mar 13 09:25:56 crc kubenswrapper[4841]: I0313 09:25:56.786110 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-v6vhc"] Mar 13 09:25:58 crc kubenswrapper[4841]: I0313 09:25:58.550917 4841 scope.go:117] "RemoveContainer" containerID="6c0aa4158655998115dffb79275b474bfa6a52419a51c1461a8691c2c590ffc0" Mar 13 09:26:00 crc kubenswrapper[4841]: I0313 09:26:00.133353 4841 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29556566-vw6lq"] Mar 13 09:26:00 crc kubenswrapper[4841]: I0313 09:26:00.136137 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556566-vw6lq" Mar 13 09:26:00 crc kubenswrapper[4841]: I0313 09:26:00.139812 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 09:26:00 crc kubenswrapper[4841]: I0313 09:26:00.141728 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 09:26:00 crc kubenswrapper[4841]: I0313 09:26:00.144311 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 09:26:00 crc kubenswrapper[4841]: I0313 09:26:00.147570 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556566-vw6lq"] Mar 13 09:26:00 crc kubenswrapper[4841]: I0313 09:26:00.219616 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br549\" (UniqueName: \"kubernetes.io/projected/0ce7ba8a-b324-419e-92bb-f5a845a15025-kube-api-access-br549\") pod \"auto-csr-approver-29556566-vw6lq\" (UID: \"0ce7ba8a-b324-419e-92bb-f5a845a15025\") " pod="openshift-infra/auto-csr-approver-29556566-vw6lq" Mar 13 09:26:00 crc kubenswrapper[4841]: I0313 09:26:00.322185 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br549\" (UniqueName: \"kubernetes.io/projected/0ce7ba8a-b324-419e-92bb-f5a845a15025-kube-api-access-br549\") pod \"auto-csr-approver-29556566-vw6lq\" (UID: \"0ce7ba8a-b324-419e-92bb-f5a845a15025\") " pod="openshift-infra/auto-csr-approver-29556566-vw6lq" Mar 13 09:26:00 crc kubenswrapper[4841]: I0313 09:26:00.359231 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br549\" (UniqueName: 
\"kubernetes.io/projected/0ce7ba8a-b324-419e-92bb-f5a845a15025-kube-api-access-br549\") pod \"auto-csr-approver-29556566-vw6lq\" (UID: \"0ce7ba8a-b324-419e-92bb-f5a845a15025\") " pod="openshift-infra/auto-csr-approver-29556566-vw6lq" Mar 13 09:26:00 crc kubenswrapper[4841]: I0313 09:26:00.512035 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556566-vw6lq" Mar 13 09:26:01 crc kubenswrapper[4841]: I0313 09:26:01.000924 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556566-vw6lq"] Mar 13 09:26:01 crc kubenswrapper[4841]: W0313 09:26:01.015403 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ce7ba8a_b324_419e_92bb_f5a845a15025.slice/crio-2954f29500e01b9ca0f4dfb8df1563bde61fb1d119ff8c87706ce82e134052c1 WatchSource:0}: Error finding container 2954f29500e01b9ca0f4dfb8df1563bde61fb1d119ff8c87706ce82e134052c1: Status 404 returned error can't find the container with id 2954f29500e01b9ca0f4dfb8df1563bde61fb1d119ff8c87706ce82e134052c1 Mar 13 09:26:01 crc kubenswrapper[4841]: I0313 09:26:01.725128 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556566-vw6lq" event={"ID":"0ce7ba8a-b324-419e-92bb-f5a845a15025","Type":"ContainerStarted","Data":"2954f29500e01b9ca0f4dfb8df1563bde61fb1d119ff8c87706ce82e134052c1"} Mar 13 09:26:02 crc kubenswrapper[4841]: I0313 09:26:02.734825 4841 generic.go:334] "Generic (PLEG): container finished" podID="0ce7ba8a-b324-419e-92bb-f5a845a15025" containerID="d8cf0620e340338defdbf5ee4bad93c9ac30dbf5c0a4c99a8655ab88289210bb" exitCode=0 Mar 13 09:26:02 crc kubenswrapper[4841]: I0313 09:26:02.735342 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556566-vw6lq" 
event={"ID":"0ce7ba8a-b324-419e-92bb-f5a845a15025","Type":"ContainerDied","Data":"d8cf0620e340338defdbf5ee4bad93c9ac30dbf5c0a4c99a8655ab88289210bb"} Mar 13 09:26:04 crc kubenswrapper[4841]: I0313 09:26:04.136015 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556566-vw6lq" Mar 13 09:26:04 crc kubenswrapper[4841]: I0313 09:26:04.175231 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br549\" (UniqueName: \"kubernetes.io/projected/0ce7ba8a-b324-419e-92bb-f5a845a15025-kube-api-access-br549\") pod \"0ce7ba8a-b324-419e-92bb-f5a845a15025\" (UID: \"0ce7ba8a-b324-419e-92bb-f5a845a15025\") " Mar 13 09:26:04 crc kubenswrapper[4841]: I0313 09:26:04.184630 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ce7ba8a-b324-419e-92bb-f5a845a15025-kube-api-access-br549" (OuterVolumeSpecName: "kube-api-access-br549") pod "0ce7ba8a-b324-419e-92bb-f5a845a15025" (UID: "0ce7ba8a-b324-419e-92bb-f5a845a15025"). InnerVolumeSpecName "kube-api-access-br549". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:26:04 crc kubenswrapper[4841]: I0313 09:26:04.277160 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br549\" (UniqueName: \"kubernetes.io/projected/0ce7ba8a-b324-419e-92bb-f5a845a15025-kube-api-access-br549\") on node \"crc\" DevicePath \"\"" Mar 13 09:26:04 crc kubenswrapper[4841]: I0313 09:26:04.407793 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:26:04 crc kubenswrapper[4841]: I0313 09:26:04.407867 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:26:04 crc kubenswrapper[4841]: I0313 09:26:04.407914 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:26:04 crc kubenswrapper[4841]: I0313 09:26:04.408398 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ed3bccb1da12fcd7dcfabd48b6eee04f275c5f16e821a0e4d8dce433f764913"} pod="openshift-machine-config-operator/machine-config-daemon-h227v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 09:26:04 crc kubenswrapper[4841]: I0313 09:26:04.408466 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" 
containerName="machine-config-daemon" containerID="cri-o://5ed3bccb1da12fcd7dcfabd48b6eee04f275c5f16e821a0e4d8dce433f764913" gracePeriod=600 Mar 13 09:26:04 crc kubenswrapper[4841]: I0313 09:26:04.751082 4841 generic.go:334] "Generic (PLEG): container finished" podID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerID="5ed3bccb1da12fcd7dcfabd48b6eee04f275c5f16e821a0e4d8dce433f764913" exitCode=0 Mar 13 09:26:04 crc kubenswrapper[4841]: I0313 09:26:04.751141 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerDied","Data":"5ed3bccb1da12fcd7dcfabd48b6eee04f275c5f16e821a0e4d8dce433f764913"} Mar 13 09:26:04 crc kubenswrapper[4841]: I0313 09:26:04.751166 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerStarted","Data":"6491cd972e8f18473231b5b2215720345c96ab2a0337886960a5d983df3b0e59"} Mar 13 09:26:04 crc kubenswrapper[4841]: I0313 09:26:04.751183 4841 scope.go:117] "RemoveContainer" containerID="379cc8a3a48d5aedfd454910bea9e163183fcf7244dc7566a308379fd2d7c084" Mar 13 09:26:04 crc kubenswrapper[4841]: I0313 09:26:04.755129 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556566-vw6lq" event={"ID":"0ce7ba8a-b324-419e-92bb-f5a845a15025","Type":"ContainerDied","Data":"2954f29500e01b9ca0f4dfb8df1563bde61fb1d119ff8c87706ce82e134052c1"} Mar 13 09:26:04 crc kubenswrapper[4841]: I0313 09:26:04.755150 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2954f29500e01b9ca0f4dfb8df1563bde61fb1d119ff8c87706ce82e134052c1" Mar 13 09:26:04 crc kubenswrapper[4841]: I0313 09:26:04.755188 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556566-vw6lq" Mar 13 09:26:05 crc kubenswrapper[4841]: I0313 09:26:05.182945 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556560-ckrz8"] Mar 13 09:26:05 crc kubenswrapper[4841]: I0313 09:26:05.187463 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556560-ckrz8"] Mar 13 09:26:06 crc kubenswrapper[4841]: I0313 09:26:06.007990 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d192b4a7-053d-4330-8b96-7f96ed70ea05" path="/var/lib/kubelet/pods/d192b4a7-053d-4330-8b96-7f96ed70ea05/volumes" Mar 13 09:26:06 crc kubenswrapper[4841]: I0313 09:26:06.775737 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-nhjzn" Mar 13 09:26:13 crc kubenswrapper[4841]: I0313 09:26:13.710653 4841 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 09:26:20 crc kubenswrapper[4841]: I0313 09:26:20.936753 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s"] Mar 13 09:26:20 crc kubenswrapper[4841]: E0313 09:26:20.938078 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce7ba8a-b324-419e-92bb-f5a845a15025" containerName="oc" Mar 13 09:26:20 crc kubenswrapper[4841]: I0313 09:26:20.938102 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce7ba8a-b324-419e-92bb-f5a845a15025" containerName="oc" Mar 13 09:26:20 crc kubenswrapper[4841]: I0313 09:26:20.938369 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ce7ba8a-b324-419e-92bb-f5a845a15025" containerName="oc" Mar 13 09:26:20 crc kubenswrapper[4841]: I0313 09:26:20.939757 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s" Mar 13 09:26:20 crc kubenswrapper[4841]: I0313 09:26:20.942755 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 13 09:26:20 crc kubenswrapper[4841]: I0313 09:26:20.944806 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s"] Mar 13 09:26:21 crc kubenswrapper[4841]: I0313 09:26:21.022182 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnzmg\" (UniqueName: \"kubernetes.io/projected/8b79260a-a276-45fa-abfd-5d471f82142a-kube-api-access-dnzmg\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s\" (UID: \"8b79260a-a276-45fa-abfd-5d471f82142a\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s" Mar 13 09:26:21 crc kubenswrapper[4841]: I0313 09:26:21.022629 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b79260a-a276-45fa-abfd-5d471f82142a-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s\" (UID: \"8b79260a-a276-45fa-abfd-5d471f82142a\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s" Mar 13 09:26:21 crc kubenswrapper[4841]: I0313 09:26:21.022739 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b79260a-a276-45fa-abfd-5d471f82142a-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s\" (UID: \"8b79260a-a276-45fa-abfd-5d471f82142a\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s" Mar 13 09:26:21 crc kubenswrapper[4841]: 
I0313 09:26:21.124123 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b79260a-a276-45fa-abfd-5d471f82142a-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s\" (UID: \"8b79260a-a276-45fa-abfd-5d471f82142a\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s" Mar 13 09:26:21 crc kubenswrapper[4841]: I0313 09:26:21.124192 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnzmg\" (UniqueName: \"kubernetes.io/projected/8b79260a-a276-45fa-abfd-5d471f82142a-kube-api-access-dnzmg\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s\" (UID: \"8b79260a-a276-45fa-abfd-5d471f82142a\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s" Mar 13 09:26:21 crc kubenswrapper[4841]: I0313 09:26:21.124239 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b79260a-a276-45fa-abfd-5d471f82142a-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s\" (UID: \"8b79260a-a276-45fa-abfd-5d471f82142a\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s" Mar 13 09:26:21 crc kubenswrapper[4841]: I0313 09:26:21.124694 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b79260a-a276-45fa-abfd-5d471f82142a-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s\" (UID: \"8b79260a-a276-45fa-abfd-5d471f82142a\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s" Mar 13 09:26:21 crc kubenswrapper[4841]: I0313 09:26:21.124694 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/8b79260a-a276-45fa-abfd-5d471f82142a-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s\" (UID: \"8b79260a-a276-45fa-abfd-5d471f82142a\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s" Mar 13 09:26:21 crc kubenswrapper[4841]: I0313 09:26:21.148911 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnzmg\" (UniqueName: \"kubernetes.io/projected/8b79260a-a276-45fa-abfd-5d471f82142a-kube-api-access-dnzmg\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s\" (UID: \"8b79260a-a276-45fa-abfd-5d471f82142a\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s" Mar 13 09:26:21 crc kubenswrapper[4841]: I0313 09:26:21.269605 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s" Mar 13 09:26:21 crc kubenswrapper[4841]: I0313 09:26:21.568325 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s"] Mar 13 09:26:21 crc kubenswrapper[4841]: W0313 09:26:21.577788 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b79260a_a276_45fa_abfd_5d471f82142a.slice/crio-f5b66634b7c8ba2b382c34b7a463776e5cfa27bebc48e7539800ae07271c98dc WatchSource:0}: Error finding container f5b66634b7c8ba2b382c34b7a463776e5cfa27bebc48e7539800ae07271c98dc: Status 404 returned error can't find the container with id f5b66634b7c8ba2b382c34b7a463776e5cfa27bebc48e7539800ae07271c98dc Mar 13 09:26:21 crc kubenswrapper[4841]: I0313 09:26:21.840135 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-v6vhc" podUID="60193856-8a3f-4ce0-b79d-44e58de19b06" containerName="console" 
containerID="cri-o://7ebd0bf34b0eb12d099c0e9b051e9dcccfc5a04f9e683dae8b7ee332d28829f7" gracePeriod=15 Mar 13 09:26:21 crc kubenswrapper[4841]: I0313 09:26:21.890730 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s" event={"ID":"8b79260a-a276-45fa-abfd-5d471f82142a","Type":"ContainerStarted","Data":"f5b66634b7c8ba2b382c34b7a463776e5cfa27bebc48e7539800ae07271c98dc"} Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.739709 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-v6vhc_60193856-8a3f-4ce0-b79d-44e58de19b06/console/0.log" Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.740019 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.864197 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/60193856-8a3f-4ce0-b79d-44e58de19b06-console-serving-cert\") pod \"60193856-8a3f-4ce0-b79d-44e58de19b06\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.864320 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-trusted-ca-bundle\") pod \"60193856-8a3f-4ce0-b79d-44e58de19b06\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.864368 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-oauth-serving-cert\") pod \"60193856-8a3f-4ce0-b79d-44e58de19b06\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " Mar 13 09:26:22 crc 
kubenswrapper[4841]: I0313 09:26:22.864481 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-service-ca\") pod \"60193856-8a3f-4ce0-b79d-44e58de19b06\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.864560 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/60193856-8a3f-4ce0-b79d-44e58de19b06-console-oauth-config\") pod \"60193856-8a3f-4ce0-b79d-44e58de19b06\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.864606 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-console-config\") pod \"60193856-8a3f-4ce0-b79d-44e58de19b06\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.864690 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m29zc\" (UniqueName: \"kubernetes.io/projected/60193856-8a3f-4ce0-b79d-44e58de19b06-kube-api-access-m29zc\") pod \"60193856-8a3f-4ce0-b79d-44e58de19b06\" (UID: \"60193856-8a3f-4ce0-b79d-44e58de19b06\") " Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.866614 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "60193856-8a3f-4ce0-b79d-44e58de19b06" (UID: "60193856-8a3f-4ce0-b79d-44e58de19b06"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.866973 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-service-ca" (OuterVolumeSpecName: "service-ca") pod "60193856-8a3f-4ce0-b79d-44e58de19b06" (UID: "60193856-8a3f-4ce0-b79d-44e58de19b06"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.867336 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-console-config" (OuterVolumeSpecName: "console-config") pod "60193856-8a3f-4ce0-b79d-44e58de19b06" (UID: "60193856-8a3f-4ce0-b79d-44e58de19b06"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.868103 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "60193856-8a3f-4ce0-b79d-44e58de19b06" (UID: "60193856-8a3f-4ce0-b79d-44e58de19b06"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.873638 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60193856-8a3f-4ce0-b79d-44e58de19b06-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "60193856-8a3f-4ce0-b79d-44e58de19b06" (UID: "60193856-8a3f-4ce0-b79d-44e58de19b06"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.873972 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60193856-8a3f-4ce0-b79d-44e58de19b06-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "60193856-8a3f-4ce0-b79d-44e58de19b06" (UID: "60193856-8a3f-4ce0-b79d-44e58de19b06"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.875046 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60193856-8a3f-4ce0-b79d-44e58de19b06-kube-api-access-m29zc" (OuterVolumeSpecName: "kube-api-access-m29zc") pod "60193856-8a3f-4ce0-b79d-44e58de19b06" (UID: "60193856-8a3f-4ce0-b79d-44e58de19b06"). InnerVolumeSpecName "kube-api-access-m29zc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.903943 4841 generic.go:334] "Generic (PLEG): container finished" podID="8b79260a-a276-45fa-abfd-5d471f82142a" containerID="149860592a2036207fcce7a060b2df660643650dbf605c542f0da1393039fd1b" exitCode=0 Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.904011 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s" event={"ID":"8b79260a-a276-45fa-abfd-5d471f82142a","Type":"ContainerDied","Data":"149860592a2036207fcce7a060b2df660643650dbf605c542f0da1393039fd1b"} Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.911406 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-v6vhc_60193856-8a3f-4ce0-b79d-44e58de19b06/console/0.log" Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.911448 4841 generic.go:334] "Generic (PLEG): container finished" podID="60193856-8a3f-4ce0-b79d-44e58de19b06" 
containerID="7ebd0bf34b0eb12d099c0e9b051e9dcccfc5a04f9e683dae8b7ee332d28829f7" exitCode=2 Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.911475 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v6vhc" event={"ID":"60193856-8a3f-4ce0-b79d-44e58de19b06","Type":"ContainerDied","Data":"7ebd0bf34b0eb12d099c0e9b051e9dcccfc5a04f9e683dae8b7ee332d28829f7"} Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.911495 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v6vhc" event={"ID":"60193856-8a3f-4ce0-b79d-44e58de19b06","Type":"ContainerDied","Data":"00fc187cb3ba96d7fcaa5d61f31452043338a7de0e9d2814285f512a8d79f9c3"} Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.911514 4841 scope.go:117] "RemoveContainer" containerID="7ebd0bf34b0eb12d099c0e9b051e9dcccfc5a04f9e683dae8b7ee332d28829f7" Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.911616 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-v6vhc" Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.964019 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-v6vhc"] Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.966204 4841 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.966234 4841 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/60193856-8a3f-4ce0-b79d-44e58de19b06-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.966247 4841 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-console-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.966258 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m29zc\" (UniqueName: \"kubernetes.io/projected/60193856-8a3f-4ce0-b79d-44e58de19b06-kube-api-access-m29zc\") on node \"crc\" DevicePath \"\"" Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.966301 4841 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/60193856-8a3f-4ce0-b79d-44e58de19b06-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.966314 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.966324 4841 reconciler_common.go:293] "Volume detached for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/60193856-8a3f-4ce0-b79d-44e58de19b06-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.968461 4841 scope.go:117] "RemoveContainer" containerID="7ebd0bf34b0eb12d099c0e9b051e9dcccfc5a04f9e683dae8b7ee332d28829f7" Mar 13 09:26:22 crc kubenswrapper[4841]: E0313 09:26:22.969086 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ebd0bf34b0eb12d099c0e9b051e9dcccfc5a04f9e683dae8b7ee332d28829f7\": container with ID starting with 7ebd0bf34b0eb12d099c0e9b051e9dcccfc5a04f9e683dae8b7ee332d28829f7 not found: ID does not exist" containerID="7ebd0bf34b0eb12d099c0e9b051e9dcccfc5a04f9e683dae8b7ee332d28829f7" Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.969119 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ebd0bf34b0eb12d099c0e9b051e9dcccfc5a04f9e683dae8b7ee332d28829f7"} err="failed to get container status \"7ebd0bf34b0eb12d099c0e9b051e9dcccfc5a04f9e683dae8b7ee332d28829f7\": rpc error: code = NotFound desc = could not find container \"7ebd0bf34b0eb12d099c0e9b051e9dcccfc5a04f9e683dae8b7ee332d28829f7\": container with ID starting with 7ebd0bf34b0eb12d099c0e9b051e9dcccfc5a04f9e683dae8b7ee332d28829f7 not found: ID does not exist" Mar 13 09:26:22 crc kubenswrapper[4841]: I0313 09:26:22.970355 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-v6vhc"] Mar 13 09:26:23 crc kubenswrapper[4841]: I0313 09:26:23.287103 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dcmh8"] Mar 13 09:26:23 crc kubenswrapper[4841]: E0313 09:26:23.289383 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60193856-8a3f-4ce0-b79d-44e58de19b06" containerName="console" Mar 13 09:26:23 crc kubenswrapper[4841]: I0313 09:26:23.289431 4841 
state_mem.go:107] "Deleted CPUSet assignment" podUID="60193856-8a3f-4ce0-b79d-44e58de19b06" containerName="console" Mar 13 09:26:23 crc kubenswrapper[4841]: I0313 09:26:23.289707 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="60193856-8a3f-4ce0-b79d-44e58de19b06" containerName="console" Mar 13 09:26:23 crc kubenswrapper[4841]: I0313 09:26:23.291437 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dcmh8" Mar 13 09:26:23 crc kubenswrapper[4841]: I0313 09:26:23.311682 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dcmh8"] Mar 13 09:26:23 crc kubenswrapper[4841]: I0313 09:26:23.371861 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48ce3356-a536-4896-91db-35ccc6f131e9-catalog-content\") pod \"redhat-operators-dcmh8\" (UID: \"48ce3356-a536-4896-91db-35ccc6f131e9\") " pod="openshift-marketplace/redhat-operators-dcmh8" Mar 13 09:26:23 crc kubenswrapper[4841]: I0313 09:26:23.371998 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48ce3356-a536-4896-91db-35ccc6f131e9-utilities\") pod \"redhat-operators-dcmh8\" (UID: \"48ce3356-a536-4896-91db-35ccc6f131e9\") " pod="openshift-marketplace/redhat-operators-dcmh8" Mar 13 09:26:23 crc kubenswrapper[4841]: I0313 09:26:23.372119 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hkdf\" (UniqueName: \"kubernetes.io/projected/48ce3356-a536-4896-91db-35ccc6f131e9-kube-api-access-8hkdf\") pod \"redhat-operators-dcmh8\" (UID: \"48ce3356-a536-4896-91db-35ccc6f131e9\") " pod="openshift-marketplace/redhat-operators-dcmh8" Mar 13 09:26:23 crc kubenswrapper[4841]: I0313 09:26:23.474157 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8hkdf\" (UniqueName: \"kubernetes.io/projected/48ce3356-a536-4896-91db-35ccc6f131e9-kube-api-access-8hkdf\") pod \"redhat-operators-dcmh8\" (UID: \"48ce3356-a536-4896-91db-35ccc6f131e9\") " pod="openshift-marketplace/redhat-operators-dcmh8" Mar 13 09:26:23 crc kubenswrapper[4841]: I0313 09:26:23.474354 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48ce3356-a536-4896-91db-35ccc6f131e9-catalog-content\") pod \"redhat-operators-dcmh8\" (UID: \"48ce3356-a536-4896-91db-35ccc6f131e9\") " pod="openshift-marketplace/redhat-operators-dcmh8" Mar 13 09:26:23 crc kubenswrapper[4841]: I0313 09:26:23.474420 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48ce3356-a536-4896-91db-35ccc6f131e9-utilities\") pod \"redhat-operators-dcmh8\" (UID: \"48ce3356-a536-4896-91db-35ccc6f131e9\") " pod="openshift-marketplace/redhat-operators-dcmh8" Mar 13 09:26:23 crc kubenswrapper[4841]: I0313 09:26:23.475479 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48ce3356-a536-4896-91db-35ccc6f131e9-utilities\") pod \"redhat-operators-dcmh8\" (UID: \"48ce3356-a536-4896-91db-35ccc6f131e9\") " pod="openshift-marketplace/redhat-operators-dcmh8" Mar 13 09:26:23 crc kubenswrapper[4841]: I0313 09:26:23.475581 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48ce3356-a536-4896-91db-35ccc6f131e9-catalog-content\") pod \"redhat-operators-dcmh8\" (UID: \"48ce3356-a536-4896-91db-35ccc6f131e9\") " pod="openshift-marketplace/redhat-operators-dcmh8" Mar 13 09:26:23 crc kubenswrapper[4841]: I0313 09:26:23.504451 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hkdf\" 
(UniqueName: \"kubernetes.io/projected/48ce3356-a536-4896-91db-35ccc6f131e9-kube-api-access-8hkdf\") pod \"redhat-operators-dcmh8\" (UID: \"48ce3356-a536-4896-91db-35ccc6f131e9\") " pod="openshift-marketplace/redhat-operators-dcmh8" Mar 13 09:26:23 crc kubenswrapper[4841]: I0313 09:26:23.612897 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dcmh8" Mar 13 09:26:23 crc kubenswrapper[4841]: I0313 09:26:23.904288 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dcmh8"] Mar 13 09:26:23 crc kubenswrapper[4841]: W0313 09:26:23.908652 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48ce3356_a536_4896_91db_35ccc6f131e9.slice/crio-386f2ea13b748d98939436be7c172abdb0290671ba68bf5e3b293d20dd1d15b6 WatchSource:0}: Error finding container 386f2ea13b748d98939436be7c172abdb0290671ba68bf5e3b293d20dd1d15b6: Status 404 returned error can't find the container with id 386f2ea13b748d98939436be7c172abdb0290671ba68bf5e3b293d20dd1d15b6 Mar 13 09:26:23 crc kubenswrapper[4841]: I0313 09:26:23.919007 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dcmh8" event={"ID":"48ce3356-a536-4896-91db-35ccc6f131e9","Type":"ContainerStarted","Data":"386f2ea13b748d98939436be7c172abdb0290671ba68bf5e3b293d20dd1d15b6"} Mar 13 09:26:24 crc kubenswrapper[4841]: I0313 09:26:24.001116 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60193856-8a3f-4ce0-b79d-44e58de19b06" path="/var/lib/kubelet/pods/60193856-8a3f-4ce0-b79d-44e58de19b06/volumes" Mar 13 09:26:24 crc kubenswrapper[4841]: I0313 09:26:24.937546 4841 generic.go:334] "Generic (PLEG): container finished" podID="8b79260a-a276-45fa-abfd-5d471f82142a" containerID="a0bbae7b6caf02e4303b881ea1c8f54f00599b78ed652ef36a9b9c722cd5389a" exitCode=0 Mar 13 09:26:24 crc kubenswrapper[4841]: 
I0313 09:26:24.938065 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s" event={"ID":"8b79260a-a276-45fa-abfd-5d471f82142a","Type":"ContainerDied","Data":"a0bbae7b6caf02e4303b881ea1c8f54f00599b78ed652ef36a9b9c722cd5389a"} Mar 13 09:26:24 crc kubenswrapper[4841]: I0313 09:26:24.945491 4841 generic.go:334] "Generic (PLEG): container finished" podID="48ce3356-a536-4896-91db-35ccc6f131e9" containerID="3421de706ca9b0cd7e0e4db17f195b2e7ccc978765b2a70b14e473ff792788aa" exitCode=0 Mar 13 09:26:24 crc kubenswrapper[4841]: I0313 09:26:24.945584 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dcmh8" event={"ID":"48ce3356-a536-4896-91db-35ccc6f131e9","Type":"ContainerDied","Data":"3421de706ca9b0cd7e0e4db17f195b2e7ccc978765b2a70b14e473ff792788aa"} Mar 13 09:26:25 crc kubenswrapper[4841]: I0313 09:26:25.954671 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dcmh8" event={"ID":"48ce3356-a536-4896-91db-35ccc6f131e9","Type":"ContainerStarted","Data":"6d35824799da04711e82a01921e56955adfde3445c713e7332d17ca48b5dcb08"} Mar 13 09:26:25 crc kubenswrapper[4841]: I0313 09:26:25.957790 4841 generic.go:334] "Generic (PLEG): container finished" podID="8b79260a-a276-45fa-abfd-5d471f82142a" containerID="534b417fd1e88eb9523c675a6e2a8c0aa434e5a84f5b7c0002eb1e4222a85701" exitCode=0 Mar 13 09:26:25 crc kubenswrapper[4841]: I0313 09:26:25.957886 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s" event={"ID":"8b79260a-a276-45fa-abfd-5d471f82142a","Type":"ContainerDied","Data":"534b417fd1e88eb9523c675a6e2a8c0aa434e5a84f5b7c0002eb1e4222a85701"} Mar 13 09:26:26 crc kubenswrapper[4841]: I0313 09:26:26.969323 4841 generic.go:334] "Generic (PLEG): container finished" podID="48ce3356-a536-4896-91db-35ccc6f131e9" 
containerID="6d35824799da04711e82a01921e56955adfde3445c713e7332d17ca48b5dcb08" exitCode=0 Mar 13 09:26:26 crc kubenswrapper[4841]: I0313 09:26:26.969366 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dcmh8" event={"ID":"48ce3356-a536-4896-91db-35ccc6f131e9","Type":"ContainerDied","Data":"6d35824799da04711e82a01921e56955adfde3445c713e7332d17ca48b5dcb08"} Mar 13 09:26:27 crc kubenswrapper[4841]: I0313 09:26:27.271682 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s" Mar 13 09:26:27 crc kubenswrapper[4841]: I0313 09:26:27.329196 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b79260a-a276-45fa-abfd-5d471f82142a-bundle\") pod \"8b79260a-a276-45fa-abfd-5d471f82142a\" (UID: \"8b79260a-a276-45fa-abfd-5d471f82142a\") " Mar 13 09:26:27 crc kubenswrapper[4841]: I0313 09:26:27.329400 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnzmg\" (UniqueName: \"kubernetes.io/projected/8b79260a-a276-45fa-abfd-5d471f82142a-kube-api-access-dnzmg\") pod \"8b79260a-a276-45fa-abfd-5d471f82142a\" (UID: \"8b79260a-a276-45fa-abfd-5d471f82142a\") " Mar 13 09:26:27 crc kubenswrapper[4841]: I0313 09:26:27.329467 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b79260a-a276-45fa-abfd-5d471f82142a-util\") pod \"8b79260a-a276-45fa-abfd-5d471f82142a\" (UID: \"8b79260a-a276-45fa-abfd-5d471f82142a\") " Mar 13 09:26:27 crc kubenswrapper[4841]: I0313 09:26:27.331062 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b79260a-a276-45fa-abfd-5d471f82142a-bundle" (OuterVolumeSpecName: "bundle") pod "8b79260a-a276-45fa-abfd-5d471f82142a" (UID: 
"8b79260a-a276-45fa-abfd-5d471f82142a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:26:27 crc kubenswrapper[4841]: I0313 09:26:27.338513 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b79260a-a276-45fa-abfd-5d471f82142a-kube-api-access-dnzmg" (OuterVolumeSpecName: "kube-api-access-dnzmg") pod "8b79260a-a276-45fa-abfd-5d471f82142a" (UID: "8b79260a-a276-45fa-abfd-5d471f82142a"). InnerVolumeSpecName "kube-api-access-dnzmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:26:27 crc kubenswrapper[4841]: I0313 09:26:27.358931 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b79260a-a276-45fa-abfd-5d471f82142a-util" (OuterVolumeSpecName: "util") pod "8b79260a-a276-45fa-abfd-5d471f82142a" (UID: "8b79260a-a276-45fa-abfd-5d471f82142a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:26:27 crc kubenswrapper[4841]: I0313 09:26:27.430879 4841 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b79260a-a276-45fa-abfd-5d471f82142a-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:26:27 crc kubenswrapper[4841]: I0313 09:26:27.430930 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnzmg\" (UniqueName: \"kubernetes.io/projected/8b79260a-a276-45fa-abfd-5d471f82142a-kube-api-access-dnzmg\") on node \"crc\" DevicePath \"\"" Mar 13 09:26:27 crc kubenswrapper[4841]: I0313 09:26:27.430952 4841 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b79260a-a276-45fa-abfd-5d471f82142a-util\") on node \"crc\" DevicePath \"\"" Mar 13 09:26:27 crc kubenswrapper[4841]: I0313 09:26:27.981621 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s" 
event={"ID":"8b79260a-a276-45fa-abfd-5d471f82142a","Type":"ContainerDied","Data":"f5b66634b7c8ba2b382c34b7a463776e5cfa27bebc48e7539800ae07271c98dc"} Mar 13 09:26:27 crc kubenswrapper[4841]: I0313 09:26:27.982035 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5b66634b7c8ba2b382c34b7a463776e5cfa27bebc48e7539800ae07271c98dc" Mar 13 09:26:27 crc kubenswrapper[4841]: I0313 09:26:27.981654 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s" Mar 13 09:26:27 crc kubenswrapper[4841]: I0313 09:26:27.986903 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dcmh8" event={"ID":"48ce3356-a536-4896-91db-35ccc6f131e9","Type":"ContainerStarted","Data":"53511711589a6daf64efa5e9c9690231f44a74baa67120d769ea510835f7ee11"} Mar 13 09:26:28 crc kubenswrapper[4841]: I0313 09:26:28.021082 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dcmh8" podStartSLOduration=2.440838933 podStartE2EDuration="5.021058489s" podCreationTimestamp="2026-03-13 09:26:23 +0000 UTC" firstStartedPulling="2026-03-13 09:26:24.947202179 +0000 UTC m=+867.677102380" lastFinishedPulling="2026-03-13 09:26:27.527421725 +0000 UTC m=+870.257321936" observedRunningTime="2026-03-13 09:26:28.014612498 +0000 UTC m=+870.744512689" watchObservedRunningTime="2026-03-13 09:26:28.021058489 +0000 UTC m=+870.750958720" Mar 13 09:26:33 crc kubenswrapper[4841]: I0313 09:26:33.290220 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hfkvb"] Mar 13 09:26:33 crc kubenswrapper[4841]: E0313 09:26:33.291943 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b79260a-a276-45fa-abfd-5d471f82142a" containerName="extract" Mar 13 09:26:33 crc kubenswrapper[4841]: I0313 09:26:33.291973 4841 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8b79260a-a276-45fa-abfd-5d471f82142a" containerName="extract" Mar 13 09:26:33 crc kubenswrapper[4841]: E0313 09:26:33.291994 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b79260a-a276-45fa-abfd-5d471f82142a" containerName="pull" Mar 13 09:26:33 crc kubenswrapper[4841]: I0313 09:26:33.292002 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b79260a-a276-45fa-abfd-5d471f82142a" containerName="pull" Mar 13 09:26:33 crc kubenswrapper[4841]: E0313 09:26:33.292035 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b79260a-a276-45fa-abfd-5d471f82142a" containerName="util" Mar 13 09:26:33 crc kubenswrapper[4841]: I0313 09:26:33.292043 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b79260a-a276-45fa-abfd-5d471f82142a" containerName="util" Mar 13 09:26:33 crc kubenswrapper[4841]: I0313 09:26:33.292545 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b79260a-a276-45fa-abfd-5d471f82142a" containerName="extract" Mar 13 09:26:33 crc kubenswrapper[4841]: I0313 09:26:33.294518 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hfkvb" Mar 13 09:26:33 crc kubenswrapper[4841]: I0313 09:26:33.303248 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hfkvb"] Mar 13 09:26:33 crc kubenswrapper[4841]: I0313 09:26:33.417895 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vvq9\" (UniqueName: \"kubernetes.io/projected/5d9cbbd3-aa3e-470f-a172-e746431bd273-kube-api-access-4vvq9\") pod \"certified-operators-hfkvb\" (UID: \"5d9cbbd3-aa3e-470f-a172-e746431bd273\") " pod="openshift-marketplace/certified-operators-hfkvb" Mar 13 09:26:33 crc kubenswrapper[4841]: I0313 09:26:33.418025 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d9cbbd3-aa3e-470f-a172-e746431bd273-catalog-content\") pod \"certified-operators-hfkvb\" (UID: \"5d9cbbd3-aa3e-470f-a172-e746431bd273\") " pod="openshift-marketplace/certified-operators-hfkvb" Mar 13 09:26:33 crc kubenswrapper[4841]: I0313 09:26:33.418064 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d9cbbd3-aa3e-470f-a172-e746431bd273-utilities\") pod \"certified-operators-hfkvb\" (UID: \"5d9cbbd3-aa3e-470f-a172-e746431bd273\") " pod="openshift-marketplace/certified-operators-hfkvb" Mar 13 09:26:33 crc kubenswrapper[4841]: I0313 09:26:33.519744 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d9cbbd3-aa3e-470f-a172-e746431bd273-catalog-content\") pod \"certified-operators-hfkvb\" (UID: \"5d9cbbd3-aa3e-470f-a172-e746431bd273\") " pod="openshift-marketplace/certified-operators-hfkvb" Mar 13 09:26:33 crc kubenswrapper[4841]: I0313 09:26:33.519796 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d9cbbd3-aa3e-470f-a172-e746431bd273-utilities\") pod \"certified-operators-hfkvb\" (UID: \"5d9cbbd3-aa3e-470f-a172-e746431bd273\") " pod="openshift-marketplace/certified-operators-hfkvb" Mar 13 09:26:33 crc kubenswrapper[4841]: I0313 09:26:33.519835 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vvq9\" (UniqueName: \"kubernetes.io/projected/5d9cbbd3-aa3e-470f-a172-e746431bd273-kube-api-access-4vvq9\") pod \"certified-operators-hfkvb\" (UID: \"5d9cbbd3-aa3e-470f-a172-e746431bd273\") " pod="openshift-marketplace/certified-operators-hfkvb" Mar 13 09:26:33 crc kubenswrapper[4841]: I0313 09:26:33.520375 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d9cbbd3-aa3e-470f-a172-e746431bd273-utilities\") pod \"certified-operators-hfkvb\" (UID: \"5d9cbbd3-aa3e-470f-a172-e746431bd273\") " pod="openshift-marketplace/certified-operators-hfkvb" Mar 13 09:26:33 crc kubenswrapper[4841]: I0313 09:26:33.520375 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d9cbbd3-aa3e-470f-a172-e746431bd273-catalog-content\") pod \"certified-operators-hfkvb\" (UID: \"5d9cbbd3-aa3e-470f-a172-e746431bd273\") " pod="openshift-marketplace/certified-operators-hfkvb" Mar 13 09:26:33 crc kubenswrapper[4841]: I0313 09:26:33.537050 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vvq9\" (UniqueName: \"kubernetes.io/projected/5d9cbbd3-aa3e-470f-a172-e746431bd273-kube-api-access-4vvq9\") pod \"certified-operators-hfkvb\" (UID: \"5d9cbbd3-aa3e-470f-a172-e746431bd273\") " pod="openshift-marketplace/certified-operators-hfkvb" Mar 13 09:26:33 crc kubenswrapper[4841]: I0313 09:26:33.613609 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-dcmh8" Mar 13 09:26:33 crc kubenswrapper[4841]: I0313 09:26:33.613662 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dcmh8" Mar 13 09:26:33 crc kubenswrapper[4841]: I0313 09:26:33.614200 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hfkvb" Mar 13 09:26:33 crc kubenswrapper[4841]: I0313 09:26:33.815675 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hfkvb"] Mar 13 09:26:33 crc kubenswrapper[4841]: W0313 09:26:33.822368 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d9cbbd3_aa3e_470f_a172_e746431bd273.slice/crio-d000ad43c42a1563f33c1e2613111021734033c467835374dbb57a312225015e WatchSource:0}: Error finding container d000ad43c42a1563f33c1e2613111021734033c467835374dbb57a312225015e: Status 404 returned error can't find the container with id d000ad43c42a1563f33c1e2613111021734033c467835374dbb57a312225015e Mar 13 09:26:34 crc kubenswrapper[4841]: I0313 09:26:34.019706 4841 generic.go:334] "Generic (PLEG): container finished" podID="5d9cbbd3-aa3e-470f-a172-e746431bd273" containerID="faca21ff51265da71fe2dbff9eb88912e3c24155909f0da1b3d3dd830188b897" exitCode=0 Mar 13 09:26:34 crc kubenswrapper[4841]: I0313 09:26:34.019747 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hfkvb" event={"ID":"5d9cbbd3-aa3e-470f-a172-e746431bd273","Type":"ContainerDied","Data":"faca21ff51265da71fe2dbff9eb88912e3c24155909f0da1b3d3dd830188b897"} Mar 13 09:26:34 crc kubenswrapper[4841]: I0313 09:26:34.019774 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hfkvb" 
event={"ID":"5d9cbbd3-aa3e-470f-a172-e746431bd273","Type":"ContainerStarted","Data":"d000ad43c42a1563f33c1e2613111021734033c467835374dbb57a312225015e"}
Mar 13 09:26:34 crc kubenswrapper[4841]: I0313 09:26:34.654352 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dcmh8" podUID="48ce3356-a536-4896-91db-35ccc6f131e9" containerName="registry-server" probeResult="failure" output=<
Mar 13 09:26:34 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s
Mar 13 09:26:34 crc kubenswrapper[4841]: >
Mar 13 09:26:35 crc kubenswrapper[4841]: I0313 09:26:35.027487 4841 generic.go:334] "Generic (PLEG): container finished" podID="5d9cbbd3-aa3e-470f-a172-e746431bd273" containerID="c3331aeeb250406be51074f65f707d6483d9b81d67397889f04a7c6dcfa4cfaf" exitCode=0
Mar 13 09:26:35 crc kubenswrapper[4841]: I0313 09:26:35.027528 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hfkvb" event={"ID":"5d9cbbd3-aa3e-470f-a172-e746431bd273","Type":"ContainerDied","Data":"c3331aeeb250406be51074f65f707d6483d9b81d67397889f04a7c6dcfa4cfaf"}
Mar 13 09:26:36 crc kubenswrapper[4841]: I0313 09:26:36.042334 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hfkvb" event={"ID":"5d9cbbd3-aa3e-470f-a172-e746431bd273","Type":"ContainerStarted","Data":"bba1fa857cf975adf3a9b339aeb4d694708831c56111dcae6d96332b3ac0a9fb"}
Mar 13 09:26:36 crc kubenswrapper[4841]: I0313 09:26:36.060470 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hfkvb" podStartSLOduration=1.617143752 podStartE2EDuration="3.060448962s" podCreationTimestamp="2026-03-13 09:26:33 +0000 UTC" firstStartedPulling="2026-03-13 09:26:34.020944135 +0000 UTC m=+876.750844326" lastFinishedPulling="2026-03-13 09:26:35.464249335 +0000 UTC m=+878.194149536" observedRunningTime="2026-03-13 09:26:36.05943319 +0000 UTC m=+878.789333391" watchObservedRunningTime="2026-03-13 09:26:36.060448962 +0000 UTC m=+878.790349173"
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.128026 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d6c4d5946-gtbzk"]
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.128826 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6d6c4d5946-gtbzk"
Mar 13 09:26:37 crc kubenswrapper[4841]: W0313 09:26:37.131129 4841 reflector.go:561] object-"metallb-system"/"manager-account-dockercfg-d77cm": failed to list *v1.Secret: secrets "manager-account-dockercfg-d77cm" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object
Mar 13 09:26:37 crc kubenswrapper[4841]: E0313 09:26:37.131153 4841 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"manager-account-dockercfg-d77cm\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"manager-account-dockercfg-d77cm\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 13 09:26:37 crc kubenswrapper[4841]: W0313 09:26:37.131192 4841 reflector.go:561] object-"metallb-system"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object
Mar 13 09:26:37 crc kubenswrapper[4841]: E0313 09:26:37.131237 4841 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 13 09:26:37 crc kubenswrapper[4841]: W0313 09:26:37.131453 4841 reflector.go:561] object-"metallb-system"/"metallb-operator-webhook-server-cert": failed to list *v1.Secret: secrets "metallb-operator-webhook-server-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object
Mar 13 09:26:37 crc kubenswrapper[4841]: E0313 09:26:37.131475 4841 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-operator-webhook-server-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"metallb-operator-webhook-server-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 13 09:26:37 crc kubenswrapper[4841]: W0313 09:26:37.132007 4841 reflector.go:561] object-"metallb-system"/"metallb-operator-controller-manager-service-cert": failed to list *v1.Secret: secrets "metallb-operator-controller-manager-service-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object
Mar 13 09:26:37 crc kubenswrapper[4841]: E0313 09:26:37.132027 4841 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-operator-controller-manager-service-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"metallb-operator-controller-manager-service-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 13 09:26:37 crc kubenswrapper[4841]: W0313 09:26:37.135666 4841 reflector.go:561] object-"metallb-system"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object
Mar 13 09:26:37 crc kubenswrapper[4841]: E0313 09:26:37.135690 4841 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.153904 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d6c4d5946-gtbzk"]
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.170398 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm248\" (UniqueName: \"kubernetes.io/projected/683811db-740f-4604-b93b-c8134590a46a-kube-api-access-xm248\") pod \"metallb-operator-controller-manager-6d6c4d5946-gtbzk\" (UID: \"683811db-740f-4604-b93b-c8134590a46a\") " pod="metallb-system/metallb-operator-controller-manager-6d6c4d5946-gtbzk"
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.170488 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/683811db-740f-4604-b93b-c8134590a46a-apiservice-cert\") pod \"metallb-operator-controller-manager-6d6c4d5946-gtbzk\" (UID: \"683811db-740f-4604-b93b-c8134590a46a\") " pod="metallb-system/metallb-operator-controller-manager-6d6c4d5946-gtbzk"
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.170560 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/683811db-740f-4604-b93b-c8134590a46a-webhook-cert\") pod \"metallb-operator-controller-manager-6d6c4d5946-gtbzk\" (UID: \"683811db-740f-4604-b93b-c8134590a46a\") " pod="metallb-system/metallb-operator-controller-manager-6d6c4d5946-gtbzk"
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.282514 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/683811db-740f-4604-b93b-c8134590a46a-apiservice-cert\") pod \"metallb-operator-controller-manager-6d6c4d5946-gtbzk\" (UID: \"683811db-740f-4604-b93b-c8134590a46a\") " pod="metallb-system/metallb-operator-controller-manager-6d6c4d5946-gtbzk"
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.282734 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/683811db-740f-4604-b93b-c8134590a46a-webhook-cert\") pod \"metallb-operator-controller-manager-6d6c4d5946-gtbzk\" (UID: \"683811db-740f-4604-b93b-c8134590a46a\") " pod="metallb-system/metallb-operator-controller-manager-6d6c4d5946-gtbzk"
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.286529 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm248\" (UniqueName: \"kubernetes.io/projected/683811db-740f-4604-b93b-c8134590a46a-kube-api-access-xm248\") pod \"metallb-operator-controller-manager-6d6c4d5946-gtbzk\" (UID: \"683811db-740f-4604-b93b-c8134590a46a\") " pod="metallb-system/metallb-operator-controller-manager-6d6c4d5946-gtbzk"
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.559467 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-744cf67d4f-ldddk"]
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.560157 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-744cf67d4f-ldddk"
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.563952 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-t25cv"
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.564168 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.564510 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.581431 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-744cf67d4f-ldddk"]
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.590015 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7znd\" (UniqueName: \"kubernetes.io/projected/2735aa21-2a11-4909-988a-f2add6dae771-kube-api-access-m7znd\") pod \"metallb-operator-webhook-server-744cf67d4f-ldddk\" (UID: \"2735aa21-2a11-4909-988a-f2add6dae771\") " pod="metallb-system/metallb-operator-webhook-server-744cf67d4f-ldddk"
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.590083 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2735aa21-2a11-4909-988a-f2add6dae771-apiservice-cert\") pod \"metallb-operator-webhook-server-744cf67d4f-ldddk\" (UID: \"2735aa21-2a11-4909-988a-f2add6dae771\") " pod="metallb-system/metallb-operator-webhook-server-744cf67d4f-ldddk"
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.590102 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2735aa21-2a11-4909-988a-f2add6dae771-webhook-cert\") pod \"metallb-operator-webhook-server-744cf67d4f-ldddk\" (UID: \"2735aa21-2a11-4909-988a-f2add6dae771\") " pod="metallb-system/metallb-operator-webhook-server-744cf67d4f-ldddk"
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.692923 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7znd\" (UniqueName: \"kubernetes.io/projected/2735aa21-2a11-4909-988a-f2add6dae771-kube-api-access-m7znd\") pod \"metallb-operator-webhook-server-744cf67d4f-ldddk\" (UID: \"2735aa21-2a11-4909-988a-f2add6dae771\") " pod="metallb-system/metallb-operator-webhook-server-744cf67d4f-ldddk"
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.692987 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2735aa21-2a11-4909-988a-f2add6dae771-apiservice-cert\") pod \"metallb-operator-webhook-server-744cf67d4f-ldddk\" (UID: \"2735aa21-2a11-4909-988a-f2add6dae771\") " pod="metallb-system/metallb-operator-webhook-server-744cf67d4f-ldddk"
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.693006 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2735aa21-2a11-4909-988a-f2add6dae771-webhook-cert\") pod \"metallb-operator-webhook-server-744cf67d4f-ldddk\" (UID: \"2735aa21-2a11-4909-988a-f2add6dae771\") " pod="metallb-system/metallb-operator-webhook-server-744cf67d4f-ldddk"
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.698331 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2735aa21-2a11-4909-988a-f2add6dae771-apiservice-cert\") pod \"metallb-operator-webhook-server-744cf67d4f-ldddk\" (UID: \"2735aa21-2a11-4909-988a-f2add6dae771\") " pod="metallb-system/metallb-operator-webhook-server-744cf67d4f-ldddk"
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.700914 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2735aa21-2a11-4909-988a-f2add6dae771-webhook-cert\") pod \"metallb-operator-webhook-server-744cf67d4f-ldddk\" (UID: \"2735aa21-2a11-4909-988a-f2add6dae771\") " pod="metallb-system/metallb-operator-webhook-server-744cf67d4f-ldddk"
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.944106 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Mar 13 09:26:37 crc kubenswrapper[4841]: I0313 09:26:37.999529 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Mar 13 09:26:38 crc kubenswrapper[4841]: E0313 09:26:38.283351 4841 secret.go:188] Couldn't get secret metallb-system/metallb-operator-controller-manager-service-cert: failed to sync secret cache: timed out waiting for the condition
Mar 13 09:26:38 crc kubenswrapper[4841]: E0313 09:26:38.283442 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/683811db-740f-4604-b93b-c8134590a46a-apiservice-cert podName:683811db-740f-4604-b93b-c8134590a46a nodeName:}" failed. No retries permitted until 2026-03-13 09:26:38.783421303 +0000 UTC m=+881.513321514 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/683811db-740f-4604-b93b-c8134590a46a-apiservice-cert") pod "metallb-operator-controller-manager-6d6c4d5946-gtbzk" (UID: "683811db-740f-4604-b93b-c8134590a46a") : failed to sync secret cache: timed out waiting for the condition
Mar 13 09:26:38 crc kubenswrapper[4841]: E0313 09:26:38.285606 4841 secret.go:188] Couldn't get secret metallb-system/metallb-operator-controller-manager-service-cert: failed to sync secret cache: timed out waiting for the condition
Mar 13 09:26:38 crc kubenswrapper[4841]: E0313 09:26:38.285732 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/683811db-740f-4604-b93b-c8134590a46a-webhook-cert podName:683811db-740f-4604-b93b-c8134590a46a nodeName:}" failed. No retries permitted until 2026-03-13 09:26:38.785704594 +0000 UTC m=+881.515604825 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/683811db-740f-4604-b93b-c8134590a46a-webhook-cert") pod "metallb-operator-controller-manager-6d6c4d5946-gtbzk" (UID: "683811db-740f-4604-b93b-c8134590a46a") : failed to sync secret cache: timed out waiting for the condition
Mar 13 09:26:38 crc kubenswrapper[4841]: I0313 09:26:38.321735 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-d77cm"
Mar 13 09:26:38 crc kubenswrapper[4841]: I0313 09:26:38.322295 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Mar 13 09:26:38 crc kubenswrapper[4841]: I0313 09:26:38.335929 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm248\" (UniqueName: \"kubernetes.io/projected/683811db-740f-4604-b93b-c8134590a46a-kube-api-access-xm248\") pod \"metallb-operator-controller-manager-6d6c4d5946-gtbzk\" (UID: \"683811db-740f-4604-b93b-c8134590a46a\") " pod="metallb-system/metallb-operator-controller-manager-6d6c4d5946-gtbzk"
Mar 13 09:26:38 crc kubenswrapper[4841]: I0313 09:26:38.344183 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7znd\" (UniqueName: \"kubernetes.io/projected/2735aa21-2a11-4909-988a-f2add6dae771-kube-api-access-m7znd\") pod \"metallb-operator-webhook-server-744cf67d4f-ldddk\" (UID: \"2735aa21-2a11-4909-988a-f2add6dae771\") " pod="metallb-system/metallb-operator-webhook-server-744cf67d4f-ldddk"
Mar 13 09:26:38 crc kubenswrapper[4841]: I0313 09:26:38.473538 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-744cf67d4f-ldddk"
Mar 13 09:26:38 crc kubenswrapper[4841]: I0313 09:26:38.609637 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Mar 13 09:26:38 crc kubenswrapper[4841]: I0313 09:26:38.711279 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-744cf67d4f-ldddk"]
Mar 13 09:26:38 crc kubenswrapper[4841]: I0313 09:26:38.808047 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/683811db-740f-4604-b93b-c8134590a46a-webhook-cert\") pod \"metallb-operator-controller-manager-6d6c4d5946-gtbzk\" (UID: \"683811db-740f-4604-b93b-c8134590a46a\") " pod="metallb-system/metallb-operator-controller-manager-6d6c4d5946-gtbzk"
Mar 13 09:26:38 crc kubenswrapper[4841]: I0313 09:26:38.808148 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/683811db-740f-4604-b93b-c8134590a46a-apiservice-cert\") pod \"metallb-operator-controller-manager-6d6c4d5946-gtbzk\" (UID: \"683811db-740f-4604-b93b-c8134590a46a\") " pod="metallb-system/metallb-operator-controller-manager-6d6c4d5946-gtbzk"
Mar 13 09:26:38 crc kubenswrapper[4841]: I0313 09:26:38.813907 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/683811db-740f-4604-b93b-c8134590a46a-apiservice-cert\") pod \"metallb-operator-controller-manager-6d6c4d5946-gtbzk\" (UID: \"683811db-740f-4604-b93b-c8134590a46a\") " pod="metallb-system/metallb-operator-controller-manager-6d6c4d5946-gtbzk"
Mar 13 09:26:38 crc kubenswrapper[4841]: I0313 09:26:38.813956 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/683811db-740f-4604-b93b-c8134590a46a-webhook-cert\") pod \"metallb-operator-controller-manager-6d6c4d5946-gtbzk\" (UID: \"683811db-740f-4604-b93b-c8134590a46a\") " pod="metallb-system/metallb-operator-controller-manager-6d6c4d5946-gtbzk"
Mar 13 09:26:38 crc kubenswrapper[4841]: I0313 09:26:38.942734 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6d6c4d5946-gtbzk"
Mar 13 09:26:39 crc kubenswrapper[4841]: I0313 09:26:39.057788 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-744cf67d4f-ldddk" event={"ID":"2735aa21-2a11-4909-988a-f2add6dae771","Type":"ContainerStarted","Data":"d6f960dfba34c2daa3f16871d9382b456914b8eee3c1d2978353caa164aa367c"}
Mar 13 09:26:39 crc kubenswrapper[4841]: I0313 09:26:39.413012 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d6c4d5946-gtbzk"]
Mar 13 09:26:40 crc kubenswrapper[4841]: I0313 09:26:40.064121 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6d6c4d5946-gtbzk" event={"ID":"683811db-740f-4604-b93b-c8134590a46a","Type":"ContainerStarted","Data":"ce1798fcbc3d730dfba051040505f85ee1b5e9d50c64a4648440cc7da33c66bc"}
Mar 13 09:26:43 crc kubenswrapper[4841]: I0313 09:26:43.615245 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hfkvb"
Mar 13 09:26:43 crc kubenswrapper[4841]: I0313 09:26:43.615630 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hfkvb"
Mar 13 09:26:43 crc kubenswrapper[4841]: I0313 09:26:43.654115 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dcmh8"
Mar 13 09:26:43 crc kubenswrapper[4841]: I0313 09:26:43.676919 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hfkvb"
Mar 13 09:26:43 crc kubenswrapper[4841]: I0313 09:26:43.697538 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dcmh8"
Mar 13 09:26:44 crc kubenswrapper[4841]: I0313 09:26:44.108375 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-744cf67d4f-ldddk" event={"ID":"2735aa21-2a11-4909-988a-f2add6dae771","Type":"ContainerStarted","Data":"1a838d5015403fc526d855c9c7ecdaca244a601ebc9af92419b99bbb79de1c7b"}
Mar 13 09:26:44 crc kubenswrapper[4841]: I0313 09:26:44.108456 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-744cf67d4f-ldddk"
Mar 13 09:26:44 crc kubenswrapper[4841]: I0313 09:26:44.110794 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6d6c4d5946-gtbzk" event={"ID":"683811db-740f-4604-b93b-c8134590a46a","Type":"ContainerStarted","Data":"b85fecd4f54e3bec53d9ec7d5cbf945d5b2cd623d685ebb4c81fd36b5803a24a"}
Mar 13 09:26:44 crc kubenswrapper[4841]: I0313 09:26:44.130026 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-744cf67d4f-ldddk" podStartSLOduration=1.946034809 podStartE2EDuration="7.130008948s" podCreationTimestamp="2026-03-13 09:26:37 +0000 UTC" firstStartedPulling="2026-03-13 09:26:38.725475714 +0000 UTC m=+881.455375915" lastFinishedPulling="2026-03-13 09:26:43.909449863 +0000 UTC m=+886.639350054" observedRunningTime="2026-03-13 09:26:44.128053557 +0000 UTC m=+886.857953748" watchObservedRunningTime="2026-03-13 09:26:44.130008948 +0000 UTC m=+886.859909139"
Mar 13 09:26:44 crc kubenswrapper[4841]: I0313 09:26:44.147040 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6d6c4d5946-gtbzk" podStartSLOduration=2.660036473 podStartE2EDuration="7.147023911s" podCreationTimestamp="2026-03-13 09:26:37 +0000 UTC" firstStartedPulling="2026-03-13 09:26:39.418155851 +0000 UTC m=+882.148056052" lastFinishedPulling="2026-03-13 09:26:43.905143309 +0000 UTC m=+886.635043490" observedRunningTime="2026-03-13 09:26:44.145234646 +0000 UTC m=+886.875134837" watchObservedRunningTime="2026-03-13 09:26:44.147023911 +0000 UTC m=+886.876924102"
Mar 13 09:26:44 crc kubenswrapper[4841]: I0313 09:26:44.154738 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hfkvb"
Mar 13 09:26:44 crc kubenswrapper[4841]: I0313 09:26:44.477293 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t2mfz"]
Mar 13 09:26:44 crc kubenswrapper[4841]: I0313 09:26:44.478456 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t2mfz"
Mar 13 09:26:44 crc kubenswrapper[4841]: I0313 09:26:44.490288 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t2mfz"]
Mar 13 09:26:44 crc kubenswrapper[4841]: I0313 09:26:44.602084 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w27ml\" (UniqueName: \"kubernetes.io/projected/f7571cae-3083-455f-869a-f93947c7fb2c-kube-api-access-w27ml\") pod \"redhat-marketplace-t2mfz\" (UID: \"f7571cae-3083-455f-869a-f93947c7fb2c\") " pod="openshift-marketplace/redhat-marketplace-t2mfz"
Mar 13 09:26:44 crc kubenswrapper[4841]: I0313 09:26:44.602158 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7571cae-3083-455f-869a-f93947c7fb2c-utilities\") pod \"redhat-marketplace-t2mfz\" (UID: \"f7571cae-3083-455f-869a-f93947c7fb2c\") " pod="openshift-marketplace/redhat-marketplace-t2mfz"
Mar 13 09:26:44 crc kubenswrapper[4841]: I0313 09:26:44.602197 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7571cae-3083-455f-869a-f93947c7fb2c-catalog-content\") pod \"redhat-marketplace-t2mfz\" (UID: \"f7571cae-3083-455f-869a-f93947c7fb2c\") " pod="openshift-marketplace/redhat-marketplace-t2mfz"
Mar 13 09:26:44 crc kubenswrapper[4841]: I0313 09:26:44.704021 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w27ml\" (UniqueName: \"kubernetes.io/projected/f7571cae-3083-455f-869a-f93947c7fb2c-kube-api-access-w27ml\") pod \"redhat-marketplace-t2mfz\" (UID: \"f7571cae-3083-455f-869a-f93947c7fb2c\") " pod="openshift-marketplace/redhat-marketplace-t2mfz"
Mar 13 09:26:44 crc kubenswrapper[4841]: I0313 09:26:44.704115 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7571cae-3083-455f-869a-f93947c7fb2c-utilities\") pod \"redhat-marketplace-t2mfz\" (UID: \"f7571cae-3083-455f-869a-f93947c7fb2c\") " pod="openshift-marketplace/redhat-marketplace-t2mfz"
Mar 13 09:26:44 crc kubenswrapper[4841]: I0313 09:26:44.704160 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7571cae-3083-455f-869a-f93947c7fb2c-catalog-content\") pod \"redhat-marketplace-t2mfz\" (UID: \"f7571cae-3083-455f-869a-f93947c7fb2c\") " pod="openshift-marketplace/redhat-marketplace-t2mfz"
Mar 13 09:26:44 crc kubenswrapper[4841]: I0313 09:26:44.704674 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7571cae-3083-455f-869a-f93947c7fb2c-catalog-content\") pod \"redhat-marketplace-t2mfz\" (UID: \"f7571cae-3083-455f-869a-f93947c7fb2c\") " pod="openshift-marketplace/redhat-marketplace-t2mfz"
Mar 13 09:26:44 crc kubenswrapper[4841]: I0313 09:26:44.704737 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7571cae-3083-455f-869a-f93947c7fb2c-utilities\") pod \"redhat-marketplace-t2mfz\" (UID: \"f7571cae-3083-455f-869a-f93947c7fb2c\") " pod="openshift-marketplace/redhat-marketplace-t2mfz"
Mar 13 09:26:44 crc kubenswrapper[4841]: I0313 09:26:44.723512 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w27ml\" (UniqueName: \"kubernetes.io/projected/f7571cae-3083-455f-869a-f93947c7fb2c-kube-api-access-w27ml\") pod \"redhat-marketplace-t2mfz\" (UID: \"f7571cae-3083-455f-869a-f93947c7fb2c\") " pod="openshift-marketplace/redhat-marketplace-t2mfz"
Mar 13 09:26:44 crc kubenswrapper[4841]: I0313 09:26:44.805757 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t2mfz"
Mar 13 09:26:45 crc kubenswrapper[4841]: I0313 09:26:45.116432 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6d6c4d5946-gtbzk"
Mar 13 09:26:45 crc kubenswrapper[4841]: I0313 09:26:45.227791 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t2mfz"]
Mar 13 09:26:46 crc kubenswrapper[4841]: I0313 09:26:46.123351 4841 generic.go:334] "Generic (PLEG): container finished" podID="f7571cae-3083-455f-869a-f93947c7fb2c" containerID="69bdc91950a0e5084c2ebf2f0aa1c0510b203a99022547551b46fa8bd179c8cf" exitCode=0
Mar 13 09:26:46 crc kubenswrapper[4841]: I0313 09:26:46.123485 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t2mfz" event={"ID":"f7571cae-3083-455f-869a-f93947c7fb2c","Type":"ContainerDied","Data":"69bdc91950a0e5084c2ebf2f0aa1c0510b203a99022547551b46fa8bd179c8cf"}
Mar 13 09:26:46 crc kubenswrapper[4841]: I0313 09:26:46.123882 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t2mfz" event={"ID":"f7571cae-3083-455f-869a-f93947c7fb2c","Type":"ContainerStarted","Data":"ba18081f2121bdb5b526c75fa77ae1e2256ce93cde80eef99102b4a4a6c6c977"}
Mar 13 09:26:47 crc kubenswrapper[4841]: I0313 09:26:47.131680 4841 generic.go:334] "Generic (PLEG): container finished" podID="f7571cae-3083-455f-869a-f93947c7fb2c" containerID="b8ca1d383ec30142318ba8c6b7e70cf3c4d9d39496f99bf22618a4e48cbfa6eb" exitCode=0
Mar 13 09:26:47 crc kubenswrapper[4841]: I0313 09:26:47.131835 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t2mfz" event={"ID":"f7571cae-3083-455f-869a-f93947c7fb2c","Type":"ContainerDied","Data":"b8ca1d383ec30142318ba8c6b7e70cf3c4d9d39496f99bf22618a4e48cbfa6eb"}
Mar 13 09:26:47 crc kubenswrapper[4841]: I0313 09:26:47.277925 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dcmh8"]
Mar 13 09:26:47 crc kubenswrapper[4841]: I0313 09:26:47.278260 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dcmh8" podUID="48ce3356-a536-4896-91db-35ccc6f131e9" containerName="registry-server" containerID="cri-o://53511711589a6daf64efa5e9c9690231f44a74baa67120d769ea510835f7ee11" gracePeriod=2
Mar 13 09:26:47 crc kubenswrapper[4841]: I0313 09:26:47.693253 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dcmh8"
Mar 13 09:26:47 crc kubenswrapper[4841]: I0313 09:26:47.844942 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48ce3356-a536-4896-91db-35ccc6f131e9-catalog-content\") pod \"48ce3356-a536-4896-91db-35ccc6f131e9\" (UID: \"48ce3356-a536-4896-91db-35ccc6f131e9\") "
Mar 13 09:26:47 crc kubenswrapper[4841]: I0313 09:26:47.846483 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48ce3356-a536-4896-91db-35ccc6f131e9-utilities\") pod \"48ce3356-a536-4896-91db-35ccc6f131e9\" (UID: \"48ce3356-a536-4896-91db-35ccc6f131e9\") "
Mar 13 09:26:47 crc kubenswrapper[4841]: I0313 09:26:47.846650 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hkdf\" (UniqueName: \"kubernetes.io/projected/48ce3356-a536-4896-91db-35ccc6f131e9-kube-api-access-8hkdf\") pod \"48ce3356-a536-4896-91db-35ccc6f131e9\" (UID: \"48ce3356-a536-4896-91db-35ccc6f131e9\") "
Mar 13 09:26:47 crc kubenswrapper[4841]: I0313 09:26:47.847046 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48ce3356-a536-4896-91db-35ccc6f131e9-utilities" (OuterVolumeSpecName: "utilities") pod "48ce3356-a536-4896-91db-35ccc6f131e9" (UID: "48ce3356-a536-4896-91db-35ccc6f131e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 09:26:47 crc kubenswrapper[4841]: I0313 09:26:47.847190 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48ce3356-a536-4896-91db-35ccc6f131e9-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 09:26:47 crc kubenswrapper[4841]: I0313 09:26:47.858323 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48ce3356-a536-4896-91db-35ccc6f131e9-kube-api-access-8hkdf" (OuterVolumeSpecName: "kube-api-access-8hkdf") pod "48ce3356-a536-4896-91db-35ccc6f131e9" (UID: "48ce3356-a536-4896-91db-35ccc6f131e9"). InnerVolumeSpecName "kube-api-access-8hkdf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 09:26:47 crc kubenswrapper[4841]: I0313 09:26:47.948508 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hkdf\" (UniqueName: \"kubernetes.io/projected/48ce3356-a536-4896-91db-35ccc6f131e9-kube-api-access-8hkdf\") on node \"crc\" DevicePath \"\""
Mar 13 09:26:47 crc kubenswrapper[4841]: I0313 09:26:47.952126 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48ce3356-a536-4896-91db-35ccc6f131e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48ce3356-a536-4896-91db-35ccc6f131e9" (UID: "48ce3356-a536-4896-91db-35ccc6f131e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.050238 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48ce3356-a536-4896-91db-35ccc6f131e9-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.069684 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hfkvb"]
Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.069940 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hfkvb" podUID="5d9cbbd3-aa3e-470f-a172-e746431bd273" containerName="registry-server" containerID="cri-o://bba1fa857cf975adf3a9b339aeb4d694708831c56111dcae6d96332b3ac0a9fb" gracePeriod=2
Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.139426 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t2mfz" event={"ID":"f7571cae-3083-455f-869a-f93947c7fb2c","Type":"ContainerStarted","Data":"a54a2c1acc61b139a5609b812062a68a65ca2006cb387ca9702c6d1c9568f9b8"}
Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.142099 4841 generic.go:334] "Generic (PLEG): container finished" podID="48ce3356-a536-4896-91db-35ccc6f131e9" containerID="53511711589a6daf64efa5e9c9690231f44a74baa67120d769ea510835f7ee11" exitCode=0
Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.142134 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dcmh8" event={"ID":"48ce3356-a536-4896-91db-35ccc6f131e9","Type":"ContainerDied","Data":"53511711589a6daf64efa5e9c9690231f44a74baa67120d769ea510835f7ee11"}
Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.142154 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dcmh8" event={"ID":"48ce3356-a536-4896-91db-35ccc6f131e9","Type":"ContainerDied","Data":"386f2ea13b748d98939436be7c172abdb0290671ba68bf5e3b293d20dd1d15b6"}
Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.142183 4841 scope.go:117] "RemoveContainer" containerID="53511711589a6daf64efa5e9c9690231f44a74baa67120d769ea510835f7ee11"
Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.142214 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dcmh8"
Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.159654 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t2mfz" podStartSLOduration=2.749525836 podStartE2EDuration="4.159636316s" podCreationTimestamp="2026-03-13 09:26:44 +0000 UTC" firstStartedPulling="2026-03-13 09:26:46.124738704 +0000 UTC m=+888.854638935" lastFinishedPulling="2026-03-13 09:26:47.534849224 +0000 UTC m=+890.264749415" observedRunningTime="2026-03-13 09:26:48.156966633 +0000 UTC m=+890.886866864" watchObservedRunningTime="2026-03-13 09:26:48.159636316 +0000 UTC m=+890.889536507"
Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.200326 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dcmh8"]
Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.207232 4841 scope.go:117] "RemoveContainer" containerID="6d35824799da04711e82a01921e56955adfde3445c713e7332d17ca48b5dcb08"
Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.208279 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dcmh8"]
Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.247197 4841 scope.go:117] "RemoveContainer" containerID="3421de706ca9b0cd7e0e4db17f195b2e7ccc978765b2a70b14e473ff792788aa"
Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.267043 4841 scope.go:117] "RemoveContainer" containerID="53511711589a6daf64efa5e9c9690231f44a74baa67120d769ea510835f7ee11"
Mar 13 09:26:48 crc kubenswrapper[4841]: E0313 09:26:48.267513 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53511711589a6daf64efa5e9c9690231f44a74baa67120d769ea510835f7ee11\": container with ID starting with 53511711589a6daf64efa5e9c9690231f44a74baa67120d769ea510835f7ee11 not found: ID does not exist" containerID="53511711589a6daf64efa5e9c9690231f44a74baa67120d769ea510835f7ee11"
Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.267553 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53511711589a6daf64efa5e9c9690231f44a74baa67120d769ea510835f7ee11"} err="failed to get container status \"53511711589a6daf64efa5e9c9690231f44a74baa67120d769ea510835f7ee11\": rpc error: code = NotFound desc = could not find container \"53511711589a6daf64efa5e9c9690231f44a74baa67120d769ea510835f7ee11\": container with ID starting with 53511711589a6daf64efa5e9c9690231f44a74baa67120d769ea510835f7ee11 not found: ID does not exist"
Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.267580 4841 scope.go:117] "RemoveContainer" containerID="6d35824799da04711e82a01921e56955adfde3445c713e7332d17ca48b5dcb08"
Mar 13 09:26:48 crc kubenswrapper[4841]: E0313 09:26:48.267824 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d35824799da04711e82a01921e56955adfde3445c713e7332d17ca48b5dcb08\": container with ID starting with 6d35824799da04711e82a01921e56955adfde3445c713e7332d17ca48b5dcb08 not found: ID does not exist" containerID="6d35824799da04711e82a01921e56955adfde3445c713e7332d17ca48b5dcb08"
Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.267844 4841 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"6d35824799da04711e82a01921e56955adfde3445c713e7332d17ca48b5dcb08"} err="failed to get container status \"6d35824799da04711e82a01921e56955adfde3445c713e7332d17ca48b5dcb08\": rpc error: code = NotFound desc = could not find container \"6d35824799da04711e82a01921e56955adfde3445c713e7332d17ca48b5dcb08\": container with ID starting with 6d35824799da04711e82a01921e56955adfde3445c713e7332d17ca48b5dcb08 not found: ID does not exist" Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.267857 4841 scope.go:117] "RemoveContainer" containerID="3421de706ca9b0cd7e0e4db17f195b2e7ccc978765b2a70b14e473ff792788aa" Mar 13 09:26:48 crc kubenswrapper[4841]: E0313 09:26:48.268053 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3421de706ca9b0cd7e0e4db17f195b2e7ccc978765b2a70b14e473ff792788aa\": container with ID starting with 3421de706ca9b0cd7e0e4db17f195b2e7ccc978765b2a70b14e473ff792788aa not found: ID does not exist" containerID="3421de706ca9b0cd7e0e4db17f195b2e7ccc978765b2a70b14e473ff792788aa" Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.268085 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3421de706ca9b0cd7e0e4db17f195b2e7ccc978765b2a70b14e473ff792788aa"} err="failed to get container status \"3421de706ca9b0cd7e0e4db17f195b2e7ccc978765b2a70b14e473ff792788aa\": rpc error: code = NotFound desc = could not find container \"3421de706ca9b0cd7e0e4db17f195b2e7ccc978765b2a70b14e473ff792788aa\": container with ID starting with 3421de706ca9b0cd7e0e4db17f195b2e7ccc978765b2a70b14e473ff792788aa not found: ID does not exist" Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.504019 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hfkvb" Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.656843 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d9cbbd3-aa3e-470f-a172-e746431bd273-catalog-content\") pod \"5d9cbbd3-aa3e-470f-a172-e746431bd273\" (UID: \"5d9cbbd3-aa3e-470f-a172-e746431bd273\") " Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.656888 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d9cbbd3-aa3e-470f-a172-e746431bd273-utilities\") pod \"5d9cbbd3-aa3e-470f-a172-e746431bd273\" (UID: \"5d9cbbd3-aa3e-470f-a172-e746431bd273\") " Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.656966 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vvq9\" (UniqueName: \"kubernetes.io/projected/5d9cbbd3-aa3e-470f-a172-e746431bd273-kube-api-access-4vvq9\") pod \"5d9cbbd3-aa3e-470f-a172-e746431bd273\" (UID: \"5d9cbbd3-aa3e-470f-a172-e746431bd273\") " Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.658481 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d9cbbd3-aa3e-470f-a172-e746431bd273-utilities" (OuterVolumeSpecName: "utilities") pod "5d9cbbd3-aa3e-470f-a172-e746431bd273" (UID: "5d9cbbd3-aa3e-470f-a172-e746431bd273"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.661678 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d9cbbd3-aa3e-470f-a172-e746431bd273-kube-api-access-4vvq9" (OuterVolumeSpecName: "kube-api-access-4vvq9") pod "5d9cbbd3-aa3e-470f-a172-e746431bd273" (UID: "5d9cbbd3-aa3e-470f-a172-e746431bd273"). InnerVolumeSpecName "kube-api-access-4vvq9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.708419 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d9cbbd3-aa3e-470f-a172-e746431bd273-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d9cbbd3-aa3e-470f-a172-e746431bd273" (UID: "5d9cbbd3-aa3e-470f-a172-e746431bd273"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.757879 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d9cbbd3-aa3e-470f-a172-e746431bd273-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.757914 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d9cbbd3-aa3e-470f-a172-e746431bd273-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:26:48 crc kubenswrapper[4841]: I0313 09:26:48.757924 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vvq9\" (UniqueName: \"kubernetes.io/projected/5d9cbbd3-aa3e-470f-a172-e746431bd273-kube-api-access-4vvq9\") on node \"crc\" DevicePath \"\"" Mar 13 09:26:49 crc kubenswrapper[4841]: I0313 09:26:49.152465 4841 generic.go:334] "Generic (PLEG): container finished" podID="5d9cbbd3-aa3e-470f-a172-e746431bd273" containerID="bba1fa857cf975adf3a9b339aeb4d694708831c56111dcae6d96332b3ac0a9fb" exitCode=0 Mar 13 09:26:49 crc kubenswrapper[4841]: I0313 09:26:49.152548 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hfkvb" Mar 13 09:26:49 crc kubenswrapper[4841]: I0313 09:26:49.152544 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hfkvb" event={"ID":"5d9cbbd3-aa3e-470f-a172-e746431bd273","Type":"ContainerDied","Data":"bba1fa857cf975adf3a9b339aeb4d694708831c56111dcae6d96332b3ac0a9fb"} Mar 13 09:26:49 crc kubenswrapper[4841]: I0313 09:26:49.152950 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hfkvb" event={"ID":"5d9cbbd3-aa3e-470f-a172-e746431bd273","Type":"ContainerDied","Data":"d000ad43c42a1563f33c1e2613111021734033c467835374dbb57a312225015e"} Mar 13 09:26:49 crc kubenswrapper[4841]: I0313 09:26:49.152993 4841 scope.go:117] "RemoveContainer" containerID="bba1fa857cf975adf3a9b339aeb4d694708831c56111dcae6d96332b3ac0a9fb" Mar 13 09:26:49 crc kubenswrapper[4841]: I0313 09:26:49.172698 4841 scope.go:117] "RemoveContainer" containerID="c3331aeeb250406be51074f65f707d6483d9b81d67397889f04a7c6dcfa4cfaf" Mar 13 09:26:49 crc kubenswrapper[4841]: I0313 09:26:49.198442 4841 scope.go:117] "RemoveContainer" containerID="faca21ff51265da71fe2dbff9eb88912e3c24155909f0da1b3d3dd830188b897" Mar 13 09:26:49 crc kubenswrapper[4841]: I0313 09:26:49.205281 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hfkvb"] Mar 13 09:26:49 crc kubenswrapper[4841]: I0313 09:26:49.212061 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hfkvb"] Mar 13 09:26:49 crc kubenswrapper[4841]: I0313 09:26:49.224046 4841 scope.go:117] "RemoveContainer" containerID="bba1fa857cf975adf3a9b339aeb4d694708831c56111dcae6d96332b3ac0a9fb" Mar 13 09:26:49 crc kubenswrapper[4841]: E0313 09:26:49.224674 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bba1fa857cf975adf3a9b339aeb4d694708831c56111dcae6d96332b3ac0a9fb\": container with ID starting with bba1fa857cf975adf3a9b339aeb4d694708831c56111dcae6d96332b3ac0a9fb not found: ID does not exist" containerID="bba1fa857cf975adf3a9b339aeb4d694708831c56111dcae6d96332b3ac0a9fb" Mar 13 09:26:49 crc kubenswrapper[4841]: I0313 09:26:49.224736 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba1fa857cf975adf3a9b339aeb4d694708831c56111dcae6d96332b3ac0a9fb"} err="failed to get container status \"bba1fa857cf975adf3a9b339aeb4d694708831c56111dcae6d96332b3ac0a9fb\": rpc error: code = NotFound desc = could not find container \"bba1fa857cf975adf3a9b339aeb4d694708831c56111dcae6d96332b3ac0a9fb\": container with ID starting with bba1fa857cf975adf3a9b339aeb4d694708831c56111dcae6d96332b3ac0a9fb not found: ID does not exist" Mar 13 09:26:49 crc kubenswrapper[4841]: I0313 09:26:49.224776 4841 scope.go:117] "RemoveContainer" containerID="c3331aeeb250406be51074f65f707d6483d9b81d67397889f04a7c6dcfa4cfaf" Mar 13 09:26:49 crc kubenswrapper[4841]: E0313 09:26:49.225203 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3331aeeb250406be51074f65f707d6483d9b81d67397889f04a7c6dcfa4cfaf\": container with ID starting with c3331aeeb250406be51074f65f707d6483d9b81d67397889f04a7c6dcfa4cfaf not found: ID does not exist" containerID="c3331aeeb250406be51074f65f707d6483d9b81d67397889f04a7c6dcfa4cfaf" Mar 13 09:26:49 crc kubenswrapper[4841]: I0313 09:26:49.225354 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3331aeeb250406be51074f65f707d6483d9b81d67397889f04a7c6dcfa4cfaf"} err="failed to get container status \"c3331aeeb250406be51074f65f707d6483d9b81d67397889f04a7c6dcfa4cfaf\": rpc error: code = NotFound desc = could not find container \"c3331aeeb250406be51074f65f707d6483d9b81d67397889f04a7c6dcfa4cfaf\": container with ID 
starting with c3331aeeb250406be51074f65f707d6483d9b81d67397889f04a7c6dcfa4cfaf not found: ID does not exist" Mar 13 09:26:49 crc kubenswrapper[4841]: I0313 09:26:49.225453 4841 scope.go:117] "RemoveContainer" containerID="faca21ff51265da71fe2dbff9eb88912e3c24155909f0da1b3d3dd830188b897" Mar 13 09:26:49 crc kubenswrapper[4841]: E0313 09:26:49.225957 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faca21ff51265da71fe2dbff9eb88912e3c24155909f0da1b3d3dd830188b897\": container with ID starting with faca21ff51265da71fe2dbff9eb88912e3c24155909f0da1b3d3dd830188b897 not found: ID does not exist" containerID="faca21ff51265da71fe2dbff9eb88912e3c24155909f0da1b3d3dd830188b897" Mar 13 09:26:49 crc kubenswrapper[4841]: I0313 09:26:49.226029 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faca21ff51265da71fe2dbff9eb88912e3c24155909f0da1b3d3dd830188b897"} err="failed to get container status \"faca21ff51265da71fe2dbff9eb88912e3c24155909f0da1b3d3dd830188b897\": rpc error: code = NotFound desc = could not find container \"faca21ff51265da71fe2dbff9eb88912e3c24155909f0da1b3d3dd830188b897\": container with ID starting with faca21ff51265da71fe2dbff9eb88912e3c24155909f0da1b3d3dd830188b897 not found: ID does not exist" Mar 13 09:26:50 crc kubenswrapper[4841]: I0313 09:26:50.002330 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48ce3356-a536-4896-91db-35ccc6f131e9" path="/var/lib/kubelet/pods/48ce3356-a536-4896-91db-35ccc6f131e9/volumes" Mar 13 09:26:50 crc kubenswrapper[4841]: I0313 09:26:50.002939 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d9cbbd3-aa3e-470f-a172-e746431bd273" path="/var/lib/kubelet/pods/5d9cbbd3-aa3e-470f-a172-e746431bd273/volumes" Mar 13 09:26:54 crc kubenswrapper[4841]: I0313 09:26:54.806687 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-t2mfz" Mar 13 09:26:54 crc kubenswrapper[4841]: I0313 09:26:54.807139 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t2mfz" Mar 13 09:26:54 crc kubenswrapper[4841]: I0313 09:26:54.852154 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t2mfz" Mar 13 09:26:55 crc kubenswrapper[4841]: I0313 09:26:55.237436 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t2mfz" Mar 13 09:26:58 crc kubenswrapper[4841]: I0313 09:26:58.469779 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t2mfz"] Mar 13 09:26:58 crc kubenswrapper[4841]: I0313 09:26:58.470288 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t2mfz" podUID="f7571cae-3083-455f-869a-f93947c7fb2c" containerName="registry-server" containerID="cri-o://a54a2c1acc61b139a5609b812062a68a65ca2006cb387ca9702c6d1c9568f9b8" gracePeriod=2 Mar 13 09:26:58 crc kubenswrapper[4841]: I0313 09:26:58.479857 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-744cf67d4f-ldddk" Mar 13 09:26:58 crc kubenswrapper[4841]: I0313 09:26:58.641378 4841 scope.go:117] "RemoveContainer" containerID="3116b438849dc4c367e4eb6711fa0f8245afc382af2260ca5d8c441030e2d9a5" Mar 13 09:26:58 crc kubenswrapper[4841]: I0313 09:26:58.846124 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t2mfz" Mar 13 09:26:58 crc kubenswrapper[4841]: I0313 09:26:58.993264 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w27ml\" (UniqueName: \"kubernetes.io/projected/f7571cae-3083-455f-869a-f93947c7fb2c-kube-api-access-w27ml\") pod \"f7571cae-3083-455f-869a-f93947c7fb2c\" (UID: \"f7571cae-3083-455f-869a-f93947c7fb2c\") " Mar 13 09:26:58 crc kubenswrapper[4841]: I0313 09:26:58.993396 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7571cae-3083-455f-869a-f93947c7fb2c-catalog-content\") pod \"f7571cae-3083-455f-869a-f93947c7fb2c\" (UID: \"f7571cae-3083-455f-869a-f93947c7fb2c\") " Mar 13 09:26:58 crc kubenswrapper[4841]: I0313 09:26:58.993468 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7571cae-3083-455f-869a-f93947c7fb2c-utilities\") pod \"f7571cae-3083-455f-869a-f93947c7fb2c\" (UID: \"f7571cae-3083-455f-869a-f93947c7fb2c\") " Mar 13 09:26:58 crc kubenswrapper[4841]: I0313 09:26:58.994301 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7571cae-3083-455f-869a-f93947c7fb2c-utilities" (OuterVolumeSpecName: "utilities") pod "f7571cae-3083-455f-869a-f93947c7fb2c" (UID: "f7571cae-3083-455f-869a-f93947c7fb2c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:26:58 crc kubenswrapper[4841]: I0313 09:26:58.999417 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7571cae-3083-455f-869a-f93947c7fb2c-kube-api-access-w27ml" (OuterVolumeSpecName: "kube-api-access-w27ml") pod "f7571cae-3083-455f-869a-f93947c7fb2c" (UID: "f7571cae-3083-455f-869a-f93947c7fb2c"). InnerVolumeSpecName "kube-api-access-w27ml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.029343 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7571cae-3083-455f-869a-f93947c7fb2c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7571cae-3083-455f-869a-f93947c7fb2c" (UID: "f7571cae-3083-455f-869a-f93947c7fb2c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.094981 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7571cae-3083-455f-869a-f93947c7fb2c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.095018 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7571cae-3083-455f-869a-f93947c7fb2c-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.095031 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w27ml\" (UniqueName: \"kubernetes.io/projected/f7571cae-3083-455f-869a-f93947c7fb2c-kube-api-access-w27ml\") on node \"crc\" DevicePath \"\"" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.225740 4841 generic.go:334] "Generic (PLEG): container finished" podID="f7571cae-3083-455f-869a-f93947c7fb2c" containerID="a54a2c1acc61b139a5609b812062a68a65ca2006cb387ca9702c6d1c9568f9b8" exitCode=0 Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.225780 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t2mfz" event={"ID":"f7571cae-3083-455f-869a-f93947c7fb2c","Type":"ContainerDied","Data":"a54a2c1acc61b139a5609b812062a68a65ca2006cb387ca9702c6d1c9568f9b8"} Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.225805 4841 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-t2mfz" event={"ID":"f7571cae-3083-455f-869a-f93947c7fb2c","Type":"ContainerDied","Data":"ba18081f2121bdb5b526c75fa77ae1e2256ce93cde80eef99102b4a4a6c6c977"} Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.225821 4841 scope.go:117] "RemoveContainer" containerID="a54a2c1acc61b139a5609b812062a68a65ca2006cb387ca9702c6d1c9568f9b8" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.225932 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t2mfz" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.250336 4841 scope.go:117] "RemoveContainer" containerID="b8ca1d383ec30142318ba8c6b7e70cf3c4d9d39496f99bf22618a4e48cbfa6eb" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.269271 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t2mfz"] Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.276745 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t2mfz"] Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.277955 4841 scope.go:117] "RemoveContainer" containerID="69bdc91950a0e5084c2ebf2f0aa1c0510b203a99022547551b46fa8bd179c8cf" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.306185 4841 scope.go:117] "RemoveContainer" containerID="a54a2c1acc61b139a5609b812062a68a65ca2006cb387ca9702c6d1c9568f9b8" Mar 13 09:26:59 crc kubenswrapper[4841]: E0313 09:26:59.306680 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a54a2c1acc61b139a5609b812062a68a65ca2006cb387ca9702c6d1c9568f9b8\": container with ID starting with a54a2c1acc61b139a5609b812062a68a65ca2006cb387ca9702c6d1c9568f9b8 not found: ID does not exist" containerID="a54a2c1acc61b139a5609b812062a68a65ca2006cb387ca9702c6d1c9568f9b8" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.306733 4841 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a54a2c1acc61b139a5609b812062a68a65ca2006cb387ca9702c6d1c9568f9b8"} err="failed to get container status \"a54a2c1acc61b139a5609b812062a68a65ca2006cb387ca9702c6d1c9568f9b8\": rpc error: code = NotFound desc = could not find container \"a54a2c1acc61b139a5609b812062a68a65ca2006cb387ca9702c6d1c9568f9b8\": container with ID starting with a54a2c1acc61b139a5609b812062a68a65ca2006cb387ca9702c6d1c9568f9b8 not found: ID does not exist" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.306765 4841 scope.go:117] "RemoveContainer" containerID="b8ca1d383ec30142318ba8c6b7e70cf3c4d9d39496f99bf22618a4e48cbfa6eb" Mar 13 09:26:59 crc kubenswrapper[4841]: E0313 09:26:59.308126 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8ca1d383ec30142318ba8c6b7e70cf3c4d9d39496f99bf22618a4e48cbfa6eb\": container with ID starting with b8ca1d383ec30142318ba8c6b7e70cf3c4d9d39496f99bf22618a4e48cbfa6eb not found: ID does not exist" containerID="b8ca1d383ec30142318ba8c6b7e70cf3c4d9d39496f99bf22618a4e48cbfa6eb" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.308178 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8ca1d383ec30142318ba8c6b7e70cf3c4d9d39496f99bf22618a4e48cbfa6eb"} err="failed to get container status \"b8ca1d383ec30142318ba8c6b7e70cf3c4d9d39496f99bf22618a4e48cbfa6eb\": rpc error: code = NotFound desc = could not find container \"b8ca1d383ec30142318ba8c6b7e70cf3c4d9d39496f99bf22618a4e48cbfa6eb\": container with ID starting with b8ca1d383ec30142318ba8c6b7e70cf3c4d9d39496f99bf22618a4e48cbfa6eb not found: ID does not exist" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.308206 4841 scope.go:117] "RemoveContainer" containerID="69bdc91950a0e5084c2ebf2f0aa1c0510b203a99022547551b46fa8bd179c8cf" Mar 13 09:26:59 crc kubenswrapper[4841]: E0313 
09:26:59.308680 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69bdc91950a0e5084c2ebf2f0aa1c0510b203a99022547551b46fa8bd179c8cf\": container with ID starting with 69bdc91950a0e5084c2ebf2f0aa1c0510b203a99022547551b46fa8bd179c8cf not found: ID does not exist" containerID="69bdc91950a0e5084c2ebf2f0aa1c0510b203a99022547551b46fa8bd179c8cf" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.308713 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69bdc91950a0e5084c2ebf2f0aa1c0510b203a99022547551b46fa8bd179c8cf"} err="failed to get container status \"69bdc91950a0e5084c2ebf2f0aa1c0510b203a99022547551b46fa8bd179c8cf\": rpc error: code = NotFound desc = could not find container \"69bdc91950a0e5084c2ebf2f0aa1c0510b203a99022547551b46fa8bd179c8cf\": container with ID starting with 69bdc91950a0e5084c2ebf2f0aa1c0510b203a99022547551b46fa8bd179c8cf not found: ID does not exist" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.489637 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ftrck"] Mar 13 09:26:59 crc kubenswrapper[4841]: E0313 09:26:59.490013 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48ce3356-a536-4896-91db-35ccc6f131e9" containerName="extract-content" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.490036 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="48ce3356-a536-4896-91db-35ccc6f131e9" containerName="extract-content" Mar 13 09:26:59 crc kubenswrapper[4841]: E0313 09:26:59.490063 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d9cbbd3-aa3e-470f-a172-e746431bd273" containerName="extract-content" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.490076 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d9cbbd3-aa3e-470f-a172-e746431bd273" containerName="extract-content" Mar 13 09:26:59 crc 
kubenswrapper[4841]: E0313 09:26:59.490096 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7571cae-3083-455f-869a-f93947c7fb2c" containerName="registry-server" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.490110 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7571cae-3083-455f-869a-f93947c7fb2c" containerName="registry-server" Mar 13 09:26:59 crc kubenswrapper[4841]: E0313 09:26:59.490132 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d9cbbd3-aa3e-470f-a172-e746431bd273" containerName="registry-server" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.490145 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d9cbbd3-aa3e-470f-a172-e746431bd273" containerName="registry-server" Mar 13 09:26:59 crc kubenswrapper[4841]: E0313 09:26:59.490163 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d9cbbd3-aa3e-470f-a172-e746431bd273" containerName="extract-utilities" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.490177 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d9cbbd3-aa3e-470f-a172-e746431bd273" containerName="extract-utilities" Mar 13 09:26:59 crc kubenswrapper[4841]: E0313 09:26:59.490193 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7571cae-3083-455f-869a-f93947c7fb2c" containerName="extract-utilities" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.490206 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7571cae-3083-455f-869a-f93947c7fb2c" containerName="extract-utilities" Mar 13 09:26:59 crc kubenswrapper[4841]: E0313 09:26:59.490228 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48ce3356-a536-4896-91db-35ccc6f131e9" containerName="extract-utilities" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.490241 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="48ce3356-a536-4896-91db-35ccc6f131e9" containerName="extract-utilities" Mar 13 09:26:59 crc 
kubenswrapper[4841]: E0313 09:26:59.490256 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48ce3356-a536-4896-91db-35ccc6f131e9" containerName="registry-server" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.490276 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="48ce3356-a536-4896-91db-35ccc6f131e9" containerName="registry-server" Mar 13 09:26:59 crc kubenswrapper[4841]: E0313 09:26:59.490329 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7571cae-3083-455f-869a-f93947c7fb2c" containerName="extract-content" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.490342 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7571cae-3083-455f-869a-f93947c7fb2c" containerName="extract-content" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.490537 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d9cbbd3-aa3e-470f-a172-e746431bd273" containerName="registry-server" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.490565 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="48ce3356-a536-4896-91db-35ccc6f131e9" containerName="registry-server" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.490591 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7571cae-3083-455f-869a-f93947c7fb2c" containerName="registry-server" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.492082 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ftrck" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.518413 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ftrck"] Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.602919 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5696fe2-46f1-4567-8f93-e482aee3120e-catalog-content\") pod \"community-operators-ftrck\" (UID: \"b5696fe2-46f1-4567-8f93-e482aee3120e\") " pod="openshift-marketplace/community-operators-ftrck" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.603010 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5696fe2-46f1-4567-8f93-e482aee3120e-utilities\") pod \"community-operators-ftrck\" (UID: \"b5696fe2-46f1-4567-8f93-e482aee3120e\") " pod="openshift-marketplace/community-operators-ftrck" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.603214 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrh52\" (UniqueName: \"kubernetes.io/projected/b5696fe2-46f1-4567-8f93-e482aee3120e-kube-api-access-qrh52\") pod \"community-operators-ftrck\" (UID: \"b5696fe2-46f1-4567-8f93-e482aee3120e\") " pod="openshift-marketplace/community-operators-ftrck" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.704557 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrh52\" (UniqueName: \"kubernetes.io/projected/b5696fe2-46f1-4567-8f93-e482aee3120e-kube-api-access-qrh52\") pod \"community-operators-ftrck\" (UID: \"b5696fe2-46f1-4567-8f93-e482aee3120e\") " pod="openshift-marketplace/community-operators-ftrck" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.704951 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5696fe2-46f1-4567-8f93-e482aee3120e-catalog-content\") pod \"community-operators-ftrck\" (UID: \"b5696fe2-46f1-4567-8f93-e482aee3120e\") " pod="openshift-marketplace/community-operators-ftrck" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.705761 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5696fe2-46f1-4567-8f93-e482aee3120e-catalog-content\") pod \"community-operators-ftrck\" (UID: \"b5696fe2-46f1-4567-8f93-e482aee3120e\") " pod="openshift-marketplace/community-operators-ftrck" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.706097 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5696fe2-46f1-4567-8f93-e482aee3120e-utilities\") pod \"community-operators-ftrck\" (UID: \"b5696fe2-46f1-4567-8f93-e482aee3120e\") " pod="openshift-marketplace/community-operators-ftrck" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.705846 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5696fe2-46f1-4567-8f93-e482aee3120e-utilities\") pod \"community-operators-ftrck\" (UID: \"b5696fe2-46f1-4567-8f93-e482aee3120e\") " pod="openshift-marketplace/community-operators-ftrck" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.720212 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrh52\" (UniqueName: \"kubernetes.io/projected/b5696fe2-46f1-4567-8f93-e482aee3120e-kube-api-access-qrh52\") pod \"community-operators-ftrck\" (UID: \"b5696fe2-46f1-4567-8f93-e482aee3120e\") " pod="openshift-marketplace/community-operators-ftrck" Mar 13 09:26:59 crc kubenswrapper[4841]: I0313 09:26:59.823114 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ftrck" Mar 13 09:27:00 crc kubenswrapper[4841]: I0313 09:27:00.009610 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7571cae-3083-455f-869a-f93947c7fb2c" path="/var/lib/kubelet/pods/f7571cae-3083-455f-869a-f93947c7fb2c/volumes" Mar 13 09:27:00 crc kubenswrapper[4841]: I0313 09:27:00.108250 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ftrck"] Mar 13 09:27:00 crc kubenswrapper[4841]: I0313 09:27:00.231274 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftrck" event={"ID":"b5696fe2-46f1-4567-8f93-e482aee3120e","Type":"ContainerStarted","Data":"b378f768bac31b16fdcb5df1ec3afa70903c38577908edfa52cc3ea2146e1f42"} Mar 13 09:27:01 crc kubenswrapper[4841]: I0313 09:27:01.241476 4841 generic.go:334] "Generic (PLEG): container finished" podID="b5696fe2-46f1-4567-8f93-e482aee3120e" containerID="93eb91d95a841e9ce759297332c9c7e647eb6cd0b22dd6bf889ca2ad8963dc39" exitCode=0 Mar 13 09:27:01 crc kubenswrapper[4841]: I0313 09:27:01.241537 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftrck" event={"ID":"b5696fe2-46f1-4567-8f93-e482aee3120e","Type":"ContainerDied","Data":"93eb91d95a841e9ce759297332c9c7e647eb6cd0b22dd6bf889ca2ad8963dc39"} Mar 13 09:27:01 crc kubenswrapper[4841]: I0313 09:27:01.243896 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 09:27:02 crc kubenswrapper[4841]: I0313 09:27:02.250021 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftrck" event={"ID":"b5696fe2-46f1-4567-8f93-e482aee3120e","Type":"ContainerStarted","Data":"1eebcc4f6a9eb32ab0cbbf9fd5de1b2eb8fdf826724f19f05018983eb76f68b0"} Mar 13 09:27:03 crc kubenswrapper[4841]: I0313 09:27:03.259624 4841 generic.go:334] "Generic (PLEG): 
container finished" podID="b5696fe2-46f1-4567-8f93-e482aee3120e" containerID="1eebcc4f6a9eb32ab0cbbf9fd5de1b2eb8fdf826724f19f05018983eb76f68b0" exitCode=0 Mar 13 09:27:03 crc kubenswrapper[4841]: I0313 09:27:03.259673 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftrck" event={"ID":"b5696fe2-46f1-4567-8f93-e482aee3120e","Type":"ContainerDied","Data":"1eebcc4f6a9eb32ab0cbbf9fd5de1b2eb8fdf826724f19f05018983eb76f68b0"} Mar 13 09:27:04 crc kubenswrapper[4841]: I0313 09:27:04.267416 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftrck" event={"ID":"b5696fe2-46f1-4567-8f93-e482aee3120e","Type":"ContainerStarted","Data":"be57bd047fa73dd64dd964e48982ecff28379f81a36a6d71b021013bf62cf8af"} Mar 13 09:27:04 crc kubenswrapper[4841]: I0313 09:27:04.292667 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ftrck" podStartSLOduration=2.798491277 podStartE2EDuration="5.29264908s" podCreationTimestamp="2026-03-13 09:26:59 +0000 UTC" firstStartedPulling="2026-03-13 09:27:01.243580723 +0000 UTC m=+903.973480924" lastFinishedPulling="2026-03-13 09:27:03.737738536 +0000 UTC m=+906.467638727" observedRunningTime="2026-03-13 09:27:04.28879868 +0000 UTC m=+907.018698881" watchObservedRunningTime="2026-03-13 09:27:04.29264908 +0000 UTC m=+907.022549281" Mar 13 09:27:09 crc kubenswrapper[4841]: I0313 09:27:09.824170 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ftrck" Mar 13 09:27:09 crc kubenswrapper[4841]: I0313 09:27:09.825119 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ftrck" Mar 13 09:27:09 crc kubenswrapper[4841]: I0313 09:27:09.895988 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ftrck" 
Mar 13 09:27:10 crc kubenswrapper[4841]: I0313 09:27:10.401669 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ftrck" Mar 13 09:27:12 crc kubenswrapper[4841]: I0313 09:27:12.271945 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ftrck"] Mar 13 09:27:12 crc kubenswrapper[4841]: I0313 09:27:12.318757 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ftrck" podUID="b5696fe2-46f1-4567-8f93-e482aee3120e" containerName="registry-server" containerID="cri-o://be57bd047fa73dd64dd964e48982ecff28379f81a36a6d71b021013bf62cf8af" gracePeriod=2 Mar 13 09:27:12 crc kubenswrapper[4841]: I0313 09:27:12.750340 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ftrck" Mar 13 09:27:12 crc kubenswrapper[4841]: I0313 09:27:12.814096 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrh52\" (UniqueName: \"kubernetes.io/projected/b5696fe2-46f1-4567-8f93-e482aee3120e-kube-api-access-qrh52\") pod \"b5696fe2-46f1-4567-8f93-e482aee3120e\" (UID: \"b5696fe2-46f1-4567-8f93-e482aee3120e\") " Mar 13 09:27:12 crc kubenswrapper[4841]: I0313 09:27:12.814220 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5696fe2-46f1-4567-8f93-e482aee3120e-utilities\") pod \"b5696fe2-46f1-4567-8f93-e482aee3120e\" (UID: \"b5696fe2-46f1-4567-8f93-e482aee3120e\") " Mar 13 09:27:12 crc kubenswrapper[4841]: I0313 09:27:12.814246 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5696fe2-46f1-4567-8f93-e482aee3120e-catalog-content\") pod \"b5696fe2-46f1-4567-8f93-e482aee3120e\" (UID: \"b5696fe2-46f1-4567-8f93-e482aee3120e\") 
" Mar 13 09:27:12 crc kubenswrapper[4841]: I0313 09:27:12.815572 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5696fe2-46f1-4567-8f93-e482aee3120e-utilities" (OuterVolumeSpecName: "utilities") pod "b5696fe2-46f1-4567-8f93-e482aee3120e" (UID: "b5696fe2-46f1-4567-8f93-e482aee3120e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:27:12 crc kubenswrapper[4841]: I0313 09:27:12.823461 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5696fe2-46f1-4567-8f93-e482aee3120e-kube-api-access-qrh52" (OuterVolumeSpecName: "kube-api-access-qrh52") pod "b5696fe2-46f1-4567-8f93-e482aee3120e" (UID: "b5696fe2-46f1-4567-8f93-e482aee3120e"). InnerVolumeSpecName "kube-api-access-qrh52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:27:12 crc kubenswrapper[4841]: I0313 09:27:12.865935 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5696fe2-46f1-4567-8f93-e482aee3120e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5696fe2-46f1-4567-8f93-e482aee3120e" (UID: "b5696fe2-46f1-4567-8f93-e482aee3120e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:27:12 crc kubenswrapper[4841]: I0313 09:27:12.915400 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrh52\" (UniqueName: \"kubernetes.io/projected/b5696fe2-46f1-4567-8f93-e482aee3120e-kube-api-access-qrh52\") on node \"crc\" DevicePath \"\"" Mar 13 09:27:12 crc kubenswrapper[4841]: I0313 09:27:12.915433 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5696fe2-46f1-4567-8f93-e482aee3120e-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:27:12 crc kubenswrapper[4841]: I0313 09:27:12.915447 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5696fe2-46f1-4567-8f93-e482aee3120e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:27:13 crc kubenswrapper[4841]: I0313 09:27:13.328974 4841 generic.go:334] "Generic (PLEG): container finished" podID="b5696fe2-46f1-4567-8f93-e482aee3120e" containerID="be57bd047fa73dd64dd964e48982ecff28379f81a36a6d71b021013bf62cf8af" exitCode=0 Mar 13 09:27:13 crc kubenswrapper[4841]: I0313 09:27:13.329014 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ftrck" Mar 13 09:27:13 crc kubenswrapper[4841]: I0313 09:27:13.329036 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftrck" event={"ID":"b5696fe2-46f1-4567-8f93-e482aee3120e","Type":"ContainerDied","Data":"be57bd047fa73dd64dd964e48982ecff28379f81a36a6d71b021013bf62cf8af"} Mar 13 09:27:13 crc kubenswrapper[4841]: I0313 09:27:13.329119 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftrck" event={"ID":"b5696fe2-46f1-4567-8f93-e482aee3120e","Type":"ContainerDied","Data":"b378f768bac31b16fdcb5df1ec3afa70903c38577908edfa52cc3ea2146e1f42"} Mar 13 09:27:13 crc kubenswrapper[4841]: I0313 09:27:13.329173 4841 scope.go:117] "RemoveContainer" containerID="be57bd047fa73dd64dd964e48982ecff28379f81a36a6d71b021013bf62cf8af" Mar 13 09:27:13 crc kubenswrapper[4841]: I0313 09:27:13.366014 4841 scope.go:117] "RemoveContainer" containerID="1eebcc4f6a9eb32ab0cbbf9fd5de1b2eb8fdf826724f19f05018983eb76f68b0" Mar 13 09:27:13 crc kubenswrapper[4841]: I0313 09:27:13.366242 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ftrck"] Mar 13 09:27:13 crc kubenswrapper[4841]: I0313 09:27:13.372550 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ftrck"] Mar 13 09:27:13 crc kubenswrapper[4841]: I0313 09:27:13.393110 4841 scope.go:117] "RemoveContainer" containerID="93eb91d95a841e9ce759297332c9c7e647eb6cd0b22dd6bf889ca2ad8963dc39" Mar 13 09:27:13 crc kubenswrapper[4841]: I0313 09:27:13.418806 4841 scope.go:117] "RemoveContainer" containerID="be57bd047fa73dd64dd964e48982ecff28379f81a36a6d71b021013bf62cf8af" Mar 13 09:27:13 crc kubenswrapper[4841]: E0313 09:27:13.419296 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"be57bd047fa73dd64dd964e48982ecff28379f81a36a6d71b021013bf62cf8af\": container with ID starting with be57bd047fa73dd64dd964e48982ecff28379f81a36a6d71b021013bf62cf8af not found: ID does not exist" containerID="be57bd047fa73dd64dd964e48982ecff28379f81a36a6d71b021013bf62cf8af" Mar 13 09:27:13 crc kubenswrapper[4841]: I0313 09:27:13.419357 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be57bd047fa73dd64dd964e48982ecff28379f81a36a6d71b021013bf62cf8af"} err="failed to get container status \"be57bd047fa73dd64dd964e48982ecff28379f81a36a6d71b021013bf62cf8af\": rpc error: code = NotFound desc = could not find container \"be57bd047fa73dd64dd964e48982ecff28379f81a36a6d71b021013bf62cf8af\": container with ID starting with be57bd047fa73dd64dd964e48982ecff28379f81a36a6d71b021013bf62cf8af not found: ID does not exist" Mar 13 09:27:13 crc kubenswrapper[4841]: I0313 09:27:13.419396 4841 scope.go:117] "RemoveContainer" containerID="1eebcc4f6a9eb32ab0cbbf9fd5de1b2eb8fdf826724f19f05018983eb76f68b0" Mar 13 09:27:13 crc kubenswrapper[4841]: E0313 09:27:13.419965 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eebcc4f6a9eb32ab0cbbf9fd5de1b2eb8fdf826724f19f05018983eb76f68b0\": container with ID starting with 1eebcc4f6a9eb32ab0cbbf9fd5de1b2eb8fdf826724f19f05018983eb76f68b0 not found: ID does not exist" containerID="1eebcc4f6a9eb32ab0cbbf9fd5de1b2eb8fdf826724f19f05018983eb76f68b0" Mar 13 09:27:13 crc kubenswrapper[4841]: I0313 09:27:13.420002 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eebcc4f6a9eb32ab0cbbf9fd5de1b2eb8fdf826724f19f05018983eb76f68b0"} err="failed to get container status \"1eebcc4f6a9eb32ab0cbbf9fd5de1b2eb8fdf826724f19f05018983eb76f68b0\": rpc error: code = NotFound desc = could not find container \"1eebcc4f6a9eb32ab0cbbf9fd5de1b2eb8fdf826724f19f05018983eb76f68b0\": container with ID 
starting with 1eebcc4f6a9eb32ab0cbbf9fd5de1b2eb8fdf826724f19f05018983eb76f68b0 not found: ID does not exist" Mar 13 09:27:13 crc kubenswrapper[4841]: I0313 09:27:13.420026 4841 scope.go:117] "RemoveContainer" containerID="93eb91d95a841e9ce759297332c9c7e647eb6cd0b22dd6bf889ca2ad8963dc39" Mar 13 09:27:13 crc kubenswrapper[4841]: E0313 09:27:13.420439 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93eb91d95a841e9ce759297332c9c7e647eb6cd0b22dd6bf889ca2ad8963dc39\": container with ID starting with 93eb91d95a841e9ce759297332c9c7e647eb6cd0b22dd6bf889ca2ad8963dc39 not found: ID does not exist" containerID="93eb91d95a841e9ce759297332c9c7e647eb6cd0b22dd6bf889ca2ad8963dc39" Mar 13 09:27:13 crc kubenswrapper[4841]: I0313 09:27:13.420485 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93eb91d95a841e9ce759297332c9c7e647eb6cd0b22dd6bf889ca2ad8963dc39"} err="failed to get container status \"93eb91d95a841e9ce759297332c9c7e647eb6cd0b22dd6bf889ca2ad8963dc39\": rpc error: code = NotFound desc = could not find container \"93eb91d95a841e9ce759297332c9c7e647eb6cd0b22dd6bf889ca2ad8963dc39\": container with ID starting with 93eb91d95a841e9ce759297332c9c7e647eb6cd0b22dd6bf889ca2ad8963dc39 not found: ID does not exist" Mar 13 09:27:14 crc kubenswrapper[4841]: I0313 09:27:14.006414 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5696fe2-46f1-4567-8f93-e482aee3120e" path="/var/lib/kubelet/pods/b5696fe2-46f1-4567-8f93-e482aee3120e/volumes" Mar 13 09:27:18 crc kubenswrapper[4841]: I0313 09:27:18.945597 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6d6c4d5946-gtbzk" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.573340 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-lpxn6"] Mar 13 09:27:19 crc kubenswrapper[4841]: 
E0313 09:27:19.573837 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5696fe2-46f1-4567-8f93-e482aee3120e" containerName="extract-content" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.573850 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5696fe2-46f1-4567-8f93-e482aee3120e" containerName="extract-content" Mar 13 09:27:19 crc kubenswrapper[4841]: E0313 09:27:19.573870 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5696fe2-46f1-4567-8f93-e482aee3120e" containerName="extract-utilities" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.573876 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5696fe2-46f1-4567-8f93-e482aee3120e" containerName="extract-utilities" Mar 13 09:27:19 crc kubenswrapper[4841]: E0313 09:27:19.573886 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5696fe2-46f1-4567-8f93-e482aee3120e" containerName="registry-server" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.573893 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5696fe2-46f1-4567-8f93-e482aee3120e" containerName="registry-server" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.573982 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5696fe2-46f1-4567-8f93-e482aee3120e" containerName="registry-server" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.576031 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.577819 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-pdn6d" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.577864 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.578086 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.590169 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-qnhkg"] Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.591307 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qnhkg" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.593627 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.601371 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8a43b722-1514-4a29-8935-2f1444488222-frr-conf\") pod \"frr-k8s-lpxn6\" (UID: \"8a43b722-1514-4a29-8935-2f1444488222\") " pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.601652 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8a43b722-1514-4a29-8935-2f1444488222-frr-sockets\") pod \"frr-k8s-lpxn6\" (UID: \"8a43b722-1514-4a29-8935-2f1444488222\") " pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.601760 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8a43b722-1514-4a29-8935-2f1444488222-metrics\") pod \"frr-k8s-lpxn6\" (UID: \"8a43b722-1514-4a29-8935-2f1444488222\") " pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.601842 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a43b722-1514-4a29-8935-2f1444488222-metrics-certs\") pod \"frr-k8s-lpxn6\" (UID: \"8a43b722-1514-4a29-8935-2f1444488222\") " pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.601870 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8a43b722-1514-4a29-8935-2f1444488222-reloader\") pod \"frr-k8s-lpxn6\" (UID: \"8a43b722-1514-4a29-8935-2f1444488222\") " pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.601988 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8a43b722-1514-4a29-8935-2f1444488222-frr-startup\") pod \"frr-k8s-lpxn6\" (UID: \"8a43b722-1514-4a29-8935-2f1444488222\") " pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.602017 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a96f06d-396f-44a0-a357-f8b615676b3f-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-qnhkg\" (UID: \"1a96f06d-396f-44a0-a357-f8b615676b3f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qnhkg" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.602050 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx5xz\" 
(UniqueName: \"kubernetes.io/projected/8a43b722-1514-4a29-8935-2f1444488222-kube-api-access-vx5xz\") pod \"frr-k8s-lpxn6\" (UID: \"8a43b722-1514-4a29-8935-2f1444488222\") " pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.602092 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m7xx\" (UniqueName: \"kubernetes.io/projected/1a96f06d-396f-44a0-a357-f8b615676b3f-kube-api-access-5m7xx\") pod \"frr-k8s-webhook-server-bcc4b6f68-qnhkg\" (UID: \"1a96f06d-396f-44a0-a357-f8b615676b3f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qnhkg" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.606793 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-qnhkg"] Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.668652 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-lpmg6"] Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.669744 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-lpmg6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.672141 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.672750 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.673424 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-8r68w" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.675533 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.679773 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-6x8lt"] Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.680662 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-6x8lt" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.682227 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.702096 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-6x8lt"] Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.703020 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m7xx\" (UniqueName: \"kubernetes.io/projected/1a96f06d-396f-44a0-a357-f8b615676b3f-kube-api-access-5m7xx\") pod \"frr-k8s-webhook-server-bcc4b6f68-qnhkg\" (UID: \"1a96f06d-396f-44a0-a357-f8b615676b3f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qnhkg" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.703079 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8a43b722-1514-4a29-8935-2f1444488222-frr-conf\") pod \"frr-k8s-lpxn6\" (UID: \"8a43b722-1514-4a29-8935-2f1444488222\") " pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.703098 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8a43b722-1514-4a29-8935-2f1444488222-frr-sockets\") pod \"frr-k8s-lpxn6\" (UID: \"8a43b722-1514-4a29-8935-2f1444488222\") " pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.703117 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8a43b722-1514-4a29-8935-2f1444488222-metrics\") pod \"frr-k8s-lpxn6\" (UID: \"8a43b722-1514-4a29-8935-2f1444488222\") " pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.703139 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/62891ab6-67e5-4c9e-83b6-aec814f74ca6-metallb-excludel2\") pod \"speaker-lpmg6\" (UID: \"62891ab6-67e5-4c9e-83b6-aec814f74ca6\") " pod="metallb-system/speaker-lpmg6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.703156 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g522l\" (UniqueName: \"kubernetes.io/projected/62891ab6-67e5-4c9e-83b6-aec814f74ca6-kube-api-access-g522l\") pod \"speaker-lpmg6\" (UID: \"62891ab6-67e5-4c9e-83b6-aec814f74ca6\") " pod="metallb-system/speaker-lpmg6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.703174 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a43b722-1514-4a29-8935-2f1444488222-metrics-certs\") pod \"frr-k8s-lpxn6\" (UID: \"8a43b722-1514-4a29-8935-2f1444488222\") " pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.703208 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8a43b722-1514-4a29-8935-2f1444488222-reloader\") pod \"frr-k8s-lpxn6\" (UID: \"8a43b722-1514-4a29-8935-2f1444488222\") " pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.703234 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8a43b722-1514-4a29-8935-2f1444488222-frr-startup\") pod \"frr-k8s-lpxn6\" (UID: \"8a43b722-1514-4a29-8935-2f1444488222\") " pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.703250 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/62891ab6-67e5-4c9e-83b6-aec814f74ca6-memberlist\") pod \"speaker-lpmg6\" (UID: \"62891ab6-67e5-4c9e-83b6-aec814f74ca6\") " pod="metallb-system/speaker-lpmg6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.703278 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69db2e0c-e892-4c3c-909b-3f7ba4d650bb-metrics-certs\") pod \"controller-7bb4cc7c98-6x8lt\" (UID: \"69db2e0c-e892-4c3c-909b-3f7ba4d650bb\") " pod="metallb-system/controller-7bb4cc7c98-6x8lt" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.703295 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a96f06d-396f-44a0-a357-f8b615676b3f-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-qnhkg\" (UID: \"1a96f06d-396f-44a0-a357-f8b615676b3f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qnhkg" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.703312 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/69db2e0c-e892-4c3c-909b-3f7ba4d650bb-cert\") pod \"controller-7bb4cc7c98-6x8lt\" (UID: \"69db2e0c-e892-4c3c-909b-3f7ba4d650bb\") " pod="metallb-system/controller-7bb4cc7c98-6x8lt" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.703329 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjx7w\" (UniqueName: \"kubernetes.io/projected/69db2e0c-e892-4c3c-909b-3f7ba4d650bb-kube-api-access-pjx7w\") pod \"controller-7bb4cc7c98-6x8lt\" (UID: \"69db2e0c-e892-4c3c-909b-3f7ba4d650bb\") " pod="metallb-system/controller-7bb4cc7c98-6x8lt" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.703356 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx5xz\" (UniqueName: 
\"kubernetes.io/projected/8a43b722-1514-4a29-8935-2f1444488222-kube-api-access-vx5xz\") pod \"frr-k8s-lpxn6\" (UID: \"8a43b722-1514-4a29-8935-2f1444488222\") " pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.703378 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62891ab6-67e5-4c9e-83b6-aec814f74ca6-metrics-certs\") pod \"speaker-lpmg6\" (UID: \"62891ab6-67e5-4c9e-83b6-aec814f74ca6\") " pod="metallb-system/speaker-lpmg6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.704031 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8a43b722-1514-4a29-8935-2f1444488222-frr-conf\") pod \"frr-k8s-lpxn6\" (UID: \"8a43b722-1514-4a29-8935-2f1444488222\") " pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: E0313 09:27:19.704122 4841 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 13 09:27:19 crc kubenswrapper[4841]: E0313 09:27:19.704164 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a96f06d-396f-44a0-a357-f8b615676b3f-cert podName:1a96f06d-396f-44a0-a357-f8b615676b3f nodeName:}" failed. No retries permitted until 2026-03-13 09:27:20.204148721 +0000 UTC m=+922.934048922 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1a96f06d-396f-44a0-a357-f8b615676b3f-cert") pod "frr-k8s-webhook-server-bcc4b6f68-qnhkg" (UID: "1a96f06d-396f-44a0-a357-f8b615676b3f") : secret "frr-k8s-webhook-server-cert" not found Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.704548 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8a43b722-1514-4a29-8935-2f1444488222-frr-sockets\") pod \"frr-k8s-lpxn6\" (UID: \"8a43b722-1514-4a29-8935-2f1444488222\") " pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.704758 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8a43b722-1514-4a29-8935-2f1444488222-metrics\") pod \"frr-k8s-lpxn6\" (UID: \"8a43b722-1514-4a29-8935-2f1444488222\") " pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.704823 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8a43b722-1514-4a29-8935-2f1444488222-reloader\") pod \"frr-k8s-lpxn6\" (UID: \"8a43b722-1514-4a29-8935-2f1444488222\") " pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.705424 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8a43b722-1514-4a29-8935-2f1444488222-frr-startup\") pod \"frr-k8s-lpxn6\" (UID: \"8a43b722-1514-4a29-8935-2f1444488222\") " pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.711973 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a43b722-1514-4a29-8935-2f1444488222-metrics-certs\") pod \"frr-k8s-lpxn6\" (UID: \"8a43b722-1514-4a29-8935-2f1444488222\") " 
pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.738530 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m7xx\" (UniqueName: \"kubernetes.io/projected/1a96f06d-396f-44a0-a357-f8b615676b3f-kube-api-access-5m7xx\") pod \"frr-k8s-webhook-server-bcc4b6f68-qnhkg\" (UID: \"1a96f06d-396f-44a0-a357-f8b615676b3f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qnhkg" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.741828 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx5xz\" (UniqueName: \"kubernetes.io/projected/8a43b722-1514-4a29-8935-2f1444488222-kube-api-access-vx5xz\") pod \"frr-k8s-lpxn6\" (UID: \"8a43b722-1514-4a29-8935-2f1444488222\") " pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.804768 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/62891ab6-67e5-4c9e-83b6-aec814f74ca6-memberlist\") pod \"speaker-lpmg6\" (UID: \"62891ab6-67e5-4c9e-83b6-aec814f74ca6\") " pod="metallb-system/speaker-lpmg6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.804811 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69db2e0c-e892-4c3c-909b-3f7ba4d650bb-metrics-certs\") pod \"controller-7bb4cc7c98-6x8lt\" (UID: \"69db2e0c-e892-4c3c-909b-3f7ba4d650bb\") " pod="metallb-system/controller-7bb4cc7c98-6x8lt" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.804848 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/69db2e0c-e892-4c3c-909b-3f7ba4d650bb-cert\") pod \"controller-7bb4cc7c98-6x8lt\" (UID: \"69db2e0c-e892-4c3c-909b-3f7ba4d650bb\") " pod="metallb-system/controller-7bb4cc7c98-6x8lt" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.804877 
4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjx7w\" (UniqueName: \"kubernetes.io/projected/69db2e0c-e892-4c3c-909b-3f7ba4d650bb-kube-api-access-pjx7w\") pod \"controller-7bb4cc7c98-6x8lt\" (UID: \"69db2e0c-e892-4c3c-909b-3f7ba4d650bb\") " pod="metallb-system/controller-7bb4cc7c98-6x8lt" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.804941 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62891ab6-67e5-4c9e-83b6-aec814f74ca6-metrics-certs\") pod \"speaker-lpmg6\" (UID: \"62891ab6-67e5-4c9e-83b6-aec814f74ca6\") " pod="metallb-system/speaker-lpmg6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.805014 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/62891ab6-67e5-4c9e-83b6-aec814f74ca6-metallb-excludel2\") pod \"speaker-lpmg6\" (UID: \"62891ab6-67e5-4c9e-83b6-aec814f74ca6\") " pod="metallb-system/speaker-lpmg6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.805428 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g522l\" (UniqueName: \"kubernetes.io/projected/62891ab6-67e5-4c9e-83b6-aec814f74ca6-kube-api-access-g522l\") pod \"speaker-lpmg6\" (UID: \"62891ab6-67e5-4c9e-83b6-aec814f74ca6\") " pod="metallb-system/speaker-lpmg6" Mar 13 09:27:19 crc kubenswrapper[4841]: E0313 09:27:19.805257 4841 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 13 09:27:19 crc kubenswrapper[4841]: E0313 09:27:19.805549 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62891ab6-67e5-4c9e-83b6-aec814f74ca6-memberlist podName:62891ab6-67e5-4c9e-83b6-aec814f74ca6 nodeName:}" failed. No retries permitted until 2026-03-13 09:27:20.305530644 +0000 UTC m=+923.035430825 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/62891ab6-67e5-4c9e-83b6-aec814f74ca6-memberlist") pod "speaker-lpmg6" (UID: "62891ab6-67e5-4c9e-83b6-aec814f74ca6") : secret "metallb-memberlist" not found Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.805823 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/62891ab6-67e5-4c9e-83b6-aec814f74ca6-metallb-excludel2\") pod \"speaker-lpmg6\" (UID: \"62891ab6-67e5-4c9e-83b6-aec814f74ca6\") " pod="metallb-system/speaker-lpmg6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.808345 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.808736 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62891ab6-67e5-4c9e-83b6-aec814f74ca6-metrics-certs\") pod \"speaker-lpmg6\" (UID: \"62891ab6-67e5-4c9e-83b6-aec814f74ca6\") " pod="metallb-system/speaker-lpmg6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.811106 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69db2e0c-e892-4c3c-909b-3f7ba4d650bb-metrics-certs\") pod \"controller-7bb4cc7c98-6x8lt\" (UID: \"69db2e0c-e892-4c3c-909b-3f7ba4d650bb\") " pod="metallb-system/controller-7bb4cc7c98-6x8lt" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.821526 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g522l\" (UniqueName: \"kubernetes.io/projected/62891ab6-67e5-4c9e-83b6-aec814f74ca6-kube-api-access-g522l\") pod \"speaker-lpmg6\" (UID: \"62891ab6-67e5-4c9e-83b6-aec814f74ca6\") " pod="metallb-system/speaker-lpmg6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.821616 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/69db2e0c-e892-4c3c-909b-3f7ba4d650bb-cert\") pod \"controller-7bb4cc7c98-6x8lt\" (UID: \"69db2e0c-e892-4c3c-909b-3f7ba4d650bb\") " pod="metallb-system/controller-7bb4cc7c98-6x8lt" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.824177 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjx7w\" (UniqueName: \"kubernetes.io/projected/69db2e0c-e892-4c3c-909b-3f7ba4d650bb-kube-api-access-pjx7w\") pod \"controller-7bb4cc7c98-6x8lt\" (UID: \"69db2e0c-e892-4c3c-909b-3f7ba4d650bb\") " pod="metallb-system/controller-7bb4cc7c98-6x8lt" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.896095 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:19 crc kubenswrapper[4841]: I0313 09:27:19.992360 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-6x8lt" Mar 13 09:27:20 crc kubenswrapper[4841]: I0313 09:27:20.208600 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-6x8lt"] Mar 13 09:27:20 crc kubenswrapper[4841]: I0313 09:27:20.210773 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a96f06d-396f-44a0-a357-f8b615676b3f-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-qnhkg\" (UID: \"1a96f06d-396f-44a0-a357-f8b615676b3f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qnhkg" Mar 13 09:27:20 crc kubenswrapper[4841]: W0313 09:27:20.214453 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69db2e0c_e892_4c3c_909b_3f7ba4d650bb.slice/crio-e65e118eaad8b31447b773fc2ae742e3c07a27da44a8fcf733acd0a8c6e07361 WatchSource:0}: Error finding container e65e118eaad8b31447b773fc2ae742e3c07a27da44a8fcf733acd0a8c6e07361: Status 404 returned 
error can't find the container with id e65e118eaad8b31447b773fc2ae742e3c07a27da44a8fcf733acd0a8c6e07361 Mar 13 09:27:20 crc kubenswrapper[4841]: I0313 09:27:20.215306 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a96f06d-396f-44a0-a357-f8b615676b3f-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-qnhkg\" (UID: \"1a96f06d-396f-44a0-a357-f8b615676b3f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qnhkg" Mar 13 09:27:20 crc kubenswrapper[4841]: I0313 09:27:20.311868 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/62891ab6-67e5-4c9e-83b6-aec814f74ca6-memberlist\") pod \"speaker-lpmg6\" (UID: \"62891ab6-67e5-4c9e-83b6-aec814f74ca6\") " pod="metallb-system/speaker-lpmg6" Mar 13 09:27:20 crc kubenswrapper[4841]: E0313 09:27:20.312085 4841 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 13 09:27:20 crc kubenswrapper[4841]: E0313 09:27:20.312135 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62891ab6-67e5-4c9e-83b6-aec814f74ca6-memberlist podName:62891ab6-67e5-4c9e-83b6-aec814f74ca6 nodeName:}" failed. No retries permitted until 2026-03-13 09:27:21.312120168 +0000 UTC m=+924.042020359 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/62891ab6-67e5-4c9e-83b6-aec814f74ca6-memberlist") pod "speaker-lpmg6" (UID: "62891ab6-67e5-4c9e-83b6-aec814f74ca6") : secret "metallb-memberlist" not found Mar 13 09:27:20 crc kubenswrapper[4841]: I0313 09:27:20.388880 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-6x8lt" event={"ID":"69db2e0c-e892-4c3c-909b-3f7ba4d650bb","Type":"ContainerStarted","Data":"68011c6b723cc8b33dd2c3a6659e065262170b09b9cb520f8cf68973fdffc007"} Mar 13 09:27:20 crc kubenswrapper[4841]: I0313 09:27:20.388926 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-6x8lt" event={"ID":"69db2e0c-e892-4c3c-909b-3f7ba4d650bb","Type":"ContainerStarted","Data":"e65e118eaad8b31447b773fc2ae742e3c07a27da44a8fcf733acd0a8c6e07361"} Mar 13 09:27:20 crc kubenswrapper[4841]: I0313 09:27:20.390428 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lpxn6" event={"ID":"8a43b722-1514-4a29-8935-2f1444488222","Type":"ContainerStarted","Data":"a6f7a071b69da80cf315e3957eb90c35887cf5c1a18752623658942dd7fd610d"} Mar 13 09:27:20 crc kubenswrapper[4841]: I0313 09:27:20.505901 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qnhkg" Mar 13 09:27:20 crc kubenswrapper[4841]: I0313 09:27:20.706370 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-qnhkg"] Mar 13 09:27:20 crc kubenswrapper[4841]: W0313 09:27:20.711652 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a96f06d_396f_44a0_a357_f8b615676b3f.slice/crio-44b12d788be133997018fc0e796da3b78d468b567b1d9b6675ca18292c0e54f6 WatchSource:0}: Error finding container 44b12d788be133997018fc0e796da3b78d468b567b1d9b6675ca18292c0e54f6: Status 404 returned error can't find the container with id 44b12d788be133997018fc0e796da3b78d468b567b1d9b6675ca18292c0e54f6 Mar 13 09:27:21 crc kubenswrapper[4841]: I0313 09:27:21.325067 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/62891ab6-67e5-4c9e-83b6-aec814f74ca6-memberlist\") pod \"speaker-lpmg6\" (UID: \"62891ab6-67e5-4c9e-83b6-aec814f74ca6\") " pod="metallb-system/speaker-lpmg6" Mar 13 09:27:21 crc kubenswrapper[4841]: I0313 09:27:21.333930 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/62891ab6-67e5-4c9e-83b6-aec814f74ca6-memberlist\") pod \"speaker-lpmg6\" (UID: \"62891ab6-67e5-4c9e-83b6-aec814f74ca6\") " pod="metallb-system/speaker-lpmg6" Mar 13 09:27:21 crc kubenswrapper[4841]: I0313 09:27:21.398892 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-6x8lt" event={"ID":"69db2e0c-e892-4c3c-909b-3f7ba4d650bb","Type":"ContainerStarted","Data":"2abf2378f86671901a1787649f4ef0dcee2a1f40fb9b1b17737fd1a600785246"} Mar 13 09:27:21 crc kubenswrapper[4841]: I0313 09:27:21.399642 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/controller-7bb4cc7c98-6x8lt" Mar 13 09:27:21 crc kubenswrapper[4841]: I0313 09:27:21.399877 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qnhkg" event={"ID":"1a96f06d-396f-44a0-a357-f8b615676b3f","Type":"ContainerStarted","Data":"44b12d788be133997018fc0e796da3b78d468b567b1d9b6675ca18292c0e54f6"} Mar 13 09:27:21 crc kubenswrapper[4841]: I0313 09:27:21.419673 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-6x8lt" podStartSLOduration=2.419653132 podStartE2EDuration="2.419653132s" podCreationTimestamp="2026-03-13 09:27:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:27:21.415952506 +0000 UTC m=+924.145852697" watchObservedRunningTime="2026-03-13 09:27:21.419653132 +0000 UTC m=+924.149553323" Mar 13 09:27:21 crc kubenswrapper[4841]: I0313 09:27:21.482555 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-lpmg6" Mar 13 09:27:21 crc kubenswrapper[4841]: W0313 09:27:21.506164 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62891ab6_67e5_4c9e_83b6_aec814f74ca6.slice/crio-e5ed497f1501f040adaa0d2f51221f704fa2773d07009f16e3857e55de0371ef WatchSource:0}: Error finding container e5ed497f1501f040adaa0d2f51221f704fa2773d07009f16e3857e55de0371ef: Status 404 returned error can't find the container with id e5ed497f1501f040adaa0d2f51221f704fa2773d07009f16e3857e55de0371ef Mar 13 09:27:22 crc kubenswrapper[4841]: I0313 09:27:22.409724 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lpmg6" event={"ID":"62891ab6-67e5-4c9e-83b6-aec814f74ca6","Type":"ContainerStarted","Data":"5ca507e65f690b3b6c3b0574c59af7870f3092b7c98e64da599ebadb7fde409c"} Mar 13 09:27:22 crc kubenswrapper[4841]: I0313 09:27:22.410295 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lpmg6" event={"ID":"62891ab6-67e5-4c9e-83b6-aec814f74ca6","Type":"ContainerStarted","Data":"152eeb691ed5795af563621e6feff8ad6bf24ba9ca1bd1e0e5c2d0defffc8c30"} Mar 13 09:27:22 crc kubenswrapper[4841]: I0313 09:27:22.410317 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lpmg6" event={"ID":"62891ab6-67e5-4c9e-83b6-aec814f74ca6","Type":"ContainerStarted","Data":"e5ed497f1501f040adaa0d2f51221f704fa2773d07009f16e3857e55de0371ef"} Mar 13 09:27:22 crc kubenswrapper[4841]: I0313 09:27:22.410555 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-lpmg6" Mar 13 09:27:28 crc kubenswrapper[4841]: I0313 09:27:28.029837 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-lpmg6" podStartSLOduration=9.02981119 podStartE2EDuration="9.02981119s" podCreationTimestamp="2026-03-13 09:27:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:27:22.428743943 +0000 UTC m=+925.158644134" watchObservedRunningTime="2026-03-13 09:27:28.02981119 +0000 UTC m=+930.759711411" Mar 13 09:27:28 crc kubenswrapper[4841]: I0313 09:27:28.455999 4841 generic.go:334] "Generic (PLEG): container finished" podID="8a43b722-1514-4a29-8935-2f1444488222" containerID="c9f463d85f5add9a2e2e68d057dca1a3309e757c3a2827a5d32c7a0a8c45871f" exitCode=0 Mar 13 09:27:28 crc kubenswrapper[4841]: I0313 09:27:28.456081 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lpxn6" event={"ID":"8a43b722-1514-4a29-8935-2f1444488222","Type":"ContainerDied","Data":"c9f463d85f5add9a2e2e68d057dca1a3309e757c3a2827a5d32c7a0a8c45871f"} Mar 13 09:27:28 crc kubenswrapper[4841]: I0313 09:27:28.458946 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qnhkg" event={"ID":"1a96f06d-396f-44a0-a357-f8b615676b3f","Type":"ContainerStarted","Data":"c02a8036ef5c38839e4d19c7e6fcd4f081a53eae0f0dce1db3a77ee09405ee32"} Mar 13 09:27:28 crc kubenswrapper[4841]: I0313 09:27:28.459258 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qnhkg" Mar 13 09:27:28 crc kubenswrapper[4841]: I0313 09:27:28.528866 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qnhkg" podStartSLOduration=2.772066 podStartE2EDuration="9.528844219s" podCreationTimestamp="2026-03-13 09:27:19 +0000 UTC" firstStartedPulling="2026-03-13 09:27:20.715028098 +0000 UTC m=+923.444928299" lastFinishedPulling="2026-03-13 09:27:27.471806277 +0000 UTC m=+930.201706518" observedRunningTime="2026-03-13 09:27:28.518006402 +0000 UTC m=+931.247906593" watchObservedRunningTime="2026-03-13 09:27:28.528844219 +0000 UTC m=+931.258744420" Mar 13 09:27:29 crc 
kubenswrapper[4841]: I0313 09:27:29.468932 4841 generic.go:334] "Generic (PLEG): container finished" podID="8a43b722-1514-4a29-8935-2f1444488222" containerID="7e33dfd922aa14979bc868a9fefcaa1f3370777fe66f0651b4736e4462405371" exitCode=0 Mar 13 09:27:29 crc kubenswrapper[4841]: I0313 09:27:29.468992 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lpxn6" event={"ID":"8a43b722-1514-4a29-8935-2f1444488222","Type":"ContainerDied","Data":"7e33dfd922aa14979bc868a9fefcaa1f3370777fe66f0651b4736e4462405371"} Mar 13 09:27:30 crc kubenswrapper[4841]: I0313 09:27:30.479489 4841 generic.go:334] "Generic (PLEG): container finished" podID="8a43b722-1514-4a29-8935-2f1444488222" containerID="07f9ab0fdd7bcc5576b1651378cf9959ecef04a9ba1f5817abc4a466d8d714b4" exitCode=0 Mar 13 09:27:30 crc kubenswrapper[4841]: I0313 09:27:30.479531 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lpxn6" event={"ID":"8a43b722-1514-4a29-8935-2f1444488222","Type":"ContainerDied","Data":"07f9ab0fdd7bcc5576b1651378cf9959ecef04a9ba1f5817abc4a466d8d714b4"} Mar 13 09:27:31 crc kubenswrapper[4841]: I0313 09:27:31.486070 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-lpmg6" Mar 13 09:27:31 crc kubenswrapper[4841]: I0313 09:27:31.491195 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lpxn6" event={"ID":"8a43b722-1514-4a29-8935-2f1444488222","Type":"ContainerStarted","Data":"98f2221ebaabf7e4567e2d026543363aaaac93b88c380bd4d6692b94161bbbf5"} Mar 13 09:27:31 crc kubenswrapper[4841]: I0313 09:27:31.491225 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lpxn6" event={"ID":"8a43b722-1514-4a29-8935-2f1444488222","Type":"ContainerStarted","Data":"bba40f3b86f40dd112ed176a82231ea30c591104910c9f516f7b7c4ff8f09876"} Mar 13 09:27:31 crc kubenswrapper[4841]: I0313 09:27:31.491236 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-lpxn6" event={"ID":"8a43b722-1514-4a29-8935-2f1444488222","Type":"ContainerStarted","Data":"455c1924e560bcb5374c43c46139a4368a74c4f687aec06bbeb246d658243abf"} Mar 13 09:27:31 crc kubenswrapper[4841]: I0313 09:27:31.491245 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lpxn6" event={"ID":"8a43b722-1514-4a29-8935-2f1444488222","Type":"ContainerStarted","Data":"ae6259c11a7f6127e8a2a927feaacf632ce5b6553d07ba8d116e8a8a9d929c44"} Mar 13 09:27:31 crc kubenswrapper[4841]: I0313 09:27:31.491254 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lpxn6" event={"ID":"8a43b722-1514-4a29-8935-2f1444488222","Type":"ContainerStarted","Data":"1e41f31bb6c8097df3496b7bb97aa8a4e8ed341136585b6e091bbc0169a75d35"} Mar 13 09:27:32 crc kubenswrapper[4841]: I0313 09:27:32.505825 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lpxn6" event={"ID":"8a43b722-1514-4a29-8935-2f1444488222","Type":"ContainerStarted","Data":"77b241ca2e27386ebdbd8f9e36ff2695326d3000fba48fb532f49a24e256456c"} Mar 13 09:27:32 crc kubenswrapper[4841]: I0313 09:27:32.506208 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:32 crc kubenswrapper[4841]: I0313 09:27:32.545031 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-lpxn6" podStartSLOduration=6.102221236 podStartE2EDuration="13.545007728s" podCreationTimestamp="2026-03-13 09:27:19 +0000 UTC" firstStartedPulling="2026-03-13 09:27:20.005253115 +0000 UTC m=+922.735153306" lastFinishedPulling="2026-03-13 09:27:27.448039587 +0000 UTC m=+930.177939798" observedRunningTime="2026-03-13 09:27:32.542719336 +0000 UTC m=+935.272619567" watchObservedRunningTime="2026-03-13 09:27:32.545007728 +0000 UTC m=+935.274907959" Mar 13 09:27:34 crc kubenswrapper[4841]: I0313 09:27:34.297366 4841 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-g85zf"] Mar 13 09:27:34 crc kubenswrapper[4841]: I0313 09:27:34.303004 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-g85zf" Mar 13 09:27:34 crc kubenswrapper[4841]: I0313 09:27:34.310914 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 13 09:27:34 crc kubenswrapper[4841]: I0313 09:27:34.310933 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 13 09:27:34 crc kubenswrapper[4841]: I0313 09:27:34.324768 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-g85zf"] Mar 13 09:27:34 crc kubenswrapper[4841]: I0313 09:27:34.325833 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mhk2\" (UniqueName: \"kubernetes.io/projected/505af1e0-610a-469a-ba1f-c6d76f335c20-kube-api-access-5mhk2\") pod \"openstack-operator-index-g85zf\" (UID: \"505af1e0-610a-469a-ba1f-c6d76f335c20\") " pod="openstack-operators/openstack-operator-index-g85zf" Mar 13 09:27:34 crc kubenswrapper[4841]: I0313 09:27:34.427315 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mhk2\" (UniqueName: \"kubernetes.io/projected/505af1e0-610a-469a-ba1f-c6d76f335c20-kube-api-access-5mhk2\") pod \"openstack-operator-index-g85zf\" (UID: \"505af1e0-610a-469a-ba1f-c6d76f335c20\") " pod="openstack-operators/openstack-operator-index-g85zf" Mar 13 09:27:34 crc kubenswrapper[4841]: I0313 09:27:34.446822 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mhk2\" (UniqueName: \"kubernetes.io/projected/505af1e0-610a-469a-ba1f-c6d76f335c20-kube-api-access-5mhk2\") pod \"openstack-operator-index-g85zf\" (UID: \"505af1e0-610a-469a-ba1f-c6d76f335c20\") " 
pod="openstack-operators/openstack-operator-index-g85zf" Mar 13 09:27:34 crc kubenswrapper[4841]: I0313 09:27:34.634975 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-g85zf" Mar 13 09:27:34 crc kubenswrapper[4841]: I0313 09:27:34.897860 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:34 crc kubenswrapper[4841]: I0313 09:27:34.949166 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:35 crc kubenswrapper[4841]: I0313 09:27:35.046985 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-g85zf"] Mar 13 09:27:35 crc kubenswrapper[4841]: I0313 09:27:35.526002 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-g85zf" event={"ID":"505af1e0-610a-469a-ba1f-c6d76f335c20","Type":"ContainerStarted","Data":"02d2f59846c07605176cd272ba1f1c1699e83000d03568a057fded6680776b7e"} Mar 13 09:27:37 crc kubenswrapper[4841]: I0313 09:27:37.544714 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-g85zf" event={"ID":"505af1e0-610a-469a-ba1f-c6d76f335c20","Type":"ContainerStarted","Data":"1eebd61dd6f7761a6884d6872ff3062fbf6a66ee091434c937b501f672de5f60"} Mar 13 09:27:37 crc kubenswrapper[4841]: I0313 09:27:37.579601 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-g85zf" podStartSLOduration=1.497421545 podStartE2EDuration="3.579571777s" podCreationTimestamp="2026-03-13 09:27:34 +0000 UTC" firstStartedPulling="2026-03-13 09:27:35.058919828 +0000 UTC m=+937.788820049" lastFinishedPulling="2026-03-13 09:27:37.14107009 +0000 UTC m=+939.870970281" observedRunningTime="2026-03-13 09:27:37.569238076 +0000 UTC m=+940.299138307" 
watchObservedRunningTime="2026-03-13 09:27:37.579571777 +0000 UTC m=+940.309471998" Mar 13 09:27:37 crc kubenswrapper[4841]: I0313 09:27:37.668304 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-g85zf"] Mar 13 09:27:38 crc kubenswrapper[4841]: I0313 09:27:38.277697 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-n25n5"] Mar 13 09:27:38 crc kubenswrapper[4841]: I0313 09:27:38.278903 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-n25n5" Mar 13 09:27:38 crc kubenswrapper[4841]: I0313 09:27:38.281625 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-jzcdr" Mar 13 09:27:38 crc kubenswrapper[4841]: I0313 09:27:38.294217 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-n25n5"] Mar 13 09:27:38 crc kubenswrapper[4841]: I0313 09:27:38.386692 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wfl4\" (UniqueName: \"kubernetes.io/projected/7290f225-3489-4643-916d-39a67a36acb2-kube-api-access-2wfl4\") pod \"openstack-operator-index-n25n5\" (UID: \"7290f225-3489-4643-916d-39a67a36acb2\") " pod="openstack-operators/openstack-operator-index-n25n5" Mar 13 09:27:38 crc kubenswrapper[4841]: I0313 09:27:38.487870 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wfl4\" (UniqueName: \"kubernetes.io/projected/7290f225-3489-4643-916d-39a67a36acb2-kube-api-access-2wfl4\") pod \"openstack-operator-index-n25n5\" (UID: \"7290f225-3489-4643-916d-39a67a36acb2\") " pod="openstack-operators/openstack-operator-index-n25n5" Mar 13 09:27:38 crc kubenswrapper[4841]: I0313 09:27:38.526053 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2wfl4\" (UniqueName: \"kubernetes.io/projected/7290f225-3489-4643-916d-39a67a36acb2-kube-api-access-2wfl4\") pod \"openstack-operator-index-n25n5\" (UID: \"7290f225-3489-4643-916d-39a67a36acb2\") " pod="openstack-operators/openstack-operator-index-n25n5" Mar 13 09:27:38 crc kubenswrapper[4841]: I0313 09:27:38.610020 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-n25n5" Mar 13 09:27:38 crc kubenswrapper[4841]: I0313 09:27:38.855292 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-n25n5"] Mar 13 09:27:38 crc kubenswrapper[4841]: W0313 09:27:38.862692 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7290f225_3489_4643_916d_39a67a36acb2.slice/crio-ee1d0974dd916b481a5ff27cc0ba56c47c4ff90b7a0b7202c5b3ecd7b058d9c9 WatchSource:0}: Error finding container ee1d0974dd916b481a5ff27cc0ba56c47c4ff90b7a0b7202c5b3ecd7b058d9c9: Status 404 returned error can't find the container with id ee1d0974dd916b481a5ff27cc0ba56c47c4ff90b7a0b7202c5b3ecd7b058d9c9 Mar 13 09:27:39 crc kubenswrapper[4841]: I0313 09:27:39.560297 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-n25n5" event={"ID":"7290f225-3489-4643-916d-39a67a36acb2","Type":"ContainerStarted","Data":"97df9f6ba99401ae07d059241bd6f0d5a7a527e15fb858c3fc35dda77ad3dc30"} Mar 13 09:27:39 crc kubenswrapper[4841]: I0313 09:27:39.560349 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-n25n5" event={"ID":"7290f225-3489-4643-916d-39a67a36acb2","Type":"ContainerStarted","Data":"ee1d0974dd916b481a5ff27cc0ba56c47c4ff90b7a0b7202c5b3ecd7b058d9c9"} Mar 13 09:27:39 crc kubenswrapper[4841]: I0313 09:27:39.560549 4841 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-operators/openstack-operator-index-g85zf" podUID="505af1e0-610a-469a-ba1f-c6d76f335c20" containerName="registry-server" containerID="cri-o://1eebd61dd6f7761a6884d6872ff3062fbf6a66ee091434c937b501f672de5f60" gracePeriod=2 Mar 13 09:27:39 crc kubenswrapper[4841]: I0313 09:27:39.586156 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-n25n5" podStartSLOduration=1.528541037 podStartE2EDuration="1.586130179s" podCreationTimestamp="2026-03-13 09:27:38 +0000 UTC" firstStartedPulling="2026-03-13 09:27:38.867259612 +0000 UTC m=+941.597159843" lastFinishedPulling="2026-03-13 09:27:38.924848754 +0000 UTC m=+941.654748985" observedRunningTime="2026-03-13 09:27:39.579517703 +0000 UTC m=+942.309417924" watchObservedRunningTime="2026-03-13 09:27:39.586130179 +0000 UTC m=+942.316030390" Mar 13 09:27:39 crc kubenswrapper[4841]: I0313 09:27:39.987721 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-g85zf" Mar 13 09:27:40 crc kubenswrapper[4841]: I0313 09:27:40.017233 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-6x8lt" Mar 13 09:27:40 crc kubenswrapper[4841]: I0313 09:27:40.108316 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mhk2\" (UniqueName: \"kubernetes.io/projected/505af1e0-610a-469a-ba1f-c6d76f335c20-kube-api-access-5mhk2\") pod \"505af1e0-610a-469a-ba1f-c6d76f335c20\" (UID: \"505af1e0-610a-469a-ba1f-c6d76f335c20\") " Mar 13 09:27:40 crc kubenswrapper[4841]: I0313 09:27:40.116459 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/505af1e0-610a-469a-ba1f-c6d76f335c20-kube-api-access-5mhk2" (OuterVolumeSpecName: "kube-api-access-5mhk2") pod "505af1e0-610a-469a-ba1f-c6d76f335c20" (UID: "505af1e0-610a-469a-ba1f-c6d76f335c20"). 
InnerVolumeSpecName "kube-api-access-5mhk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:27:40 crc kubenswrapper[4841]: I0313 09:27:40.210499 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mhk2\" (UniqueName: \"kubernetes.io/projected/505af1e0-610a-469a-ba1f-c6d76f335c20-kube-api-access-5mhk2\") on node \"crc\" DevicePath \"\"" Mar 13 09:27:40 crc kubenswrapper[4841]: I0313 09:27:40.510753 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qnhkg" Mar 13 09:27:40 crc kubenswrapper[4841]: I0313 09:27:40.569309 4841 generic.go:334] "Generic (PLEG): container finished" podID="505af1e0-610a-469a-ba1f-c6d76f335c20" containerID="1eebd61dd6f7761a6884d6872ff3062fbf6a66ee091434c937b501f672de5f60" exitCode=0 Mar 13 09:27:40 crc kubenswrapper[4841]: I0313 09:27:40.569417 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-g85zf" event={"ID":"505af1e0-610a-469a-ba1f-c6d76f335c20","Type":"ContainerDied","Data":"1eebd61dd6f7761a6884d6872ff3062fbf6a66ee091434c937b501f672de5f60"} Mar 13 09:27:40 crc kubenswrapper[4841]: I0313 09:27:40.569495 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-g85zf" event={"ID":"505af1e0-610a-469a-ba1f-c6d76f335c20","Type":"ContainerDied","Data":"02d2f59846c07605176cd272ba1f1c1699e83000d03568a057fded6680776b7e"} Mar 13 09:27:40 crc kubenswrapper[4841]: I0313 09:27:40.569523 4841 scope.go:117] "RemoveContainer" containerID="1eebd61dd6f7761a6884d6872ff3062fbf6a66ee091434c937b501f672de5f60" Mar 13 09:27:40 crc kubenswrapper[4841]: I0313 09:27:40.569443 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-g85zf" Mar 13 09:27:40 crc kubenswrapper[4841]: I0313 09:27:40.597432 4841 scope.go:117] "RemoveContainer" containerID="1eebd61dd6f7761a6884d6872ff3062fbf6a66ee091434c937b501f672de5f60" Mar 13 09:27:40 crc kubenswrapper[4841]: E0313 09:27:40.597895 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eebd61dd6f7761a6884d6872ff3062fbf6a66ee091434c937b501f672de5f60\": container with ID starting with 1eebd61dd6f7761a6884d6872ff3062fbf6a66ee091434c937b501f672de5f60 not found: ID does not exist" containerID="1eebd61dd6f7761a6884d6872ff3062fbf6a66ee091434c937b501f672de5f60" Mar 13 09:27:40 crc kubenswrapper[4841]: I0313 09:27:40.597938 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eebd61dd6f7761a6884d6872ff3062fbf6a66ee091434c937b501f672de5f60"} err="failed to get container status \"1eebd61dd6f7761a6884d6872ff3062fbf6a66ee091434c937b501f672de5f60\": rpc error: code = NotFound desc = could not find container \"1eebd61dd6f7761a6884d6872ff3062fbf6a66ee091434c937b501f672de5f60\": container with ID starting with 1eebd61dd6f7761a6884d6872ff3062fbf6a66ee091434c937b501f672de5f60 not found: ID does not exist" Mar 13 09:27:40 crc kubenswrapper[4841]: I0313 09:27:40.607384 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-g85zf"] Mar 13 09:27:40 crc kubenswrapper[4841]: I0313 09:27:40.614041 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-g85zf"] Mar 13 09:27:42 crc kubenswrapper[4841]: I0313 09:27:42.002612 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="505af1e0-610a-469a-ba1f-c6d76f335c20" path="/var/lib/kubelet/pods/505af1e0-610a-469a-ba1f-c6d76f335c20/volumes" Mar 13 09:27:48 crc kubenswrapper[4841]: I0313 09:27:48.611065 4841 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-n25n5" Mar 13 09:27:48 crc kubenswrapper[4841]: I0313 09:27:48.612350 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-n25n5" Mar 13 09:27:48 crc kubenswrapper[4841]: I0313 09:27:48.654872 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-n25n5" Mar 13 09:27:48 crc kubenswrapper[4841]: I0313 09:27:48.688892 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-n25n5" Mar 13 09:27:49 crc kubenswrapper[4841]: I0313 09:27:49.900208 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-lpxn6" Mar 13 09:27:55 crc kubenswrapper[4841]: I0313 09:27:55.912880 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq"] Mar 13 09:27:55 crc kubenswrapper[4841]: E0313 09:27:55.913469 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="505af1e0-610a-469a-ba1f-c6d76f335c20" containerName="registry-server" Mar 13 09:27:55 crc kubenswrapper[4841]: I0313 09:27:55.913483 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="505af1e0-610a-469a-ba1f-c6d76f335c20" containerName="registry-server" Mar 13 09:27:55 crc kubenswrapper[4841]: I0313 09:27:55.913621 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="505af1e0-610a-469a-ba1f-c6d76f335c20" containerName="registry-server" Mar 13 09:27:55 crc kubenswrapper[4841]: I0313 09:27:55.914611 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq" Mar 13 09:27:55 crc kubenswrapper[4841]: I0313 09:27:55.916967 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-5zvcx" Mar 13 09:27:55 crc kubenswrapper[4841]: I0313 09:27:55.937597 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq"] Mar 13 09:27:56 crc kubenswrapper[4841]: I0313 09:27:56.042083 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8b2962c-7f7d-4d5b-9982-1668d185c680-util\") pod \"230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq\" (UID: \"a8b2962c-7f7d-4d5b-9982-1668d185c680\") " pod="openstack-operators/230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq" Mar 13 09:27:56 crc kubenswrapper[4841]: I0313 09:27:56.042144 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjxb7\" (UniqueName: \"kubernetes.io/projected/a8b2962c-7f7d-4d5b-9982-1668d185c680-kube-api-access-mjxb7\") pod \"230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq\" (UID: \"a8b2962c-7f7d-4d5b-9982-1668d185c680\") " pod="openstack-operators/230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq" Mar 13 09:27:56 crc kubenswrapper[4841]: I0313 09:27:56.042209 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8b2962c-7f7d-4d5b-9982-1668d185c680-bundle\") pod \"230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq\" (UID: \"a8b2962c-7f7d-4d5b-9982-1668d185c680\") " pod="openstack-operators/230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq" Mar 13 09:27:56 crc kubenswrapper[4841]: I0313 
09:27:56.144303 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8b2962c-7f7d-4d5b-9982-1668d185c680-bundle\") pod \"230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq\" (UID: \"a8b2962c-7f7d-4d5b-9982-1668d185c680\") " pod="openstack-operators/230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq" Mar 13 09:27:56 crc kubenswrapper[4841]: I0313 09:27:56.144389 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8b2962c-7f7d-4d5b-9982-1668d185c680-util\") pod \"230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq\" (UID: \"a8b2962c-7f7d-4d5b-9982-1668d185c680\") " pod="openstack-operators/230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq" Mar 13 09:27:56 crc kubenswrapper[4841]: I0313 09:27:56.144423 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjxb7\" (UniqueName: \"kubernetes.io/projected/a8b2962c-7f7d-4d5b-9982-1668d185c680-kube-api-access-mjxb7\") pod \"230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq\" (UID: \"a8b2962c-7f7d-4d5b-9982-1668d185c680\") " pod="openstack-operators/230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq" Mar 13 09:27:56 crc kubenswrapper[4841]: I0313 09:27:56.145406 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8b2962c-7f7d-4d5b-9982-1668d185c680-bundle\") pod \"230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq\" (UID: \"a8b2962c-7f7d-4d5b-9982-1668d185c680\") " pod="openstack-operators/230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq" Mar 13 09:27:56 crc kubenswrapper[4841]: I0313 09:27:56.145975 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/a8b2962c-7f7d-4d5b-9982-1668d185c680-util\") pod \"230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq\" (UID: \"a8b2962c-7f7d-4d5b-9982-1668d185c680\") " pod="openstack-operators/230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq" Mar 13 09:27:56 crc kubenswrapper[4841]: I0313 09:27:56.180415 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjxb7\" (UniqueName: \"kubernetes.io/projected/a8b2962c-7f7d-4d5b-9982-1668d185c680-kube-api-access-mjxb7\") pod \"230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq\" (UID: \"a8b2962c-7f7d-4d5b-9982-1668d185c680\") " pod="openstack-operators/230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq" Mar 13 09:27:56 crc kubenswrapper[4841]: I0313 09:27:56.238865 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq" Mar 13 09:27:56 crc kubenswrapper[4841]: I0313 09:27:56.742050 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq"] Mar 13 09:27:56 crc kubenswrapper[4841]: W0313 09:27:56.748218 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8b2962c_7f7d_4d5b_9982_1668d185c680.slice/crio-bd2f7ff7a67ba1126d45f37549c0be259676cd6d2bc2bafb49b572788edaefcb WatchSource:0}: Error finding container bd2f7ff7a67ba1126d45f37549c0be259676cd6d2bc2bafb49b572788edaefcb: Status 404 returned error can't find the container with id bd2f7ff7a67ba1126d45f37549c0be259676cd6d2bc2bafb49b572788edaefcb Mar 13 09:27:57 crc kubenswrapper[4841]: I0313 09:27:57.702532 4841 generic.go:334] "Generic (PLEG): container finished" podID="a8b2962c-7f7d-4d5b-9982-1668d185c680" containerID="ca46162120153934c10860d7abfb4a7011f35b4419d90571623b6a98197fc4c3" exitCode=0 Mar 13 
09:27:57 crc kubenswrapper[4841]: I0313 09:27:57.702593 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq" event={"ID":"a8b2962c-7f7d-4d5b-9982-1668d185c680","Type":"ContainerDied","Data":"ca46162120153934c10860d7abfb4a7011f35b4419d90571623b6a98197fc4c3"} Mar 13 09:27:57 crc kubenswrapper[4841]: I0313 09:27:57.702772 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq" event={"ID":"a8b2962c-7f7d-4d5b-9982-1668d185c680","Type":"ContainerStarted","Data":"bd2f7ff7a67ba1126d45f37549c0be259676cd6d2bc2bafb49b572788edaefcb"} Mar 13 09:27:58 crc kubenswrapper[4841]: I0313 09:27:58.713786 4841 generic.go:334] "Generic (PLEG): container finished" podID="a8b2962c-7f7d-4d5b-9982-1668d185c680" containerID="167edb76b00c50d7683f0cd2b416bf5a0496eab5a595b9af993b8860ec874295" exitCode=0 Mar 13 09:27:58 crc kubenswrapper[4841]: I0313 09:27:58.713887 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq" event={"ID":"a8b2962c-7f7d-4d5b-9982-1668d185c680","Type":"ContainerDied","Data":"167edb76b00c50d7683f0cd2b416bf5a0496eab5a595b9af993b8860ec874295"} Mar 13 09:27:59 crc kubenswrapper[4841]: I0313 09:27:59.725535 4841 generic.go:334] "Generic (PLEG): container finished" podID="a8b2962c-7f7d-4d5b-9982-1668d185c680" containerID="cc6ef54a7edae20738a476cc103832ffe87531b2d9dc25d393ae6e5dfaecdffa" exitCode=0 Mar 13 09:27:59 crc kubenswrapper[4841]: I0313 09:27:59.725645 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq" event={"ID":"a8b2962c-7f7d-4d5b-9982-1668d185c680","Type":"ContainerDied","Data":"cc6ef54a7edae20738a476cc103832ffe87531b2d9dc25d393ae6e5dfaecdffa"} Mar 13 09:28:00 crc kubenswrapper[4841]: I0313 09:28:00.152961 
4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556568-9n5mn"] Mar 13 09:28:00 crc kubenswrapper[4841]: I0313 09:28:00.154163 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556568-9n5mn" Mar 13 09:28:00 crc kubenswrapper[4841]: I0313 09:28:00.156953 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 09:28:00 crc kubenswrapper[4841]: I0313 09:28:00.157199 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 09:28:00 crc kubenswrapper[4841]: I0313 09:28:00.157429 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 09:28:00 crc kubenswrapper[4841]: I0313 09:28:00.164297 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556568-9n5mn"] Mar 13 09:28:00 crc kubenswrapper[4841]: I0313 09:28:00.311456 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjdfr\" (UniqueName: \"kubernetes.io/projected/aceaa228-acf8-4fad-b348-2f26e2225a80-kube-api-access-vjdfr\") pod \"auto-csr-approver-29556568-9n5mn\" (UID: \"aceaa228-acf8-4fad-b348-2f26e2225a80\") " pod="openshift-infra/auto-csr-approver-29556568-9n5mn" Mar 13 09:28:00 crc kubenswrapper[4841]: I0313 09:28:00.412744 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjdfr\" (UniqueName: \"kubernetes.io/projected/aceaa228-acf8-4fad-b348-2f26e2225a80-kube-api-access-vjdfr\") pod \"auto-csr-approver-29556568-9n5mn\" (UID: \"aceaa228-acf8-4fad-b348-2f26e2225a80\") " pod="openshift-infra/auto-csr-approver-29556568-9n5mn" Mar 13 09:28:00 crc kubenswrapper[4841]: I0313 09:28:00.436414 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vjdfr\" (UniqueName: \"kubernetes.io/projected/aceaa228-acf8-4fad-b348-2f26e2225a80-kube-api-access-vjdfr\") pod \"auto-csr-approver-29556568-9n5mn\" (UID: \"aceaa228-acf8-4fad-b348-2f26e2225a80\") " pod="openshift-infra/auto-csr-approver-29556568-9n5mn" Mar 13 09:28:00 crc kubenswrapper[4841]: I0313 09:28:00.474458 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556568-9n5mn" Mar 13 09:28:00 crc kubenswrapper[4841]: I0313 09:28:00.699287 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556568-9n5mn"] Mar 13 09:28:00 crc kubenswrapper[4841]: W0313 09:28:00.705210 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaceaa228_acf8_4fad_b348_2f26e2225a80.slice/crio-0a15ccd851487320f3d62c506489663a8ee0d04e9cfd57ead40e0bbe700c0ce5 WatchSource:0}: Error finding container 0a15ccd851487320f3d62c506489663a8ee0d04e9cfd57ead40e0bbe700c0ce5: Status 404 returned error can't find the container with id 0a15ccd851487320f3d62c506489663a8ee0d04e9cfd57ead40e0bbe700c0ce5 Mar 13 09:28:00 crc kubenswrapper[4841]: I0313 09:28:00.731465 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556568-9n5mn" event={"ID":"aceaa228-acf8-4fad-b348-2f26e2225a80","Type":"ContainerStarted","Data":"0a15ccd851487320f3d62c506489663a8ee0d04e9cfd57ead40e0bbe700c0ce5"} Mar 13 09:28:00 crc kubenswrapper[4841]: I0313 09:28:00.977948 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq" Mar 13 09:28:01 crc kubenswrapper[4841]: I0313 09:28:01.121746 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8b2962c-7f7d-4d5b-9982-1668d185c680-bundle\") pod \"a8b2962c-7f7d-4d5b-9982-1668d185c680\" (UID: \"a8b2962c-7f7d-4d5b-9982-1668d185c680\") " Mar 13 09:28:01 crc kubenswrapper[4841]: I0313 09:28:01.121807 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjxb7\" (UniqueName: \"kubernetes.io/projected/a8b2962c-7f7d-4d5b-9982-1668d185c680-kube-api-access-mjxb7\") pod \"a8b2962c-7f7d-4d5b-9982-1668d185c680\" (UID: \"a8b2962c-7f7d-4d5b-9982-1668d185c680\") " Mar 13 09:28:01 crc kubenswrapper[4841]: I0313 09:28:01.121827 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8b2962c-7f7d-4d5b-9982-1668d185c680-util\") pod \"a8b2962c-7f7d-4d5b-9982-1668d185c680\" (UID: \"a8b2962c-7f7d-4d5b-9982-1668d185c680\") " Mar 13 09:28:01 crc kubenswrapper[4841]: I0313 09:28:01.123496 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8b2962c-7f7d-4d5b-9982-1668d185c680-bundle" (OuterVolumeSpecName: "bundle") pod "a8b2962c-7f7d-4d5b-9982-1668d185c680" (UID: "a8b2962c-7f7d-4d5b-9982-1668d185c680"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:28:01 crc kubenswrapper[4841]: I0313 09:28:01.130047 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8b2962c-7f7d-4d5b-9982-1668d185c680-kube-api-access-mjxb7" (OuterVolumeSpecName: "kube-api-access-mjxb7") pod "a8b2962c-7f7d-4d5b-9982-1668d185c680" (UID: "a8b2962c-7f7d-4d5b-9982-1668d185c680"). InnerVolumeSpecName "kube-api-access-mjxb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:28:01 crc kubenswrapper[4841]: I0313 09:28:01.152816 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8b2962c-7f7d-4d5b-9982-1668d185c680-util" (OuterVolumeSpecName: "util") pod "a8b2962c-7f7d-4d5b-9982-1668d185c680" (UID: "a8b2962c-7f7d-4d5b-9982-1668d185c680"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:28:01 crc kubenswrapper[4841]: I0313 09:28:01.224055 4841 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8b2962c-7f7d-4d5b-9982-1668d185c680-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:28:01 crc kubenswrapper[4841]: I0313 09:28:01.224093 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjxb7\" (UniqueName: \"kubernetes.io/projected/a8b2962c-7f7d-4d5b-9982-1668d185c680-kube-api-access-mjxb7\") on node \"crc\" DevicePath \"\"" Mar 13 09:28:01 crc kubenswrapper[4841]: I0313 09:28:01.224106 4841 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8b2962c-7f7d-4d5b-9982-1668d185c680-util\") on node \"crc\" DevicePath \"\"" Mar 13 09:28:01 crc kubenswrapper[4841]: I0313 09:28:01.747968 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq" event={"ID":"a8b2962c-7f7d-4d5b-9982-1668d185c680","Type":"ContainerDied","Data":"bd2f7ff7a67ba1126d45f37549c0be259676cd6d2bc2bafb49b572788edaefcb"} Mar 13 09:28:01 crc kubenswrapper[4841]: I0313 09:28:01.748012 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd2f7ff7a67ba1126d45f37549c0be259676cd6d2bc2bafb49b572788edaefcb" Mar 13 09:28:01 crc kubenswrapper[4841]: I0313 09:28:01.748063 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq" Mar 13 09:28:02 crc kubenswrapper[4841]: I0313 09:28:02.756408 4841 generic.go:334] "Generic (PLEG): container finished" podID="aceaa228-acf8-4fad-b348-2f26e2225a80" containerID="b93289622c1c831313f1861efcd99f106369f240a30eeb9fc573964c93ecc6b3" exitCode=0 Mar 13 09:28:02 crc kubenswrapper[4841]: I0313 09:28:02.756506 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556568-9n5mn" event={"ID":"aceaa228-acf8-4fad-b348-2f26e2225a80","Type":"ContainerDied","Data":"b93289622c1c831313f1861efcd99f106369f240a30eeb9fc573964c93ecc6b3"} Mar 13 09:28:04 crc kubenswrapper[4841]: I0313 09:28:04.065942 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556568-9n5mn" Mar 13 09:28:04 crc kubenswrapper[4841]: I0313 09:28:04.166500 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjdfr\" (UniqueName: \"kubernetes.io/projected/aceaa228-acf8-4fad-b348-2f26e2225a80-kube-api-access-vjdfr\") pod \"aceaa228-acf8-4fad-b348-2f26e2225a80\" (UID: \"aceaa228-acf8-4fad-b348-2f26e2225a80\") " Mar 13 09:28:04 crc kubenswrapper[4841]: I0313 09:28:04.174639 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aceaa228-acf8-4fad-b348-2f26e2225a80-kube-api-access-vjdfr" (OuterVolumeSpecName: "kube-api-access-vjdfr") pod "aceaa228-acf8-4fad-b348-2f26e2225a80" (UID: "aceaa228-acf8-4fad-b348-2f26e2225a80"). InnerVolumeSpecName "kube-api-access-vjdfr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:28:04 crc kubenswrapper[4841]: I0313 09:28:04.272471 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjdfr\" (UniqueName: \"kubernetes.io/projected/aceaa228-acf8-4fad-b348-2f26e2225a80-kube-api-access-vjdfr\") on node \"crc\" DevicePath \"\"" Mar 13 09:28:04 crc kubenswrapper[4841]: I0313 09:28:04.407837 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:28:04 crc kubenswrapper[4841]: I0313 09:28:04.407912 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:28:04 crc kubenswrapper[4841]: I0313 09:28:04.780614 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556568-9n5mn" event={"ID":"aceaa228-acf8-4fad-b348-2f26e2225a80","Type":"ContainerDied","Data":"0a15ccd851487320f3d62c506489663a8ee0d04e9cfd57ead40e0bbe700c0ce5"} Mar 13 09:28:04 crc kubenswrapper[4841]: I0313 09:28:04.780651 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a15ccd851487320f3d62c506489663a8ee0d04e9cfd57ead40e0bbe700c0ce5" Mar 13 09:28:04 crc kubenswrapper[4841]: I0313 09:28:04.780701 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556568-9n5mn" Mar 13 09:28:05 crc kubenswrapper[4841]: I0313 09:28:05.155418 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556562-265fg"] Mar 13 09:28:05 crc kubenswrapper[4841]: I0313 09:28:05.162746 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556562-265fg"] Mar 13 09:28:06 crc kubenswrapper[4841]: I0313 09:28:06.009918 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62838214-0b31-49b3-bac8-7c0c8ce58141" path="/var/lib/kubelet/pods/62838214-0b31-49b3-bac8-7c0c8ce58141/volumes" Mar 13 09:28:08 crc kubenswrapper[4841]: I0313 09:28:08.774862 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6bc596d67-h66cj"] Mar 13 09:28:08 crc kubenswrapper[4841]: E0313 09:28:08.775410 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aceaa228-acf8-4fad-b348-2f26e2225a80" containerName="oc" Mar 13 09:28:08 crc kubenswrapper[4841]: I0313 09:28:08.775423 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="aceaa228-acf8-4fad-b348-2f26e2225a80" containerName="oc" Mar 13 09:28:08 crc kubenswrapper[4841]: E0313 09:28:08.775433 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8b2962c-7f7d-4d5b-9982-1668d185c680" containerName="pull" Mar 13 09:28:08 crc kubenswrapper[4841]: I0313 09:28:08.775439 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8b2962c-7f7d-4d5b-9982-1668d185c680" containerName="pull" Mar 13 09:28:08 crc kubenswrapper[4841]: E0313 09:28:08.775449 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8b2962c-7f7d-4d5b-9982-1668d185c680" containerName="extract" Mar 13 09:28:08 crc kubenswrapper[4841]: I0313 09:28:08.775455 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8b2962c-7f7d-4d5b-9982-1668d185c680" containerName="extract" 
Mar 13 09:28:08 crc kubenswrapper[4841]: E0313 09:28:08.775469 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8b2962c-7f7d-4d5b-9982-1668d185c680" containerName="util" Mar 13 09:28:08 crc kubenswrapper[4841]: I0313 09:28:08.775474 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8b2962c-7f7d-4d5b-9982-1668d185c680" containerName="util" Mar 13 09:28:08 crc kubenswrapper[4841]: I0313 09:28:08.775574 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8b2962c-7f7d-4d5b-9982-1668d185c680" containerName="extract" Mar 13 09:28:08 crc kubenswrapper[4841]: I0313 09:28:08.775588 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="aceaa228-acf8-4fad-b348-2f26e2225a80" containerName="oc" Mar 13 09:28:08 crc kubenswrapper[4841]: I0313 09:28:08.775935 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6bc596d67-h66cj" Mar 13 09:28:08 crc kubenswrapper[4841]: I0313 09:28:08.778951 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-698rc" Mar 13 09:28:08 crc kubenswrapper[4841]: I0313 09:28:08.805198 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6bc596d67-h66cj"] Mar 13 09:28:08 crc kubenswrapper[4841]: I0313 09:28:08.954652 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fdc6\" (UniqueName: \"kubernetes.io/projected/08c8df77-ecb6-4f32-b9a1-b31bf7a0d1c4-kube-api-access-6fdc6\") pod \"openstack-operator-controller-init-6bc596d67-h66cj\" (UID: \"08c8df77-ecb6-4f32-b9a1-b31bf7a0d1c4\") " pod="openstack-operators/openstack-operator-controller-init-6bc596d67-h66cj" Mar 13 09:28:09 crc kubenswrapper[4841]: I0313 09:28:09.055943 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6fdc6\" (UniqueName: \"kubernetes.io/projected/08c8df77-ecb6-4f32-b9a1-b31bf7a0d1c4-kube-api-access-6fdc6\") pod \"openstack-operator-controller-init-6bc596d67-h66cj\" (UID: \"08c8df77-ecb6-4f32-b9a1-b31bf7a0d1c4\") " pod="openstack-operators/openstack-operator-controller-init-6bc596d67-h66cj" Mar 13 09:28:09 crc kubenswrapper[4841]: I0313 09:28:09.076771 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fdc6\" (UniqueName: \"kubernetes.io/projected/08c8df77-ecb6-4f32-b9a1-b31bf7a0d1c4-kube-api-access-6fdc6\") pod \"openstack-operator-controller-init-6bc596d67-h66cj\" (UID: \"08c8df77-ecb6-4f32-b9a1-b31bf7a0d1c4\") " pod="openstack-operators/openstack-operator-controller-init-6bc596d67-h66cj" Mar 13 09:28:09 crc kubenswrapper[4841]: I0313 09:28:09.093123 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6bc596d67-h66cj" Mar 13 09:28:09 crc kubenswrapper[4841]: I0313 09:28:09.526903 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6bc596d67-h66cj"] Mar 13 09:28:09 crc kubenswrapper[4841]: I0313 09:28:09.812814 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6bc596d67-h66cj" event={"ID":"08c8df77-ecb6-4f32-b9a1-b31bf7a0d1c4","Type":"ContainerStarted","Data":"a1b20de41bd01b26a2c885b6fe6ab2919394966d8cd16f563ad76de8b5e463f0"} Mar 13 09:28:13 crc kubenswrapper[4841]: I0313 09:28:13.844795 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6bc596d67-h66cj" event={"ID":"08c8df77-ecb6-4f32-b9a1-b31bf7a0d1c4","Type":"ContainerStarted","Data":"0b8ae822054a77bee5339cadf2a50f9194090d737fc025b9523b398cbbba2051"} Mar 13 09:28:13 crc kubenswrapper[4841]: I0313 09:28:13.845474 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-init-6bc596d67-h66cj" Mar 13 09:28:13 crc kubenswrapper[4841]: I0313 09:28:13.875929 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6bc596d67-h66cj" podStartSLOduration=1.908003618 podStartE2EDuration="5.875904588s" podCreationTimestamp="2026-03-13 09:28:08 +0000 UTC" firstStartedPulling="2026-03-13 09:28:09.535710115 +0000 UTC m=+972.265610346" lastFinishedPulling="2026-03-13 09:28:13.503611135 +0000 UTC m=+976.233511316" observedRunningTime="2026-03-13 09:28:13.87402233 +0000 UTC m=+976.603922561" watchObservedRunningTime="2026-03-13 09:28:13.875904588 +0000 UTC m=+976.605804819" Mar 13 09:28:19 crc kubenswrapper[4841]: I0313 09:28:19.096647 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6bc596d67-h66cj" Mar 13 09:28:34 crc kubenswrapper[4841]: I0313 09:28:34.407306 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:28:34 crc kubenswrapper[4841]: I0313 09:28:34.407907 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.440420 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-fkfjh"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.441893 4841 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-fkfjh" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.452188 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-v4fw6" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.456458 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-cgkfz"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.457711 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-cgkfz" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.459562 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-7c5rn" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.462753 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-fkfjh"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.478976 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-cgkfz"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.515063 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-kc2zl"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.516050 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-kc2zl" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.517656 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-l9smz" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.526477 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-kc2zl"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.528606 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b42m\" (UniqueName: \"kubernetes.io/projected/7bd9dd2f-b4fd-4078-b463-4e970fa6791d-kube-api-access-9b42m\") pod \"cinder-operator-controller-manager-984cd4dcf-cgkfz\" (UID: \"7bd9dd2f-b4fd-4078-b463-4e970fa6791d\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-cgkfz" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.528675 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp99d\" (UniqueName: \"kubernetes.io/projected/9db8c27e-023c-4e28-a381-24f4438a6add-kube-api-access-bp99d\") pod \"barbican-operator-controller-manager-677bd678f7-fkfjh\" (UID: \"9db8c27e-023c-4e28-a381-24f4438a6add\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-fkfjh" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.538846 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-7lhjg"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.539629 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-7lhjg" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.543644 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-hq2lk" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.544494 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-7lhjg"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.550830 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-vtz8l"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.551820 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vtz8l" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.558328 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-hm824"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.558929 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.558947 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-2rtrk" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.559098 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-hm824" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.562741 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-qn8mk" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.568936 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-vtz8l"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.574404 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-hm824"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.581626 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-grn97"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.582372 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-grn97" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.591288 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-6qd85" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.592125 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-gb4dz"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.592912 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-gb4dz" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.594416 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-q8jjr" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.597876 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-grn97"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.605167 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-56nwc"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.606128 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-56nwc" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.609875 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-xtwjr" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.610072 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-gb4dz"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.615998 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-56nwc"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.627059 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-9gcr6"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.628306 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9gcr6" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.632217 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rx4wh"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.633165 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rx4wh" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.633634 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh2gw\" (UniqueName: \"kubernetes.io/projected/e6b9c8a5-3093-4d94-ad46-cd682158fdf8-kube-api-access-mh2gw\") pod \"heat-operator-controller-manager-77b6666d85-hm824\" (UID: \"e6b9c8a5-3093-4d94-ad46-cd682158fdf8\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-hm824" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.633837 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czzwk\" (UniqueName: \"kubernetes.io/projected/c6c9dfcd-5298-468b-9de2-0280bf525b61-kube-api-access-czzwk\") pod \"designate-operator-controller-manager-66d56f6ff4-kc2zl\" (UID: \"c6c9dfcd-5298-468b-9de2-0280bf525b61\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-kc2zl" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.633960 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b42m\" (UniqueName: \"kubernetes.io/projected/7bd9dd2f-b4fd-4078-b463-4e970fa6791d-kube-api-access-9b42m\") pod \"cinder-operator-controller-manager-984cd4dcf-cgkfz\" (UID: \"7bd9dd2f-b4fd-4078-b463-4e970fa6791d\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-cgkfz" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.634069 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26236923-39c0-4b46-be0d-61f453533891-cert\") pod \"infra-operator-controller-manager-5995f4446f-vtz8l\" (UID: \"26236923-39c0-4b46-be0d-61f453533891\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vtz8l" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.634183 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhqdk\" (UniqueName: \"kubernetes.io/projected/26236923-39c0-4b46-be0d-61f453533891-kube-api-access-lhqdk\") pod \"infra-operator-controller-manager-5995f4446f-vtz8l\" (UID: \"26236923-39c0-4b46-be0d-61f453533891\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vtz8l" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.634109 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-rlg8k" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.636987 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-xxcr4" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.638668 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvqdp\" (UniqueName: \"kubernetes.io/projected/46ca7c55-bd68-4454-a014-85f81f1b5a60-kube-api-access-tvqdp\") pod \"glance-operator-controller-manager-5964f64c48-7lhjg\" (UID: \"46ca7c55-bd68-4454-a014-85f81f1b5a60\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-7lhjg" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.638880 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp99d\" (UniqueName: 
\"kubernetes.io/projected/9db8c27e-023c-4e28-a381-24f4438a6add-kube-api-access-bp99d\") pod \"barbican-operator-controller-manager-677bd678f7-fkfjh\" (UID: \"9db8c27e-023c-4e28-a381-24f4438a6add\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-fkfjh" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.659902 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rx4wh"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.677888 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b42m\" (UniqueName: \"kubernetes.io/projected/7bd9dd2f-b4fd-4078-b463-4e970fa6791d-kube-api-access-9b42m\") pod \"cinder-operator-controller-manager-984cd4dcf-cgkfz\" (UID: \"7bd9dd2f-b4fd-4078-b463-4e970fa6791d\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-cgkfz" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.689994 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp99d\" (UniqueName: \"kubernetes.io/projected/9db8c27e-023c-4e28-a381-24f4438a6add-kube-api-access-bp99d\") pod \"barbican-operator-controller-manager-677bd678f7-fkfjh\" (UID: \"9db8c27e-023c-4e28-a381-24f4438a6add\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-fkfjh" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.713887 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-9gcr6"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.736671 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-8f2sj"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.737678 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-8f2sj" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.739630 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-mlh79" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.740793 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tctn5\" (UniqueName: \"kubernetes.io/projected/98cec4ba-d672-4627-8d37-46a0684fc284-kube-api-access-tctn5\") pod \"ironic-operator-controller-manager-6bbb499bbc-gb4dz\" (UID: \"98cec4ba-d672-4627-8d37-46a0684fc284\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-gb4dz" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.740840 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbzkb\" (UniqueName: \"kubernetes.io/projected/90c1dec3-4daa-4ac6-b95e-209cb8bd9b55-kube-api-access-bbzkb\") pod \"mariadb-operator-controller-manager-658d4cdd5-rx4wh\" (UID: \"90c1dec3-4daa-4ac6-b95e-209cb8bd9b55\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rx4wh" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.740886 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh2gw\" (UniqueName: \"kubernetes.io/projected/e6b9c8a5-3093-4d94-ad46-cd682158fdf8-kube-api-access-mh2gw\") pod \"heat-operator-controller-manager-77b6666d85-hm824\" (UID: \"e6b9c8a5-3093-4d94-ad46-cd682158fdf8\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-hm824" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.740925 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czzwk\" (UniqueName: \"kubernetes.io/projected/c6c9dfcd-5298-468b-9de2-0280bf525b61-kube-api-access-czzwk\") pod 
\"designate-operator-controller-manager-66d56f6ff4-kc2zl\" (UID: \"c6c9dfcd-5298-468b-9de2-0280bf525b61\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-kc2zl" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.740942 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwl2n\" (UniqueName: \"kubernetes.io/projected/bad08b57-dde0-496d-8ea1-5845a52d517a-kube-api-access-pwl2n\") pod \"horizon-operator-controller-manager-6d9d6b584d-grn97\" (UID: \"bad08b57-dde0-496d-8ea1-5845a52d517a\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-grn97" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.740958 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zppfn\" (UniqueName: \"kubernetes.io/projected/4a76403b-081b-4222-a707-4cd00dd440a0-kube-api-access-zppfn\") pod \"keystone-operator-controller-manager-684f77d66d-56nwc\" (UID: \"4a76403b-081b-4222-a707-4cd00dd440a0\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-56nwc" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.740975 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26236923-39c0-4b46-be0d-61f453533891-cert\") pod \"infra-operator-controller-manager-5995f4446f-vtz8l\" (UID: \"26236923-39c0-4b46-be0d-61f453533891\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vtz8l" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.740994 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhqdk\" (UniqueName: \"kubernetes.io/projected/26236923-39c0-4b46-be0d-61f453533891-kube-api-access-lhqdk\") pod \"infra-operator-controller-manager-5995f4446f-vtz8l\" (UID: \"26236923-39c0-4b46-be0d-61f453533891\") " 
pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vtz8l" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.741012 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npxfq\" (UniqueName: \"kubernetes.io/projected/44d006b0-b13e-49ce-8ff8-592f3d8798c1-kube-api-access-npxfq\") pod \"manila-operator-controller-manager-68f45f9d9f-9gcr6\" (UID: \"44d006b0-b13e-49ce-8ff8-592f3d8798c1\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9gcr6" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.741039 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvqdp\" (UniqueName: \"kubernetes.io/projected/46ca7c55-bd68-4454-a014-85f81f1b5a60-kube-api-access-tvqdp\") pod \"glance-operator-controller-manager-5964f64c48-7lhjg\" (UID: \"46ca7c55-bd68-4454-a014-85f81f1b5a60\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-7lhjg" Mar 13 09:28:56 crc kubenswrapper[4841]: E0313 09:28:56.741458 4841 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 09:28:56 crc kubenswrapper[4841]: E0313 09:28:56.741498 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26236923-39c0-4b46-be0d-61f453533891-cert podName:26236923-39c0-4b46-be0d-61f453533891 nodeName:}" failed. No retries permitted until 2026-03-13 09:28:57.241483309 +0000 UTC m=+1019.971383580 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26236923-39c0-4b46-be0d-61f453533891-cert") pod "infra-operator-controller-manager-5995f4446f-vtz8l" (UID: "26236923-39c0-4b46-be0d-61f453533891") : secret "infra-operator-webhook-server-cert" not found Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.748332 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-8f2sj"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.766555 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-fkfjh" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.769661 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-chf9m"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.769868 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhqdk\" (UniqueName: \"kubernetes.io/projected/26236923-39c0-4b46-be0d-61f453533891-kube-api-access-lhqdk\") pod \"infra-operator-controller-manager-5995f4446f-vtz8l\" (UID: \"26236923-39c0-4b46-be0d-61f453533891\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vtz8l" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.770429 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-chf9m" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.771864 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-cgkfz" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.773665 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-d2thn" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.776108 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvqdp\" (UniqueName: \"kubernetes.io/projected/46ca7c55-bd68-4454-a014-85f81f1b5a60-kube-api-access-tvqdp\") pod \"glance-operator-controller-manager-5964f64c48-7lhjg\" (UID: \"46ca7c55-bd68-4454-a014-85f81f1b5a60\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-7lhjg" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.782057 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh2gw\" (UniqueName: \"kubernetes.io/projected/e6b9c8a5-3093-4d94-ad46-cd682158fdf8-kube-api-access-mh2gw\") pod \"heat-operator-controller-manager-77b6666d85-hm824\" (UID: \"e6b9c8a5-3093-4d94-ad46-cd682158fdf8\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-hm824" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.795726 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czzwk\" (UniqueName: \"kubernetes.io/projected/c6c9dfcd-5298-468b-9de2-0280bf525b61-kube-api-access-czzwk\") pod \"designate-operator-controller-manager-66d56f6ff4-kc2zl\" (UID: \"c6c9dfcd-5298-468b-9de2-0280bf525b61\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-kc2zl" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.831209 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-8tg7w"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.832338 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-8tg7w" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.833119 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-kc2zl" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.835405 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-bftgg" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.846833 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbzkb\" (UniqueName: \"kubernetes.io/projected/90c1dec3-4daa-4ac6-b95e-209cb8bd9b55-kube-api-access-bbzkb\") pod \"mariadb-operator-controller-manager-658d4cdd5-rx4wh\" (UID: \"90c1dec3-4daa-4ac6-b95e-209cb8bd9b55\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rx4wh" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.847017 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwl2n\" (UniqueName: \"kubernetes.io/projected/bad08b57-dde0-496d-8ea1-5845a52d517a-kube-api-access-pwl2n\") pod \"horizon-operator-controller-manager-6d9d6b584d-grn97\" (UID: \"bad08b57-dde0-496d-8ea1-5845a52d517a\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-grn97" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.847105 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zppfn\" (UniqueName: \"kubernetes.io/projected/4a76403b-081b-4222-a707-4cd00dd440a0-kube-api-access-zppfn\") pod \"keystone-operator-controller-manager-684f77d66d-56nwc\" (UID: \"4a76403b-081b-4222-a707-4cd00dd440a0\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-56nwc" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.847713 4841 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7lccpq"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.849956 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npxfq\" (UniqueName: \"kubernetes.io/projected/44d006b0-b13e-49ce-8ff8-592f3d8798c1-kube-api-access-npxfq\") pod \"manila-operator-controller-manager-68f45f9d9f-9gcr6\" (UID: \"44d006b0-b13e-49ce-8ff8-592f3d8798c1\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9gcr6" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.850066 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5787w\" (UniqueName: \"kubernetes.io/projected/f34e0b2d-5c3c-4725-ae0c-760bf98e90d3-kube-api-access-5787w\") pod \"nova-operator-controller-manager-569cc54c5-chf9m\" (UID: \"f34e0b2d-5c3c-4725-ae0c-760bf98e90d3\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-chf9m" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.850122 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tctn5\" (UniqueName: \"kubernetes.io/projected/98cec4ba-d672-4627-8d37-46a0684fc284-kube-api-access-tctn5\") pod \"ironic-operator-controller-manager-6bbb499bbc-gb4dz\" (UID: \"98cec4ba-d672-4627-8d37-46a0684fc284\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-gb4dz" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.850179 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svwkt\" (UniqueName: \"kubernetes.io/projected/8b77ae90-8ef1-4e98-9d32-319dfdd55a6d-kube-api-access-svwkt\") pod \"neutron-operator-controller-manager-776c5696bf-8f2sj\" (UID: \"8b77ae90-8ef1-4e98-9d32-319dfdd55a6d\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-8f2sj" Mar 13 
09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.851835 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7lccpq" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.860102 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-7lhjg" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.862214 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.868639 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-729ml" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.879002 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tctn5\" (UniqueName: \"kubernetes.io/projected/98cec4ba-d672-4627-8d37-46a0684fc284-kube-api-access-tctn5\") pod \"ironic-operator-controller-manager-6bbb499bbc-gb4dz\" (UID: \"98cec4ba-d672-4627-8d37-46a0684fc284\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-gb4dz" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.896939 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwl2n\" (UniqueName: \"kubernetes.io/projected/bad08b57-dde0-496d-8ea1-5845a52d517a-kube-api-access-pwl2n\") pod \"horizon-operator-controller-manager-6d9d6b584d-grn97\" (UID: \"bad08b57-dde0-496d-8ea1-5845a52d517a\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-grn97" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.897005 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-chf9m"] Mar 13 09:28:56 crc kubenswrapper[4841]: 
I0313 09:28:56.898240 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbzkb\" (UniqueName: \"kubernetes.io/projected/90c1dec3-4daa-4ac6-b95e-209cb8bd9b55-kube-api-access-bbzkb\") pod \"mariadb-operator-controller-manager-658d4cdd5-rx4wh\" (UID: \"90c1dec3-4daa-4ac6-b95e-209cb8bd9b55\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rx4wh" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.901117 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-j7v7h"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.901226 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npxfq\" (UniqueName: \"kubernetes.io/projected/44d006b0-b13e-49ce-8ff8-592f3d8798c1-kube-api-access-npxfq\") pod \"manila-operator-controller-manager-68f45f9d9f-9gcr6\" (UID: \"44d006b0-b13e-49ce-8ff8-592f3d8798c1\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9gcr6" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.901866 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-j7v7h" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.904898 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zppfn\" (UniqueName: \"kubernetes.io/projected/4a76403b-081b-4222-a707-4cd00dd440a0-kube-api-access-zppfn\") pod \"keystone-operator-controller-manager-684f77d66d-56nwc\" (UID: \"4a76403b-081b-4222-a707-4cd00dd440a0\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-56nwc" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.905166 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-hm824" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.909183 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-z8kzp" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.919377 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-8tg7w"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.919473 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-grn97" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.939434 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-gb4dz" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.957957 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7lccpq"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.960022 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06f0c42d-1674-4913-8a86-1d1749d8d601-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7lccpq\" (UID: \"06f0c42d-1674-4913-8a86-1d1749d8d601\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7lccpq" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.960061 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n4tt\" (UniqueName: \"kubernetes.io/projected/06f0c42d-1674-4913-8a86-1d1749d8d601-kube-api-access-8n4tt\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7lccpq\" (UID: 
\"06f0c42d-1674-4913-8a86-1d1749d8d601\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7lccpq" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.960095 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qfgn\" (UniqueName: \"kubernetes.io/projected/b877a309-e752-4f24-90cd-6901973263e3-kube-api-access-5qfgn\") pod \"octavia-operator-controller-manager-5f4f55cb5c-8tg7w\" (UID: \"b877a309-e752-4f24-90cd-6901973263e3\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-8tg7w" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.960121 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5787w\" (UniqueName: \"kubernetes.io/projected/f34e0b2d-5c3c-4725-ae0c-760bf98e90d3-kube-api-access-5787w\") pod \"nova-operator-controller-manager-569cc54c5-chf9m\" (UID: \"f34e0b2d-5c3c-4725-ae0c-760bf98e90d3\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-chf9m" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.960161 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svwkt\" (UniqueName: \"kubernetes.io/projected/8b77ae90-8ef1-4e98-9d32-319dfdd55a6d-kube-api-access-svwkt\") pod \"neutron-operator-controller-manager-776c5696bf-8f2sj\" (UID: \"8b77ae90-8ef1-4e98-9d32-319dfdd55a6d\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-8f2sj" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.972445 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-j7v7h"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.980461 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svwkt\" (UniqueName: \"kubernetes.io/projected/8b77ae90-8ef1-4e98-9d32-319dfdd55a6d-kube-api-access-svwkt\") pod 
\"neutron-operator-controller-manager-776c5696bf-8f2sj\" (UID: \"8b77ae90-8ef1-4e98-9d32-319dfdd55a6d\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-8f2sj" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.982492 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5787w\" (UniqueName: \"kubernetes.io/projected/f34e0b2d-5c3c-4725-ae0c-760bf98e90d3-kube-api-access-5787w\") pod \"nova-operator-controller-manager-569cc54c5-chf9m\" (UID: \"f34e0b2d-5c3c-4725-ae0c-760bf98e90d3\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-chf9m" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.987725 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-bwvx6"] Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.988708 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-bwvx6" Mar 13 09:28:56 crc kubenswrapper[4841]: I0313 09:28:56.990762 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-4nm9z" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.018109 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-56nwc" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.028640 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-r42dd"] Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.029467 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-r42dd" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.034280 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-tgxzf" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.043339 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-bwvx6"] Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.043820 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9gcr6" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.061092 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qfgn\" (UniqueName: \"kubernetes.io/projected/b877a309-e752-4f24-90cd-6901973263e3-kube-api-access-5qfgn\") pod \"octavia-operator-controller-manager-5f4f55cb5c-8tg7w\" (UID: \"b877a309-e752-4f24-90cd-6901973263e3\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-8tg7w" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.061176 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thr8d\" (UniqueName: \"kubernetes.io/projected/3a7d5a0b-0bd7-4735-b182-8a78870050cf-kube-api-access-thr8d\") pod \"ovn-operator-controller-manager-bbc5b68f9-j7v7h\" (UID: \"3a7d5a0b-0bd7-4735-b182-8a78870050cf\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-j7v7h" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.061223 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82jj2\" (UniqueName: \"kubernetes.io/projected/49a6b3bc-4db3-4006-b033-cc9cfa0cb5fc-kube-api-access-82jj2\") pod 
\"placement-operator-controller-manager-574d45c66c-bwvx6\" (UID: \"49a6b3bc-4db3-4006-b033-cc9cfa0cb5fc\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-bwvx6" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.061240 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06f0c42d-1674-4913-8a86-1d1749d8d601-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7lccpq\" (UID: \"06f0c42d-1674-4913-8a86-1d1749d8d601\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7lccpq" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.061260 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n4tt\" (UniqueName: \"kubernetes.io/projected/06f0c42d-1674-4913-8a86-1d1749d8d601-kube-api-access-8n4tt\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7lccpq\" (UID: \"06f0c42d-1674-4913-8a86-1d1749d8d601\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7lccpq" Mar 13 09:28:57 crc kubenswrapper[4841]: E0313 09:28:57.061727 4841 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 09:28:57 crc kubenswrapper[4841]: E0313 09:28:57.061762 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f0c42d-1674-4913-8a86-1d1749d8d601-cert podName:06f0c42d-1674-4913-8a86-1d1749d8d601 nodeName:}" failed. No retries permitted until 2026-03-13 09:28:57.561749967 +0000 UTC m=+1020.291650158 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06f0c42d-1674-4913-8a86-1d1749d8d601-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7lccpq" (UID: "06f0c42d-1674-4913-8a86-1d1749d8d601") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.065621 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-r42dd"] Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.065918 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rx4wh" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.083175 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qfgn\" (UniqueName: \"kubernetes.io/projected/b877a309-e752-4f24-90cd-6901973263e3-kube-api-access-5qfgn\") pod \"octavia-operator-controller-manager-5f4f55cb5c-8tg7w\" (UID: \"b877a309-e752-4f24-90cd-6901973263e3\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-8tg7w" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.088840 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n4tt\" (UniqueName: \"kubernetes.io/projected/06f0c42d-1674-4913-8a86-1d1749d8d601-kube-api-access-8n4tt\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7lccpq\" (UID: \"06f0c42d-1674-4913-8a86-1d1749d8d601\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7lccpq" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.115494 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-8485bdb9db-mf5lp"] Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.116424 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-8485bdb9db-mf5lp" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.120671 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-8485bdb9db-mf5lp"] Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.122176 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-6csj7" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.147800 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-8f2sj" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.162627 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thr8d\" (UniqueName: \"kubernetes.io/projected/3a7d5a0b-0bd7-4735-b182-8a78870050cf-kube-api-access-thr8d\") pod \"ovn-operator-controller-manager-bbc5b68f9-j7v7h\" (UID: \"3a7d5a0b-0bd7-4735-b182-8a78870050cf\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-j7v7h" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.162946 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96dh2\" (UniqueName: \"kubernetes.io/projected/10b15182-cc2b-420b-9fc2-fe3ca6ea38d7-kube-api-access-96dh2\") pod \"swift-operator-controller-manager-677c674df7-r42dd\" (UID: \"10b15182-cc2b-420b-9fc2-fe3ca6ea38d7\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-r42dd" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.162998 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82jj2\" (UniqueName: \"kubernetes.io/projected/49a6b3bc-4db3-4006-b033-cc9cfa0cb5fc-kube-api-access-82jj2\") pod \"placement-operator-controller-manager-574d45c66c-bwvx6\" (UID: 
\"49a6b3bc-4db3-4006-b033-cc9cfa0cb5fc\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-bwvx6" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.169991 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-chf9m" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.171199 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8v6pq"] Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.172236 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8v6pq" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.174827 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-8tg7w" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.184379 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-v9486" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.191417 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8v6pq"] Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.200662 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82jj2\" (UniqueName: \"kubernetes.io/projected/49a6b3bc-4db3-4006-b033-cc9cfa0cb5fc-kube-api-access-82jj2\") pod \"placement-operator-controller-manager-574d45c66c-bwvx6\" (UID: \"49a6b3bc-4db3-4006-b033-cc9cfa0cb5fc\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-bwvx6" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.200773 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thr8d\" (UniqueName: 
\"kubernetes.io/projected/3a7d5a0b-0bd7-4735-b182-8a78870050cf-kube-api-access-thr8d\") pod \"ovn-operator-controller-manager-bbc5b68f9-j7v7h\" (UID: \"3a7d5a0b-0bd7-4735-b182-8a78870050cf\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-j7v7h" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.216427 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-pvtvx"] Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.217813 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-pvtvx" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.224030 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-v8b5g" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.240991 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-pvtvx"] Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.254410 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-j7v7h" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.264455 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpmvq\" (UniqueName: \"kubernetes.io/projected/35cf5dc3-b1c0-4481-8f0f-8bca19ecadd1-kube-api-access-kpmvq\") pod \"test-operator-controller-manager-5c5cb9c4d7-8v6pq\" (UID: \"35cf5dc3-b1c0-4481-8f0f-8bca19ecadd1\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8v6pq" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.264505 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sr7f\" (UniqueName: \"kubernetes.io/projected/17824e5f-18b3-46c0-910a-56e5529e09c3-kube-api-access-8sr7f\") pod \"telemetry-operator-controller-manager-8485bdb9db-mf5lp\" (UID: \"17824e5f-18b3-46c0-910a-56e5529e09c3\") " pod="openstack-operators/telemetry-operator-controller-manager-8485bdb9db-mf5lp" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.264560 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96dh2\" (UniqueName: \"kubernetes.io/projected/10b15182-cc2b-420b-9fc2-fe3ca6ea38d7-kube-api-access-96dh2\") pod \"swift-operator-controller-manager-677c674df7-r42dd\" (UID: \"10b15182-cc2b-420b-9fc2-fe3ca6ea38d7\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-r42dd" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.264595 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26236923-39c0-4b46-be0d-61f453533891-cert\") pod \"infra-operator-controller-manager-5995f4446f-vtz8l\" (UID: \"26236923-39c0-4b46-be0d-61f453533891\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vtz8l" Mar 13 09:28:57 crc kubenswrapper[4841]: E0313 09:28:57.264746 
4841 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 09:28:57 crc kubenswrapper[4841]: E0313 09:28:57.264797 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26236923-39c0-4b46-be0d-61f453533891-cert podName:26236923-39c0-4b46-be0d-61f453533891 nodeName:}" failed. No retries permitted until 2026-03-13 09:28:58.264781997 +0000 UTC m=+1020.994682188 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26236923-39c0-4b46-be0d-61f453533891-cert") pod "infra-operator-controller-manager-5995f4446f-vtz8l" (UID: "26236923-39c0-4b46-be0d-61f453533891") : secret "infra-operator-webhook-server-cert" not found Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.279346 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-fkfjh"] Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.292944 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96dh2\" (UniqueName: \"kubernetes.io/projected/10b15182-cc2b-420b-9fc2-fe3ca6ea38d7-kube-api-access-96dh2\") pod \"swift-operator-controller-manager-677c674df7-r42dd\" (UID: \"10b15182-cc2b-420b-9fc2-fe3ca6ea38d7\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-r42dd" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.297370 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw"] Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.298146 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.299764 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.300065 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.300316 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-7jsqx" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.310335 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw"] Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.328693 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-bwvx6" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.349030 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nd9d9"] Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.350113 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nd9d9" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.354885 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-bv2tt" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.366424 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-metrics-certs\") pod \"openstack-operator-controller-manager-57ddc6f479-h7khw\" (UID: \"2c86df2d-15dc-45f2-aca7-4200fdf36a53\") " pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.366559 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-webhook-certs\") pod \"openstack-operator-controller-manager-57ddc6f479-h7khw\" (UID: \"2c86df2d-15dc-45f2-aca7-4200fdf36a53\") " pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.366587 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh56c\" (UniqueName: \"kubernetes.io/projected/2c86df2d-15dc-45f2-aca7-4200fdf36a53-kube-api-access-nh56c\") pod \"openstack-operator-controller-manager-57ddc6f479-h7khw\" (UID: \"2c86df2d-15dc-45f2-aca7-4200fdf36a53\") " pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.366619 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpmvq\" (UniqueName: 
\"kubernetes.io/projected/35cf5dc3-b1c0-4481-8f0f-8bca19ecadd1-kube-api-access-kpmvq\") pod \"test-operator-controller-manager-5c5cb9c4d7-8v6pq\" (UID: \"35cf5dc3-b1c0-4481-8f0f-8bca19ecadd1\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8v6pq" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.366653 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sr7f\" (UniqueName: \"kubernetes.io/projected/17824e5f-18b3-46c0-910a-56e5529e09c3-kube-api-access-8sr7f\") pod \"telemetry-operator-controller-manager-8485bdb9db-mf5lp\" (UID: \"17824e5f-18b3-46c0-910a-56e5529e09c3\") " pod="openstack-operators/telemetry-operator-controller-manager-8485bdb9db-mf5lp" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.366690 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbmqs\" (UniqueName: \"kubernetes.io/projected/f5ae27e8-47b9-437c-9506-f51da1b6c9f8-kube-api-access-hbmqs\") pod \"watcher-operator-controller-manager-6dd88c6f67-pvtvx\" (UID: \"f5ae27e8-47b9-437c-9506-f51da1b6c9f8\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-pvtvx" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.384996 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nd9d9"] Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.391119 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpmvq\" (UniqueName: \"kubernetes.io/projected/35cf5dc3-b1c0-4481-8f0f-8bca19ecadd1-kube-api-access-kpmvq\") pod \"test-operator-controller-manager-5c5cb9c4d7-8v6pq\" (UID: \"35cf5dc3-b1c0-4481-8f0f-8bca19ecadd1\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8v6pq" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.420873 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8sr7f\" (UniqueName: \"kubernetes.io/projected/17824e5f-18b3-46c0-910a-56e5529e09c3-kube-api-access-8sr7f\") pod \"telemetry-operator-controller-manager-8485bdb9db-mf5lp\" (UID: \"17824e5f-18b3-46c0-910a-56e5529e09c3\") " pod="openstack-operators/telemetry-operator-controller-manager-8485bdb9db-mf5lp" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.472638 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-webhook-certs\") pod \"openstack-operator-controller-manager-57ddc6f479-h7khw\" (UID: \"2c86df2d-15dc-45f2-aca7-4200fdf36a53\") " pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.472688 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh56c\" (UniqueName: \"kubernetes.io/projected/2c86df2d-15dc-45f2-aca7-4200fdf36a53-kube-api-access-nh56c\") pod \"openstack-operator-controller-manager-57ddc6f479-h7khw\" (UID: \"2c86df2d-15dc-45f2-aca7-4200fdf36a53\") " pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.472769 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbmqs\" (UniqueName: \"kubernetes.io/projected/f5ae27e8-47b9-437c-9506-f51da1b6c9f8-kube-api-access-hbmqs\") pod \"watcher-operator-controller-manager-6dd88c6f67-pvtvx\" (UID: \"f5ae27e8-47b9-437c-9506-f51da1b6c9f8\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-pvtvx" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.472801 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qstgf\" (UniqueName: \"kubernetes.io/projected/daeb73dc-4973-4a0b-906d-4afc7f61717c-kube-api-access-qstgf\") pod 
\"rabbitmq-cluster-operator-manager-668c99d594-nd9d9\" (UID: \"daeb73dc-4973-4a0b-906d-4afc7f61717c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nd9d9" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.472899 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-metrics-certs\") pod \"openstack-operator-controller-manager-57ddc6f479-h7khw\" (UID: \"2c86df2d-15dc-45f2-aca7-4200fdf36a53\") " pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw" Mar 13 09:28:57 crc kubenswrapper[4841]: E0313 09:28:57.473091 4841 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 09:28:57 crc kubenswrapper[4841]: E0313 09:28:57.473148 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-metrics-certs podName:2c86df2d-15dc-45f2-aca7-4200fdf36a53 nodeName:}" failed. No retries permitted until 2026-03-13 09:28:57.973130183 +0000 UTC m=+1020.703030374 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-metrics-certs") pod "openstack-operator-controller-manager-57ddc6f479-h7khw" (UID: "2c86df2d-15dc-45f2-aca7-4200fdf36a53") : secret "metrics-server-cert" not found Mar 13 09:28:57 crc kubenswrapper[4841]: E0313 09:28:57.473390 4841 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 09:28:57 crc kubenswrapper[4841]: E0313 09:28:57.473418 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-webhook-certs podName:2c86df2d-15dc-45f2-aca7-4200fdf36a53 nodeName:}" failed. 
No retries permitted until 2026-03-13 09:28:57.973409691 +0000 UTC m=+1020.703309882 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-webhook-certs") pod "openstack-operator-controller-manager-57ddc6f479-h7khw" (UID: "2c86df2d-15dc-45f2-aca7-4200fdf36a53") : secret "webhook-server-cert" not found Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.480912 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-r42dd" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.496879 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh56c\" (UniqueName: \"kubernetes.io/projected/2c86df2d-15dc-45f2-aca7-4200fdf36a53-kube-api-access-nh56c\") pod \"openstack-operator-controller-manager-57ddc6f479-h7khw\" (UID: \"2c86df2d-15dc-45f2-aca7-4200fdf36a53\") " pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.505416 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbmqs\" (UniqueName: \"kubernetes.io/projected/f5ae27e8-47b9-437c-9506-f51da1b6c9f8-kube-api-access-hbmqs\") pod \"watcher-operator-controller-manager-6dd88c6f67-pvtvx\" (UID: \"f5ae27e8-47b9-437c-9506-f51da1b6c9f8\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-pvtvx" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.517279 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-8485bdb9db-mf5lp" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.543697 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8v6pq" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.575148 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06f0c42d-1674-4913-8a86-1d1749d8d601-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7lccpq\" (UID: \"06f0c42d-1674-4913-8a86-1d1749d8d601\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7lccpq" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.575243 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qstgf\" (UniqueName: \"kubernetes.io/projected/daeb73dc-4973-4a0b-906d-4afc7f61717c-kube-api-access-qstgf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-nd9d9\" (UID: \"daeb73dc-4973-4a0b-906d-4afc7f61717c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nd9d9" Mar 13 09:28:57 crc kubenswrapper[4841]: E0313 09:28:57.575623 4841 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 09:28:57 crc kubenswrapper[4841]: E0313 09:28:57.575660 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f0c42d-1674-4913-8a86-1d1749d8d601-cert podName:06f0c42d-1674-4913-8a86-1d1749d8d601 nodeName:}" failed. No retries permitted until 2026-03-13 09:28:58.575648154 +0000 UTC m=+1021.305548345 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06f0c42d-1674-4913-8a86-1d1749d8d601-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7lccpq" (UID: "06f0c42d-1674-4913-8a86-1d1749d8d601") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.590115 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qstgf\" (UniqueName: \"kubernetes.io/projected/daeb73dc-4973-4a0b-906d-4afc7f61717c-kube-api-access-qstgf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-nd9d9\" (UID: \"daeb73dc-4973-4a0b-906d-4afc7f61717c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nd9d9" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.617437 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-cgkfz"] Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.624646 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-kc2zl"] Mar 13 09:28:57 crc kubenswrapper[4841]: W0313 09:28:57.632852 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6c9dfcd_5298_468b_9de2_0280bf525b61.slice/crio-9257badf23d7e6880b959bb09a581e54e66219dfb4be64977324899ae99d3ec9 WatchSource:0}: Error finding container 9257badf23d7e6880b959bb09a581e54e66219dfb4be64977324899ae99d3ec9: Status 404 returned error can't find the container with id 9257badf23d7e6880b959bb09a581e54e66219dfb4be64977324899ae99d3ec9 Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.633480 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-pvtvx" Mar 13 09:28:57 crc kubenswrapper[4841]: W0313 09:28:57.655239 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bd9dd2f_b4fd_4078_b463_4e970fa6791d.slice/crio-a543e38a209b0cf34486045422bdc3139d3b104698e8799cd9518a7bf4e4f83f WatchSource:0}: Error finding container a543e38a209b0cf34486045422bdc3139d3b104698e8799cd9518a7bf4e4f83f: Status 404 returned error can't find the container with id a543e38a209b0cf34486045422bdc3139d3b104698e8799cd9518a7bf4e4f83f Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.732606 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nd9d9" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.785235 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-7lhjg"] Mar 13 09:28:57 crc kubenswrapper[4841]: W0313 09:28:57.808564 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46ca7c55_bd68_4454_a014_85f81f1b5a60.slice/crio-fd703187e29ce421af840f9bc54dd49eb868bfc26fca1c16f3f6cf790891ae2a WatchSource:0}: Error finding container fd703187e29ce421af840f9bc54dd49eb868bfc26fca1c16f3f6cf790891ae2a: Status 404 returned error can't find the container with id fd703187e29ce421af840f9bc54dd49eb868bfc26fca1c16f3f6cf790891ae2a Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.822143 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-grn97"] Mar 13 09:28:57 crc kubenswrapper[4841]: W0313 09:28:57.827185 4841 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98cec4ba_d672_4627_8d37_46a0684fc284.slice/crio-68c1d01abb730215fe2aaf67ffb15fc7e73f4aa71864a7ffba5a4ae4e1ad860b WatchSource:0}: Error finding container 68c1d01abb730215fe2aaf67ffb15fc7e73f4aa71864a7ffba5a4ae4e1ad860b: Status 404 returned error can't find the container with id 68c1d01abb730215fe2aaf67ffb15fc7e73f4aa71864a7ffba5a4ae4e1ad860b Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.838187 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-gb4dz"] Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.917937 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-hm824"] Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.935411 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-56nwc"] Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.941755 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rx4wh"] Mar 13 09:28:57 crc kubenswrapper[4841]: W0313 09:28:57.948592 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a76403b_081b_4222_a707_4cd00dd440a0.slice/crio-12aa44e5e0947d4f785e7ff01e1eb150451ffb95c615b50414572dcd74c6027b WatchSource:0}: Error finding container 12aa44e5e0947d4f785e7ff01e1eb150451ffb95c615b50414572dcd74c6027b: Status 404 returned error can't find the container with id 12aa44e5e0947d4f785e7ff01e1eb150451ffb95c615b50414572dcd74c6027b Mar 13 09:28:57 crc kubenswrapper[4841]: W0313 09:28:57.949435 4841 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90c1dec3_4daa_4ac6_b95e_209cb8bd9b55.slice/crio-e8d6207a0d35bc015f3e5579a53ab8fc6e2cda38592250307f036c1254f8daed WatchSource:0}: Error finding container e8d6207a0d35bc015f3e5579a53ab8fc6e2cda38592250307f036c1254f8daed: Status 404 returned error can't find the container with id e8d6207a0d35bc015f3e5579a53ab8fc6e2cda38592250307f036c1254f8daed Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.979332 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-metrics-certs\") pod \"openstack-operator-controller-manager-57ddc6f479-h7khw\" (UID: \"2c86df2d-15dc-45f2-aca7-4200fdf36a53\") " pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw" Mar 13 09:28:57 crc kubenswrapper[4841]: I0313 09:28:57.979510 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-webhook-certs\") pod \"openstack-operator-controller-manager-57ddc6f479-h7khw\" (UID: \"2c86df2d-15dc-45f2-aca7-4200fdf36a53\") " pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw" Mar 13 09:28:57 crc kubenswrapper[4841]: E0313 09:28:57.979513 4841 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 09:28:57 crc kubenswrapper[4841]: E0313 09:28:57.979586 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-metrics-certs podName:2c86df2d-15dc-45f2-aca7-4200fdf36a53 nodeName:}" failed. No retries permitted until 2026-03-13 09:28:58.979564588 +0000 UTC m=+1021.709464879 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-metrics-certs") pod "openstack-operator-controller-manager-57ddc6f479-h7khw" (UID: "2c86df2d-15dc-45f2-aca7-4200fdf36a53") : secret "metrics-server-cert" not found Mar 13 09:28:57 crc kubenswrapper[4841]: E0313 09:28:57.979668 4841 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 09:28:57 crc kubenswrapper[4841]: E0313 09:28:57.979742 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-webhook-certs podName:2c86df2d-15dc-45f2-aca7-4200fdf36a53 nodeName:}" failed. No retries permitted until 2026-03-13 09:28:58.979724893 +0000 UTC m=+1021.709625084 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-webhook-certs") pod "openstack-operator-controller-manager-57ddc6f479-h7khw" (UID: "2c86df2d-15dc-45f2-aca7-4200fdf36a53") : secret "webhook-server-cert" not found Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.106843 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-bwvx6"] Mar 13 09:28:58 crc kubenswrapper[4841]: W0313 09:28:58.123436 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a7d5a0b_0bd7_4735_b182_8a78870050cf.slice/crio-1f4259e30ab6c9cba7bdf73cf1f7f347f5888cb5b9ec44fb385beac86edd7bb4 WatchSource:0}: Error finding container 1f4259e30ab6c9cba7bdf73cf1f7f347f5888cb5b9ec44fb385beac86edd7bb4: Status 404 returned error can't find the container with id 1f4259e30ab6c9cba7bdf73cf1f7f347f5888cb5b9ec44fb385beac86edd7bb4 Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.132664 4841 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-j7v7h"] Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.193746 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-kc2zl" event={"ID":"c6c9dfcd-5298-468b-9de2-0280bf525b61","Type":"ContainerStarted","Data":"9257badf23d7e6880b959bb09a581e54e66219dfb4be64977324899ae99d3ec9"} Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.194704 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-j7v7h" event={"ID":"3a7d5a0b-0bd7-4735-b182-8a78870050cf","Type":"ContainerStarted","Data":"1f4259e30ab6c9cba7bdf73cf1f7f347f5888cb5b9ec44fb385beac86edd7bb4"} Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.195606 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-fkfjh" event={"ID":"9db8c27e-023c-4e28-a381-24f4438a6add","Type":"ContainerStarted","Data":"5473c6cfcd8789bc3b308598464961c6ee4936bd1b38da8a450b92ba8849961a"} Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.196821 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-grn97" event={"ID":"bad08b57-dde0-496d-8ea1-5845a52d517a","Type":"ContainerStarted","Data":"a2dc11ef0c5bec185291660786d3fc7c7b7c8902059f7f7dec22ccf7355ec637"} Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.197943 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rx4wh" event={"ID":"90c1dec3-4daa-4ac6-b95e-209cb8bd9b55","Type":"ContainerStarted","Data":"e8d6207a0d35bc015f3e5579a53ab8fc6e2cda38592250307f036c1254f8daed"} Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.199017 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-cgkfz" event={"ID":"7bd9dd2f-b4fd-4078-b463-4e970fa6791d","Type":"ContainerStarted","Data":"a543e38a209b0cf34486045422bdc3139d3b104698e8799cd9518a7bf4e4f83f"} Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.200349 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-7lhjg" event={"ID":"46ca7c55-bd68-4454-a014-85f81f1b5a60","Type":"ContainerStarted","Data":"fd703187e29ce421af840f9bc54dd49eb868bfc26fca1c16f3f6cf790891ae2a"} Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.201223 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-hm824" event={"ID":"e6b9c8a5-3093-4d94-ad46-cd682158fdf8","Type":"ContainerStarted","Data":"c7ce08b7a991be6725bd6e86b997cb9104a3ea5c4c0d3598f3df46ef7f30f12e"} Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.203011 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-56nwc" event={"ID":"4a76403b-081b-4222-a707-4cd00dd440a0","Type":"ContainerStarted","Data":"12aa44e5e0947d4f785e7ff01e1eb150451ffb95c615b50414572dcd74c6027b"} Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.206115 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-gb4dz" event={"ID":"98cec4ba-d672-4627-8d37-46a0684fc284","Type":"ContainerStarted","Data":"68c1d01abb730215fe2aaf67ffb15fc7e73f4aa71864a7ffba5a4ae4e1ad860b"} Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.207309 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-bwvx6" event={"ID":"49a6b3bc-4db3-4006-b033-cc9cfa0cb5fc","Type":"ContainerStarted","Data":"0d9e1d441374092e44929847af50023dfcaac268ce388794a00f866ef67d617c"} Mar 13 09:28:58 crc 
kubenswrapper[4841]: I0313 09:28:58.292083 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26236923-39c0-4b46-be0d-61f453533891-cert\") pod \"infra-operator-controller-manager-5995f4446f-vtz8l\" (UID: \"26236923-39c0-4b46-be0d-61f453533891\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vtz8l" Mar 13 09:28:58 crc kubenswrapper[4841]: E0313 09:28:58.292496 4841 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 09:28:58 crc kubenswrapper[4841]: E0313 09:28:58.292561 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26236923-39c0-4b46-be0d-61f453533891-cert podName:26236923-39c0-4b46-be0d-61f453533891 nodeName:}" failed. No retries permitted until 2026-03-13 09:29:00.292539469 +0000 UTC m=+1023.022439660 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26236923-39c0-4b46-be0d-61f453533891-cert") pod "infra-operator-controller-manager-5995f4446f-vtz8l" (UID: "26236923-39c0-4b46-be0d-61f453533891") : secret "infra-operator-webhook-server-cert" not found Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.294374 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-chf9m"] Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.299637 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-8tg7w"] Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.314923 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-r42dd"] Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.345353 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-8f2sj"] Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.345648 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-9gcr6"] Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.353927 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8v6pq"] Mar 13 09:28:58 crc kubenswrapper[4841]: W0313 09:28:58.359099 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb877a309_e752_4f24_90cd_6901973263e3.slice/crio-4434441e172e149c665bdacf6b1d4e19e7d117f0671f5ccf13c155a9d8040032 WatchSource:0}: Error finding container 4434441e172e149c665bdacf6b1d4e19e7d117f0671f5ccf13c155a9d8040032: Status 404 returned error can't find the container with id 4434441e172e149c665bdacf6b1d4e19e7d117f0671f5ccf13c155a9d8040032 Mar 13 09:28:58 crc kubenswrapper[4841]: E0313 09:28:58.372322 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.69:5001/openstack-k8s-operators/telemetry-operator:1a1a9a719889b8cdda26cbd675f0005643a8f9f2,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8sr7f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-8485bdb9db-mf5lp_openstack-operators(17824e5f-18b3-46c0-910a-56e5529e09c3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 09:28:58 crc kubenswrapper[4841]: E0313 09:28:58.374290 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-8485bdb9db-mf5lp" podUID="17824e5f-18b3-46c0-910a-56e5529e09c3" Mar 13 09:28:58 crc 
kubenswrapper[4841]: E0313 09:28:58.375073 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:d89f3ca6e909f34d145a880829f5e63f1b6b2d11c520a9c5bea7ed1c30ce38f4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-npxfq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-68f45f9d9f-9gcr6_openstack-operators(44d006b0-b13e-49ce-8ff8-592f3d8798c1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 09:28:58 crc kubenswrapper[4841]: E0313 09:28:58.375745 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kpmvq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-8v6pq_openstack-operators(35cf5dc3-b1c0-4481-8f0f-8bca19ecadd1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 09:28:58 crc kubenswrapper[4841]: E0313 09:28:58.376791 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5qfgn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5f4f55cb5c-8tg7w_openstack-operators(b877a309-e752-4f24-90cd-6901973263e3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 09:28:58 crc kubenswrapper[4841]: E0313 09:28:58.376882 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 
500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qstgf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-nd9d9_openstack-operators(daeb73dc-4973-4a0b-906d-4afc7f61717c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 09:28:58 crc kubenswrapper[4841]: E0313 09:28:58.376929 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8v6pq" podUID="35cf5dc3-b1c0-4481-8f0f-8bca19ecadd1" Mar 13 09:28:58 crc kubenswrapper[4841]: E0313 09:28:58.377756 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9gcr6" podUID="44d006b0-b13e-49ce-8ff8-592f3d8798c1" Mar 13 09:28:58 crc kubenswrapper[4841]: E0313 09:28:58.378514 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with 
ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nd9d9" podUID="daeb73dc-4973-4a0b-906d-4afc7f61717c" Mar 13 09:28:58 crc kubenswrapper[4841]: E0313 09:28:58.379153 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-8tg7w" podUID="b877a309-e752-4f24-90cd-6901973263e3" Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.380469 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-8485bdb9db-mf5lp"] Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.386641 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nd9d9"] Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.391426 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-pvtvx"] Mar 13 09:28:58 crc kubenswrapper[4841]: W0313 09:28:58.403803 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5ae27e8_47b9_437c_9506_f51da1b6c9f8.slice/crio-56fe89fbf8afe46520af1c144b0747ab54e378be8fd7303823c46312ffe7ad11 WatchSource:0}: Error finding container 56fe89fbf8afe46520af1c144b0747ab54e378be8fd7303823c46312ffe7ad11: Status 404 returned error can't find the container with id 56fe89fbf8afe46520af1c144b0747ab54e378be8fd7303823c46312ffe7ad11 Mar 13 09:28:58 crc kubenswrapper[4841]: E0313 09:28:58.407000 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hbmqs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6dd88c6f67-pvtvx_openstack-operators(f5ae27e8-47b9-437c-9506-f51da1b6c9f8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 09:28:58 crc kubenswrapper[4841]: E0313 09:28:58.408235 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-pvtvx" podUID="f5ae27e8-47b9-437c-9506-f51da1b6c9f8" Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.600024 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06f0c42d-1674-4913-8a86-1d1749d8d601-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7lccpq\" (UID: \"06f0c42d-1674-4913-8a86-1d1749d8d601\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7lccpq" Mar 13 09:28:58 crc kubenswrapper[4841]: E0313 09:28:58.600257 4841 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found 
Mar 13 09:28:58 crc kubenswrapper[4841]: E0313 09:28:58.600390 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f0c42d-1674-4913-8a86-1d1749d8d601-cert podName:06f0c42d-1674-4913-8a86-1d1749d8d601 nodeName:}" failed. No retries permitted until 2026-03-13 09:29:00.600365211 +0000 UTC m=+1023.330265402 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06f0c42d-1674-4913-8a86-1d1749d8d601-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7lccpq" (UID: "06f0c42d-1674-4913-8a86-1d1749d8d601") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 09:28:58 crc kubenswrapper[4841]: I0313 09:28:58.805466 4841 scope.go:117] "RemoveContainer" containerID="74280dc591c2a72619e3059ba0d7af8ad907b66aa9b472cf70475f24aee947db" Mar 13 09:28:59 crc kubenswrapper[4841]: I0313 09:28:59.006899 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-metrics-certs\") pod \"openstack-operator-controller-manager-57ddc6f479-h7khw\" (UID: \"2c86df2d-15dc-45f2-aca7-4200fdf36a53\") " pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw" Mar 13 09:28:59 crc kubenswrapper[4841]: I0313 09:28:59.007020 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-webhook-certs\") pod \"openstack-operator-controller-manager-57ddc6f479-h7khw\" (UID: \"2c86df2d-15dc-45f2-aca7-4200fdf36a53\") " pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw" Mar 13 09:28:59 crc kubenswrapper[4841]: E0313 09:28:59.007117 4841 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 09:28:59 crc kubenswrapper[4841]: E0313 09:28:59.007182 
4841 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 09:28:59 crc kubenswrapper[4841]: E0313 09:28:59.007195 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-metrics-certs podName:2c86df2d-15dc-45f2-aca7-4200fdf36a53 nodeName:}" failed. No retries permitted until 2026-03-13 09:29:01.007177825 +0000 UTC m=+1023.737078016 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-metrics-certs") pod "openstack-operator-controller-manager-57ddc6f479-h7khw" (UID: "2c86df2d-15dc-45f2-aca7-4200fdf36a53") : secret "metrics-server-cert" not found Mar 13 09:28:59 crc kubenswrapper[4841]: E0313 09:28:59.007302 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-webhook-certs podName:2c86df2d-15dc-45f2-aca7-4200fdf36a53 nodeName:}" failed. No retries permitted until 2026-03-13 09:29:01.007243947 +0000 UTC m=+1023.737144198 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-webhook-certs") pod "openstack-operator-controller-manager-57ddc6f479-h7khw" (UID: "2c86df2d-15dc-45f2-aca7-4200fdf36a53") : secret "webhook-server-cert" not found Mar 13 09:28:59 crc kubenswrapper[4841]: I0313 09:28:59.221741 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-8tg7w" event={"ID":"b877a309-e752-4f24-90cd-6901973263e3","Type":"ContainerStarted","Data":"4434441e172e149c665bdacf6b1d4e19e7d117f0671f5ccf13c155a9d8040032"} Mar 13 09:28:59 crc kubenswrapper[4841]: E0313 09:28:59.223485 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-8tg7w" podUID="b877a309-e752-4f24-90cd-6901973263e3" Mar 13 09:28:59 crc kubenswrapper[4841]: I0313 09:28:59.234685 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-chf9m" event={"ID":"f34e0b2d-5c3c-4725-ae0c-760bf98e90d3","Type":"ContainerStarted","Data":"cb5628ac1167d8e699bc6bc65da01f9d519557cbc265df211a6480512c8f7b2b"} Mar 13 09:28:59 crc kubenswrapper[4841]: I0313 09:28:59.236713 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nd9d9" event={"ID":"daeb73dc-4973-4a0b-906d-4afc7f61717c","Type":"ContainerStarted","Data":"12c0afdb1ab948ad1e2cf65652dee30c894d4c9caeaa6251cf57e53d2b103ff7"} Mar 13 09:28:59 crc kubenswrapper[4841]: E0313 09:28:59.237900 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nd9d9" podUID="daeb73dc-4973-4a0b-906d-4afc7f61717c" Mar 13 09:28:59 crc kubenswrapper[4841]: I0313 09:28:59.247669 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9gcr6" event={"ID":"44d006b0-b13e-49ce-8ff8-592f3d8798c1","Type":"ContainerStarted","Data":"98ad560b12d995a71b74df29675b1dd036ca0750ff613c15b9a167c334b979e0"} Mar 13 09:28:59 crc kubenswrapper[4841]: E0313 09:28:59.259779 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:d89f3ca6e909f34d145a880829f5e63f1b6b2d11c520a9c5bea7ed1c30ce38f4\\\"\"" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9gcr6" podUID="44d006b0-b13e-49ce-8ff8-592f3d8798c1" Mar 13 09:28:59 crc kubenswrapper[4841]: I0313 09:28:59.262510 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-8f2sj" event={"ID":"8b77ae90-8ef1-4e98-9d32-319dfdd55a6d","Type":"ContainerStarted","Data":"fef103fcdd9c065ba0269ea59e0db15229365b89de6a3638d32b77d6e5523e30"} Mar 13 09:28:59 crc kubenswrapper[4841]: I0313 09:28:59.275943 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-8485bdb9db-mf5lp" event={"ID":"17824e5f-18b3-46c0-910a-56e5529e09c3","Type":"ContainerStarted","Data":"bde83c0f84e1c7035eb2fbe178277fb32f5ca15596099fe91038b0fe39768e8c"} Mar 13 09:28:59 crc kubenswrapper[4841]: E0313 09:28:59.280589 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" 
with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.69:5001/openstack-k8s-operators/telemetry-operator:1a1a9a719889b8cdda26cbd675f0005643a8f9f2\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-8485bdb9db-mf5lp" podUID="17824e5f-18b3-46c0-910a-56e5529e09c3" Mar 13 09:28:59 crc kubenswrapper[4841]: I0313 09:28:59.281974 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8v6pq" event={"ID":"35cf5dc3-b1c0-4481-8f0f-8bca19ecadd1","Type":"ContainerStarted","Data":"bb8a7d86db2be90b78c340383df287924374a80499b373836181ff181e6beed9"} Mar 13 09:28:59 crc kubenswrapper[4841]: E0313 09:28:59.284763 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8v6pq" podUID="35cf5dc3-b1c0-4481-8f0f-8bca19ecadd1" Mar 13 09:28:59 crc kubenswrapper[4841]: I0313 09:28:59.287287 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-r42dd" event={"ID":"10b15182-cc2b-420b-9fc2-fe3ca6ea38d7","Type":"ContainerStarted","Data":"430cfe212ed806f1025495495515642e6dbc931227b15d98ec9dd0d23730d275"} Mar 13 09:28:59 crc kubenswrapper[4841]: I0313 09:28:59.289772 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-pvtvx" event={"ID":"f5ae27e8-47b9-437c-9506-f51da1b6c9f8","Type":"ContainerStarted","Data":"56fe89fbf8afe46520af1c144b0747ab54e378be8fd7303823c46312ffe7ad11"} Mar 13 09:28:59 crc kubenswrapper[4841]: E0313 09:28:59.294862 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-pvtvx" podUID="f5ae27e8-47b9-437c-9506-f51da1b6c9f8" Mar 13 09:29:00 crc kubenswrapper[4841]: E0313 09:29:00.307670 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nd9d9" podUID="daeb73dc-4973-4a0b-906d-4afc7f61717c" Mar 13 09:29:00 crc kubenswrapper[4841]: E0313 09:29:00.307923 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-8tg7w" podUID="b877a309-e752-4f24-90cd-6901973263e3" Mar 13 09:29:00 crc kubenswrapper[4841]: E0313 09:29:00.308034 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:d89f3ca6e909f34d145a880829f5e63f1b6b2d11c520a9c5bea7ed1c30ce38f4\\\"\"" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9gcr6" podUID="44d006b0-b13e-49ce-8ff8-592f3d8798c1" Mar 13 09:29:00 crc kubenswrapper[4841]: E0313 09:29:00.308097 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-pvtvx" podUID="f5ae27e8-47b9-437c-9506-f51da1b6c9f8" Mar 13 09:29:00 crc kubenswrapper[4841]: E0313 09:29:00.308172 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8v6pq" podUID="35cf5dc3-b1c0-4481-8f0f-8bca19ecadd1" Mar 13 09:29:00 crc kubenswrapper[4841]: E0313 09:29:00.313058 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.69:5001/openstack-k8s-operators/telemetry-operator:1a1a9a719889b8cdda26cbd675f0005643a8f9f2\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-8485bdb9db-mf5lp" podUID="17824e5f-18b3-46c0-910a-56e5529e09c3" Mar 13 09:29:00 crc kubenswrapper[4841]: I0313 09:29:00.329363 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26236923-39c0-4b46-be0d-61f453533891-cert\") pod \"infra-operator-controller-manager-5995f4446f-vtz8l\" (UID: \"26236923-39c0-4b46-be0d-61f453533891\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vtz8l" Mar 13 09:29:00 crc kubenswrapper[4841]: E0313 09:29:00.329544 4841 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 09:29:00 crc kubenswrapper[4841]: E0313 09:29:00.329595 4841 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/26236923-39c0-4b46-be0d-61f453533891-cert podName:26236923-39c0-4b46-be0d-61f453533891 nodeName:}" failed. No retries permitted until 2026-03-13 09:29:04.329578589 +0000 UTC m=+1027.059478780 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26236923-39c0-4b46-be0d-61f453533891-cert") pod "infra-operator-controller-manager-5995f4446f-vtz8l" (UID: "26236923-39c0-4b46-be0d-61f453533891") : secret "infra-operator-webhook-server-cert" not found Mar 13 09:29:00 crc kubenswrapper[4841]: I0313 09:29:00.633775 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06f0c42d-1674-4913-8a86-1d1749d8d601-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7lccpq\" (UID: \"06f0c42d-1674-4913-8a86-1d1749d8d601\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7lccpq" Mar 13 09:29:00 crc kubenswrapper[4841]: E0313 09:29:00.633980 4841 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 09:29:00 crc kubenswrapper[4841]: E0313 09:29:00.634064 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f0c42d-1674-4913-8a86-1d1749d8d601-cert podName:06f0c42d-1674-4913-8a86-1d1749d8d601 nodeName:}" failed. No retries permitted until 2026-03-13 09:29:04.634042967 +0000 UTC m=+1027.363943158 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06f0c42d-1674-4913-8a86-1d1749d8d601-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7lccpq" (UID: "06f0c42d-1674-4913-8a86-1d1749d8d601") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 09:29:01 crc kubenswrapper[4841]: I0313 09:29:01.040435 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-webhook-certs\") pod \"openstack-operator-controller-manager-57ddc6f479-h7khw\" (UID: \"2c86df2d-15dc-45f2-aca7-4200fdf36a53\") " pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw" Mar 13 09:29:01 crc kubenswrapper[4841]: E0313 09:29:01.040617 4841 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 09:29:01 crc kubenswrapper[4841]: I0313 09:29:01.040671 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-metrics-certs\") pod \"openstack-operator-controller-manager-57ddc6f479-h7khw\" (UID: \"2c86df2d-15dc-45f2-aca7-4200fdf36a53\") " pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw" Mar 13 09:29:01 crc kubenswrapper[4841]: E0313 09:29:01.040692 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-webhook-certs podName:2c86df2d-15dc-45f2-aca7-4200fdf36a53 nodeName:}" failed. No retries permitted until 2026-03-13 09:29:05.040671495 +0000 UTC m=+1027.770571686 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-webhook-certs") pod "openstack-operator-controller-manager-57ddc6f479-h7khw" (UID: "2c86df2d-15dc-45f2-aca7-4200fdf36a53") : secret "webhook-server-cert" not found Mar 13 09:29:01 crc kubenswrapper[4841]: E0313 09:29:01.041497 4841 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 09:29:01 crc kubenswrapper[4841]: E0313 09:29:01.041574 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-metrics-certs podName:2c86df2d-15dc-45f2-aca7-4200fdf36a53 nodeName:}" failed. No retries permitted until 2026-03-13 09:29:05.041563913 +0000 UTC m=+1027.771464104 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-metrics-certs") pod "openstack-operator-controller-manager-57ddc6f479-h7khw" (UID: "2c86df2d-15dc-45f2-aca7-4200fdf36a53") : secret "metrics-server-cert" not found Mar 13 09:29:04 crc kubenswrapper[4841]: I0313 09:29:04.394626 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26236923-39c0-4b46-be0d-61f453533891-cert\") pod \"infra-operator-controller-manager-5995f4446f-vtz8l\" (UID: \"26236923-39c0-4b46-be0d-61f453533891\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vtz8l" Mar 13 09:29:04 crc kubenswrapper[4841]: E0313 09:29:04.395139 4841 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 09:29:04 crc kubenswrapper[4841]: E0313 09:29:04.395578 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26236923-39c0-4b46-be0d-61f453533891-cert 
podName:26236923-39c0-4b46-be0d-61f453533891 nodeName:}" failed. No retries permitted until 2026-03-13 09:29:12.39555857 +0000 UTC m=+1035.125458761 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26236923-39c0-4b46-be0d-61f453533891-cert") pod "infra-operator-controller-manager-5995f4446f-vtz8l" (UID: "26236923-39c0-4b46-be0d-61f453533891") : secret "infra-operator-webhook-server-cert" not found Mar 13 09:29:04 crc kubenswrapper[4841]: I0313 09:29:04.407203 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:29:04 crc kubenswrapper[4841]: I0313 09:29:04.407310 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:29:04 crc kubenswrapper[4841]: I0313 09:29:04.407376 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:29:04 crc kubenswrapper[4841]: I0313 09:29:04.408184 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6491cd972e8f18473231b5b2215720345c96ab2a0337886960a5d983df3b0e59"} pod="openshift-machine-config-operator/machine-config-daemon-h227v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 09:29:04 crc kubenswrapper[4841]: I0313 09:29:04.408349 4841 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" containerID="cri-o://6491cd972e8f18473231b5b2215720345c96ab2a0337886960a5d983df3b0e59" gracePeriod=600 Mar 13 09:29:04 crc kubenswrapper[4841]: I0313 09:29:04.699618 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06f0c42d-1674-4913-8a86-1d1749d8d601-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7lccpq\" (UID: \"06f0c42d-1674-4913-8a86-1d1749d8d601\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7lccpq" Mar 13 09:29:04 crc kubenswrapper[4841]: E0313 09:29:04.699760 4841 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 09:29:04 crc kubenswrapper[4841]: E0313 09:29:04.699811 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f0c42d-1674-4913-8a86-1d1749d8d601-cert podName:06f0c42d-1674-4913-8a86-1d1749d8d601 nodeName:}" failed. No retries permitted until 2026-03-13 09:29:12.69979628 +0000 UTC m=+1035.429696471 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06f0c42d-1674-4913-8a86-1d1749d8d601-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7lccpq" (UID: "06f0c42d-1674-4913-8a86-1d1749d8d601") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 09:29:05 crc kubenswrapper[4841]: I0313 09:29:05.104889 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-webhook-certs\") pod \"openstack-operator-controller-manager-57ddc6f479-h7khw\" (UID: \"2c86df2d-15dc-45f2-aca7-4200fdf36a53\") " pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw" Mar 13 09:29:05 crc kubenswrapper[4841]: E0313 09:29:05.105133 4841 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 09:29:05 crc kubenswrapper[4841]: I0313 09:29:05.105437 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-metrics-certs\") pod \"openstack-operator-controller-manager-57ddc6f479-h7khw\" (UID: \"2c86df2d-15dc-45f2-aca7-4200fdf36a53\") " pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw" Mar 13 09:29:05 crc kubenswrapper[4841]: E0313 09:29:05.105486 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-webhook-certs podName:2c86df2d-15dc-45f2-aca7-4200fdf36a53 nodeName:}" failed. No retries permitted until 2026-03-13 09:29:13.105461818 +0000 UTC m=+1035.835362019 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-webhook-certs") pod "openstack-operator-controller-manager-57ddc6f479-h7khw" (UID: "2c86df2d-15dc-45f2-aca7-4200fdf36a53") : secret "webhook-server-cert" not found Mar 13 09:29:05 crc kubenswrapper[4841]: E0313 09:29:05.105588 4841 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 09:29:05 crc kubenswrapper[4841]: E0313 09:29:05.105666 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-metrics-certs podName:2c86df2d-15dc-45f2-aca7-4200fdf36a53 nodeName:}" failed. No retries permitted until 2026-03-13 09:29:13.105641523 +0000 UTC m=+1035.835541814 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-metrics-certs") pod "openstack-operator-controller-manager-57ddc6f479-h7khw" (UID: "2c86df2d-15dc-45f2-aca7-4200fdf36a53") : secret "metrics-server-cert" not found Mar 13 09:29:05 crc kubenswrapper[4841]: I0313 09:29:05.345103 4841 generic.go:334] "Generic (PLEG): container finished" podID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerID="6491cd972e8f18473231b5b2215720345c96ab2a0337886960a5d983df3b0e59" exitCode=0 Mar 13 09:29:05 crc kubenswrapper[4841]: I0313 09:29:05.345144 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerDied","Data":"6491cd972e8f18473231b5b2215720345c96ab2a0337886960a5d983df3b0e59"} Mar 13 09:29:05 crc kubenswrapper[4841]: I0313 09:29:05.345175 4841 scope.go:117] "RemoveContainer" containerID="5ed3bccb1da12fcd7dcfabd48b6eee04f275c5f16e821a0e4d8dce433f764913" Mar 13 09:29:09 crc kubenswrapper[4841]: E0313 09:29:09.577094 4841 log.go:32] 
"PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:6c9aef12f50be0b974f5e35b0d69303e7f7b95e6db5d41bcdb2d9d1100e921a6" Mar 13 09:29:09 crc kubenswrapper[4841]: E0313 09:29:09.577370 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:6c9aef12f50be0b974f5e35b0d69303e7f7b95e6db5d41bcdb2d9d1100e921a6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mh2gw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-77b6666d85-hm824_openstack-operators(e6b9c8a5-3093-4d94-ad46-cd682158fdf8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 09:29:09 crc kubenswrapper[4841]: E0313 09:29:09.578560 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-hm824" podUID="e6b9c8a5-3093-4d94-ad46-cd682158fdf8" Mar 13 09:29:10 crc kubenswrapper[4841]: E0313 09:29:10.379405 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:6c9aef12f50be0b974f5e35b0d69303e7f7b95e6db5d41bcdb2d9d1100e921a6\\\"\"" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-hm824" podUID="e6b9c8a5-3093-4d94-ad46-cd682158fdf8" Mar 13 09:29:11 crc kubenswrapper[4841]: E0313 09:29:11.161081 4841 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c" Mar 13 09:29:11 crc kubenswrapper[4841]: E0313 09:29:11.161257 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-96dh2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-677c674df7-r42dd_openstack-operators(10b15182-cc2b-420b-9fc2-fe3ca6ea38d7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 09:29:11 crc kubenswrapper[4841]: E0313 09:29:11.162464 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-r42dd" podUID="10b15182-cc2b-420b-9fc2-fe3ca6ea38d7" Mar 13 09:29:11 crc kubenswrapper[4841]: E0313 09:29:11.386179 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-r42dd" podUID="10b15182-cc2b-420b-9fc2-fe3ca6ea38d7" Mar 13 09:29:11 crc kubenswrapper[4841]: E0313 09:29:11.607694 4841 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721" Mar 13 09:29:11 crc kubenswrapper[4841]: E0313 09:29:11.607862 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-svwkt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-776c5696bf-8f2sj_openstack-operators(8b77ae90-8ef1-4e98-9d32-319dfdd55a6d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 09:29:11 crc kubenswrapper[4841]: E0313 09:29:11.609036 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-8f2sj" podUID="8b77ae90-8ef1-4e98-9d32-319dfdd55a6d" Mar 13 09:29:12 crc kubenswrapper[4841]: E0313 09:29:12.150881 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:9182d1816c6fdb093d6328f1b0bf39296b9eccfa495f35e2198ec4764fa6288f" Mar 13 09:29:12 crc kubenswrapper[4841]: E0313 09:29:12.151145 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:9182d1816c6fdb093d6328f1b0bf39296b9eccfa495f35e2198ec4764fa6288f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tctn5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6bbb499bbc-gb4dz_openstack-operators(98cec4ba-d672-4627-8d37-46a0684fc284): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 09:29:12 crc kubenswrapper[4841]: E0313 09:29:12.152914 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-gb4dz" podUID="98cec4ba-d672-4627-8d37-46a0684fc284" Mar 13 09:29:12 crc kubenswrapper[4841]: E0313 09:29:12.394994 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-8f2sj" podUID="8b77ae90-8ef1-4e98-9d32-319dfdd55a6d" Mar 13 09:29:12 crc kubenswrapper[4841]: E0313 09:29:12.395489 4841 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:9182d1816c6fdb093d6328f1b0bf39296b9eccfa495f35e2198ec4764fa6288f\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-gb4dz" podUID="98cec4ba-d672-4627-8d37-46a0684fc284" Mar 13 09:29:12 crc kubenswrapper[4841]: I0313 09:29:12.425714 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26236923-39c0-4b46-be0d-61f453533891-cert\") pod \"infra-operator-controller-manager-5995f4446f-vtz8l\" (UID: \"26236923-39c0-4b46-be0d-61f453533891\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vtz8l" Mar 13 09:29:12 crc kubenswrapper[4841]: E0313 09:29:12.426740 4841 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 09:29:12 crc kubenswrapper[4841]: E0313 09:29:12.426808 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26236923-39c0-4b46-be0d-61f453533891-cert podName:26236923-39c0-4b46-be0d-61f453533891 nodeName:}" failed. No retries permitted until 2026-03-13 09:29:28.426782553 +0000 UTC m=+1051.156682844 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26236923-39c0-4b46-be0d-61f453533891-cert") pod "infra-operator-controller-manager-5995f4446f-vtz8l" (UID: "26236923-39c0-4b46-be0d-61f453533891") : secret "infra-operator-webhook-server-cert" not found Mar 13 09:29:12 crc kubenswrapper[4841]: E0313 09:29:12.658318 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca" Mar 13 09:29:12 crc kubenswrapper[4841]: E0313 09:29:12.658520 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zppfn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-684f77d66d-56nwc_openstack-operators(4a76403b-081b-4222-a707-4cd00dd440a0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 09:29:12 crc kubenswrapper[4841]: E0313 09:29:12.659722 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-56nwc" podUID="4a76403b-081b-4222-a707-4cd00dd440a0" Mar 13 09:29:12 crc kubenswrapper[4841]: I0313 09:29:12.730081 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06f0c42d-1674-4913-8a86-1d1749d8d601-cert\") pod 
\"openstack-baremetal-operator-controller-manager-557ccf57b7lccpq\" (UID: \"06f0c42d-1674-4913-8a86-1d1749d8d601\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7lccpq" Mar 13 09:29:12 crc kubenswrapper[4841]: I0313 09:29:12.755971 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06f0c42d-1674-4913-8a86-1d1749d8d601-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7lccpq\" (UID: \"06f0c42d-1674-4913-8a86-1d1749d8d601\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7lccpq" Mar 13 09:29:12 crc kubenswrapper[4841]: I0313 09:29:12.800951 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-729ml" Mar 13 09:29:12 crc kubenswrapper[4841]: I0313 09:29:12.809898 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7lccpq" Mar 13 09:29:13 crc kubenswrapper[4841]: I0313 09:29:13.136693 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-webhook-certs\") pod \"openstack-operator-controller-manager-57ddc6f479-h7khw\" (UID: \"2c86df2d-15dc-45f2-aca7-4200fdf36a53\") " pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw" Mar 13 09:29:13 crc kubenswrapper[4841]: I0313 09:29:13.136844 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-metrics-certs\") pod \"openstack-operator-controller-manager-57ddc6f479-h7khw\" (UID: \"2c86df2d-15dc-45f2-aca7-4200fdf36a53\") " pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw" Mar 13 
09:29:13 crc kubenswrapper[4841]: E0313 09:29:13.137016 4841 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 09:29:13 crc kubenswrapper[4841]: E0313 09:29:13.137073 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-metrics-certs podName:2c86df2d-15dc-45f2-aca7-4200fdf36a53 nodeName:}" failed. No retries permitted until 2026-03-13 09:29:29.137058563 +0000 UTC m=+1051.866958744 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-metrics-certs") pod "openstack-operator-controller-manager-57ddc6f479-h7khw" (UID: "2c86df2d-15dc-45f2-aca7-4200fdf36a53") : secret "metrics-server-cert" not found Mar 13 09:29:13 crc kubenswrapper[4841]: E0313 09:29:13.137471 4841 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 09:29:13 crc kubenswrapper[4841]: E0313 09:29:13.137569 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-webhook-certs podName:2c86df2d-15dc-45f2-aca7-4200fdf36a53 nodeName:}" failed. No retries permitted until 2026-03-13 09:29:29.137548139 +0000 UTC m=+1051.867448340 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-webhook-certs") pod "openstack-operator-controller-manager-57ddc6f479-h7khw" (UID: "2c86df2d-15dc-45f2-aca7-4200fdf36a53") : secret "webhook-server-cert" not found Mar 13 09:29:13 crc kubenswrapper[4841]: E0313 09:29:13.190501 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922" Mar 13 09:29:13 crc kubenswrapper[4841]: E0313 09:29:13.190726 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5787w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-569cc54c5-chf9m_openstack-operators(f34e0b2d-5c3c-4725-ae0c-760bf98e90d3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 09:29:13 crc kubenswrapper[4841]: E0313 09:29:13.191888 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-chf9m" podUID="f34e0b2d-5c3c-4725-ae0c-760bf98e90d3" Mar 13 09:29:13 crc kubenswrapper[4841]: E0313 09:29:13.404435 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-56nwc" podUID="4a76403b-081b-4222-a707-4cd00dd440a0" Mar 13 09:29:13 crc kubenswrapper[4841]: E0313 09:29:13.404895 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922\\\"\"" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-chf9m" podUID="f34e0b2d-5c3c-4725-ae0c-760bf98e90d3" Mar 13 09:29:13 crc kubenswrapper[4841]: I0313 09:29:13.720293 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7lccpq"] Mar 13 09:29:13 crc kubenswrapper[4841]: W0313 09:29:13.784833 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06f0c42d_1674_4913_8a86_1d1749d8d601.slice/crio-5759d6e6d5dd8c437e50d52243118ff2b4d5c2c309f453f20bda854c48871818 WatchSource:0}: Error finding container 5759d6e6d5dd8c437e50d52243118ff2b4d5c2c309f453f20bda854c48871818: Status 404 returned error can't find the container with id 5759d6e6d5dd8c437e50d52243118ff2b4d5c2c309f453f20bda854c48871818 Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.418167 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7lccpq" event={"ID":"06f0c42d-1674-4913-8a86-1d1749d8d601","Type":"ContainerStarted","Data":"5759d6e6d5dd8c437e50d52243118ff2b4d5c2c309f453f20bda854c48871818"} Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.424726 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-cgkfz" event={"ID":"7bd9dd2f-b4fd-4078-b463-4e970fa6791d","Type":"ContainerStarted","Data":"04bf65810dd363ceaf3b3f423119efa8f99e8c8ea32c7cd918804d68a9c9847a"} Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.424790 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-cgkfz" Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.427646 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-7lhjg" event={"ID":"46ca7c55-bd68-4454-a014-85f81f1b5a60","Type":"ContainerStarted","Data":"2e757d9f99a12747a7d2fedf86921b4e1a5fa0de14edb8669b90856f99f6cda4"} Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.427826 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-7lhjg" Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.430986 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-bwvx6" event={"ID":"49a6b3bc-4db3-4006-b033-cc9cfa0cb5fc","Type":"ContainerStarted","Data":"43cbfb2d8dd288eae7db98e2bc7b327ffd0a1e645907c4fa4a4c300e99e8f8a8"} Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.431637 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-bwvx6" Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.436190 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerStarted","Data":"8dc018ca0f90a95ed2ddf32ba76ace2f8d1b621b17b9ee14fcc045e6a5af19f7"} Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.444788 4841 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-kc2zl" event={"ID":"c6c9dfcd-5298-468b-9de2-0280bf525b61","Type":"ContainerStarted","Data":"1bb3599d99e788677949a20536d96e74009cd1a08890df6fd1afddaddf415f27"} Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.444867 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-kc2zl" Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.446525 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-cgkfz" podStartSLOduration=2.900689331 podStartE2EDuration="18.446514307s" podCreationTimestamp="2026-03-13 09:28:56 +0000 UTC" firstStartedPulling="2026-03-13 09:28:57.659183716 +0000 UTC m=+1020.389083907" lastFinishedPulling="2026-03-13 09:29:13.205008692 +0000 UTC m=+1035.934908883" observedRunningTime="2026-03-13 09:29:14.43922488 +0000 UTC m=+1037.169125081" watchObservedRunningTime="2026-03-13 09:29:14.446514307 +0000 UTC m=+1037.176414498" Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.455388 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-j7v7h" event={"ID":"3a7d5a0b-0bd7-4735-b182-8a78870050cf","Type":"ContainerStarted","Data":"4c89a9f9bd9c51839567c2902c6b08cc7629e8e0937c234d7443a79bea0910f0"} Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.455699 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-j7v7h" Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.465602 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-fkfjh" 
event={"ID":"9db8c27e-023c-4e28-a381-24f4438a6add","Type":"ContainerStarted","Data":"0df0ab5c22b0954803f5482b82bc3d7b8c3f881daf802e41506da42f9e2eb092"} Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.465721 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-fkfjh" Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.468260 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-grn97" event={"ID":"bad08b57-dde0-496d-8ea1-5845a52d517a","Type":"ContainerStarted","Data":"190268b2638145f71eafaa17c033644e32d406808c04d7b410bc4c2f8119e30c"} Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.468391 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-grn97" Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.471024 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rx4wh" event={"ID":"90c1dec3-4daa-4ac6-b95e-209cb8bd9b55","Type":"ContainerStarted","Data":"ff95da7dd4e6d097b49e4ced0d48c5cc25d5700b9c67fad60700727007b3fdd5"} Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.471478 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rx4wh" Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.478059 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-7lhjg" podStartSLOduration=3.109570872 podStartE2EDuration="18.478046235s" podCreationTimestamp="2026-03-13 09:28:56 +0000 UTC" firstStartedPulling="2026-03-13 09:28:57.816879469 +0000 UTC m=+1020.546779660" lastFinishedPulling="2026-03-13 09:29:13.185354832 +0000 UTC m=+1035.915255023" 
observedRunningTime="2026-03-13 09:29:14.470697887 +0000 UTC m=+1037.200598068" watchObservedRunningTime="2026-03-13 09:29:14.478046235 +0000 UTC m=+1037.207946426" Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.499492 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-bwvx6" podStartSLOduration=3.401193323 podStartE2EDuration="18.499475421s" podCreationTimestamp="2026-03-13 09:28:56 +0000 UTC" firstStartedPulling="2026-03-13 09:28:58.111371858 +0000 UTC m=+1020.841272049" lastFinishedPulling="2026-03-13 09:29:13.209653956 +0000 UTC m=+1035.939554147" observedRunningTime="2026-03-13 09:29:14.48916004 +0000 UTC m=+1037.219060231" watchObservedRunningTime="2026-03-13 09:29:14.499475421 +0000 UTC m=+1037.229375612" Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.524812 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rx4wh" podStartSLOduration=3.266203443 podStartE2EDuration="18.524796216s" podCreationTimestamp="2026-03-13 09:28:56 +0000 UTC" firstStartedPulling="2026-03-13 09:28:57.951067643 +0000 UTC m=+1020.680967824" lastFinishedPulling="2026-03-13 09:29:13.209660406 +0000 UTC m=+1035.939560597" observedRunningTime="2026-03-13 09:29:14.518656386 +0000 UTC m=+1037.248556577" watchObservedRunningTime="2026-03-13 09:29:14.524796216 +0000 UTC m=+1037.254696407" Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.563474 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-fkfjh" podStartSLOduration=2.672204411 podStartE2EDuration="18.563453006s" podCreationTimestamp="2026-03-13 09:28:56 +0000 UTC" firstStartedPulling="2026-03-13 09:28:57.318937858 +0000 UTC m=+1020.048838049" lastFinishedPulling="2026-03-13 09:29:13.210186453 +0000 UTC m=+1035.940086644" 
observedRunningTime="2026-03-13 09:29:14.545906161 +0000 UTC m=+1037.275806352" watchObservedRunningTime="2026-03-13 09:29:14.563453006 +0000 UTC m=+1037.293353197" Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.592069 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-grn97" podStartSLOduration=3.252168548 podStartE2EDuration="18.592053273s" podCreationTimestamp="2026-03-13 09:28:56 +0000 UTC" firstStartedPulling="2026-03-13 09:28:57.845434166 +0000 UTC m=+1020.575334357" lastFinishedPulling="2026-03-13 09:29:13.185318891 +0000 UTC m=+1035.915219082" observedRunningTime="2026-03-13 09:29:14.586911894 +0000 UTC m=+1037.316812095" watchObservedRunningTime="2026-03-13 09:29:14.592053273 +0000 UTC m=+1037.321953464" Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.592512 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-kc2zl" podStartSLOduration=3.03183733 podStartE2EDuration="18.592507948s" podCreationTimestamp="2026-03-13 09:28:56 +0000 UTC" firstStartedPulling="2026-03-13 09:28:57.647501403 +0000 UTC m=+1020.377401594" lastFinishedPulling="2026-03-13 09:29:13.208172021 +0000 UTC m=+1035.938072212" observedRunningTime="2026-03-13 09:29:14.56390744 +0000 UTC m=+1037.293807651" watchObservedRunningTime="2026-03-13 09:29:14.592507948 +0000 UTC m=+1037.322408139" Mar 13 09:29:14 crc kubenswrapper[4841]: I0313 09:29:14.609669 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-j7v7h" podStartSLOduration=3.553560319 podStartE2EDuration="18.609625418s" podCreationTimestamp="2026-03-13 09:28:56 +0000 UTC" firstStartedPulling="2026-03-13 09:28:58.126893309 +0000 UTC m=+1020.856793500" lastFinishedPulling="2026-03-13 09:29:13.182958248 +0000 UTC m=+1035.912858599" 
observedRunningTime="2026-03-13 09:29:14.602038743 +0000 UTC m=+1037.331938934" watchObservedRunningTime="2026-03-13 09:29:14.609625418 +0000 UTC m=+1037.339525609"
Mar 13 09:29:20 crc kubenswrapper[4841]: I0313 09:29:20.527431 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-8485bdb9db-mf5lp" event={"ID":"17824e5f-18b3-46c0-910a-56e5529e09c3","Type":"ContainerStarted","Data":"b3bc0a9a49ffb97a19f097a2dad5d9efdacd80b9aa771d9d292d287d11212b5a"}
Mar 13 09:29:20 crc kubenswrapper[4841]: I0313 09:29:20.528952 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-pvtvx" event={"ID":"f5ae27e8-47b9-437c-9506-f51da1b6c9f8","Type":"ContainerStarted","Data":"46c8fe498f0c8273820dcaba826319756ba56e10d35e847addf060e6e389de0c"}
Mar 13 09:29:20 crc kubenswrapper[4841]: I0313 09:29:20.529149 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-8485bdb9db-mf5lp"
Mar 13 09:29:20 crc kubenswrapper[4841]: I0313 09:29:20.530236 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7lccpq" event={"ID":"06f0c42d-1674-4913-8a86-1d1749d8d601","Type":"ContainerStarted","Data":"88d3610675967d5c297f9f2851e28dfb167cbb01380c98956a451fb1bc8418de"}
Mar 13 09:29:20 crc kubenswrapper[4841]: I0313 09:29:20.530372 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7lccpq"
Mar 13 09:29:20 crc kubenswrapper[4841]: I0313 09:29:20.531676 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8v6pq" event={"ID":"35cf5dc3-b1c0-4481-8f0f-8bca19ecadd1","Type":"ContainerStarted","Data":"8dbd808efddf798b7f46355d464da50414e364225df86ae01eaeb1f20fe13af8"}
Mar 13 09:29:20 crc kubenswrapper[4841]: I0313 09:29:20.531899 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8v6pq"
Mar 13 09:29:20 crc kubenswrapper[4841]: I0313 09:29:20.540942 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-8tg7w" event={"ID":"b877a309-e752-4f24-90cd-6901973263e3","Type":"ContainerStarted","Data":"f89eb5dff3f5f40184125d748dbec78a51bc34d7104146ced939ef075de4daff"}
Mar 13 09:29:20 crc kubenswrapper[4841]: I0313 09:29:20.541646 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-8tg7w"
Mar 13 09:29:20 crc kubenswrapper[4841]: I0313 09:29:20.542977 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nd9d9" event={"ID":"daeb73dc-4973-4a0b-906d-4afc7f61717c","Type":"ContainerStarted","Data":"1bd6c63c08b3df11918ff7b0772d0d9af61528aecec5d4fd4d4a7c2c0b8ad50f"}
Mar 13 09:29:20 crc kubenswrapper[4841]: I0313 09:29:20.544607 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9gcr6" event={"ID":"44d006b0-b13e-49ce-8ff8-592f3d8798c1","Type":"ContainerStarted","Data":"ea9c8560e2a34da22d47a54065fcf8d2d4283edf9dafb9342a0f606a831f8ca5"}
Mar 13 09:29:20 crc kubenswrapper[4841]: I0313 09:29:20.545022 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9gcr6"
Mar 13 09:29:20 crc kubenswrapper[4841]: I0313 09:29:20.613419 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9gcr6" podStartSLOduration=3.197288398 podStartE2EDuration="24.613399191s" podCreationTimestamp="2026-03-13 09:28:56 +0000 UTC" firstStartedPulling="2026-03-13 09:28:58.374926525 +0000 UTC m=+1021.104826716" lastFinishedPulling="2026-03-13 09:29:19.791037318 +0000 UTC m=+1042.520937509" observedRunningTime="2026-03-13 09:29:20.603490395 +0000 UTC m=+1043.333390596" watchObservedRunningTime="2026-03-13 09:29:20.613399191 +0000 UTC m=+1043.343299382"
Mar 13 09:29:20 crc kubenswrapper[4841]: I0313 09:29:20.613714 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-8485bdb9db-mf5lp" podStartSLOduration=3.183927126 podStartE2EDuration="24.613709351s" podCreationTimestamp="2026-03-13 09:28:56 +0000 UTC" firstStartedPulling="2026-03-13 09:28:58.372057696 +0000 UTC m=+1021.101957887" lastFinishedPulling="2026-03-13 09:29:19.801839881 +0000 UTC m=+1042.531740112" observedRunningTime="2026-03-13 09:29:20.571539659 +0000 UTC m=+1043.301439850" watchObservedRunningTime="2026-03-13 09:29:20.613709351 +0000 UTC m=+1043.343609542"
Mar 13 09:29:20 crc kubenswrapper[4841]: I0313 09:29:20.642705 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nd9d9" podStartSLOduration=2.186998062 podStartE2EDuration="23.642686244s" podCreationTimestamp="2026-03-13 09:28:57 +0000 UTC" firstStartedPulling="2026-03-13 09:28:58.376789794 +0000 UTC m=+1021.106689985" lastFinishedPulling="2026-03-13 09:29:19.832477976 +0000 UTC m=+1042.562378167" observedRunningTime="2026-03-13 09:29:20.642480098 +0000 UTC m=+1043.372380289" watchObservedRunningTime="2026-03-13 09:29:20.642686244 +0000 UTC m=+1043.372586435"
Mar 13 09:29:20 crc kubenswrapper[4841]: I0313 09:29:20.670882 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-8tg7w" podStartSLOduration=3.244700252 podStartE2EDuration="24.670864883s" podCreationTimestamp="2026-03-13 09:28:56 +0000 UTC" firstStartedPulling="2026-03-13 09:28:58.37666759 +0000 UTC m=+1021.106567781" lastFinishedPulling="2026-03-13 09:29:19.802832221 +0000 UTC m=+1042.532732412" observedRunningTime="2026-03-13 09:29:20.665485488 +0000 UTC m=+1043.395385689" watchObservedRunningTime="2026-03-13 09:29:20.670864883 +0000 UTC m=+1043.400765084"
Mar 13 09:29:20 crc kubenswrapper[4841]: I0313 09:29:20.693050 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8v6pq" podStartSLOduration=3.278550035 podStartE2EDuration="24.693028787s" podCreationTimestamp="2026-03-13 09:28:56 +0000 UTC" firstStartedPulling="2026-03-13 09:28:58.375632737 +0000 UTC m=+1021.105532928" lastFinishedPulling="2026-03-13 09:29:19.790111489 +0000 UTC m=+1042.520011680" observedRunningTime="2026-03-13 09:29:20.69052886 +0000 UTC m=+1043.420429051" watchObservedRunningTime="2026-03-13 09:29:20.693028787 +0000 UTC m=+1043.422928978"
Mar 13 09:29:20 crc kubenswrapper[4841]: I0313 09:29:20.716008 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-pvtvx" podStartSLOduration=2.866476581 podStartE2EDuration="23.715991876s" podCreationTimestamp="2026-03-13 09:28:57 +0000 UTC" firstStartedPulling="2026-03-13 09:28:58.406700692 +0000 UTC m=+1021.136600883" lastFinishedPulling="2026-03-13 09:29:19.256215987 +0000 UTC m=+1041.986116178" observedRunningTime="2026-03-13 09:29:20.715723648 +0000 UTC m=+1043.445623859" watchObservedRunningTime="2026-03-13 09:29:20.715991876 +0000 UTC m=+1043.445892067"
Mar 13 09:29:20 crc kubenswrapper[4841]: I0313 09:29:20.753457 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7lccpq" podStartSLOduration=18.791365191 podStartE2EDuration="24.753442352s" podCreationTimestamp="2026-03-13 09:28:56 +0000 UTC" firstStartedPulling="2026-03-13 09:29:13.788391085 +0000 UTC m=+1036.518291276" lastFinishedPulling="2026-03-13 09:29:19.750468246 +0000 UTC m=+1042.480368437" observedRunningTime="2026-03-13 09:29:20.750431699 +0000 UTC m=+1043.480331890" watchObservedRunningTime="2026-03-13 09:29:20.753442352 +0000 UTC m=+1043.483342543"
Mar 13 09:29:23 crc kubenswrapper[4841]: I0313 09:29:23.574578 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-r42dd" event={"ID":"10b15182-cc2b-420b-9fc2-fe3ca6ea38d7","Type":"ContainerStarted","Data":"38d542eff321d568a45fac8490f1d5183e9a73a80ef1519ac31177f7df779e01"}
Mar 13 09:29:23 crc kubenswrapper[4841]: I0313 09:29:23.576410 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-677c674df7-r42dd"
Mar 13 09:29:23 crc kubenswrapper[4841]: I0313 09:29:23.601628 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-677c674df7-r42dd" podStartSLOduration=3.52796952 podStartE2EDuration="27.601591318s" podCreationTimestamp="2026-03-13 09:28:56 +0000 UTC" firstStartedPulling="2026-03-13 09:28:58.357832296 +0000 UTC m=+1021.087732487" lastFinishedPulling="2026-03-13 09:29:22.431454064 +0000 UTC m=+1045.161354285" observedRunningTime="2026-03-13 09:29:23.597042168 +0000 UTC m=+1046.326942399" watchObservedRunningTime="2026-03-13 09:29:23.601591318 +0000 UTC m=+1046.331491559"
Mar 13 09:29:25 crc kubenswrapper[4841]: I0313 09:29:25.597414 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-hm824" event={"ID":"e6b9c8a5-3093-4d94-ad46-cd682158fdf8","Type":"ContainerStarted","Data":"d03ca2d25eeea6f7c6f39230913fb8dfc2a9f8a3e93391a6342641f5cc371311"}
Mar 13 09:29:25 crc kubenswrapper[4841]: I0313 09:29:25.597950 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-hm824"
Mar 13 09:29:25 crc kubenswrapper[4841]: I0313 09:29:25.599531 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-56nwc" event={"ID":"4a76403b-081b-4222-a707-4cd00dd440a0","Type":"ContainerStarted","Data":"c88f8cae073debc68a84c4f4b430244e096ef41667b225b068989165b65fa6ba"}
Mar 13 09:29:25 crc kubenswrapper[4841]: I0313 09:29:25.599971 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-56nwc"
Mar 13 09:29:25 crc kubenswrapper[4841]: I0313 09:29:25.632050 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-hm824" podStartSLOduration=3.130032875 podStartE2EDuration="29.632024715s" podCreationTimestamp="2026-03-13 09:28:56 +0000 UTC" firstStartedPulling="2026-03-13 09:28:57.928791372 +0000 UTC m=+1020.658691563" lastFinishedPulling="2026-03-13 09:29:24.430783172 +0000 UTC m=+1047.160683403" observedRunningTime="2026-03-13 09:29:25.62471286 +0000 UTC m=+1048.354613061" watchObservedRunningTime="2026-03-13 09:29:25.632024715 +0000 UTC m=+1048.361924956"
Mar 13 09:29:25 crc kubenswrapper[4841]: I0313 09:29:25.643487 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-56nwc" podStartSLOduration=3.162364156 podStartE2EDuration="29.643472068s" podCreationTimestamp="2026-03-13 09:28:56 +0000 UTC" firstStartedPulling="2026-03-13 09:28:57.950868347 +0000 UTC m=+1020.680768538" lastFinishedPulling="2026-03-13 09:29:24.431976229 +0000 UTC m=+1047.161876450" observedRunningTime="2026-03-13 09:29:25.642732815 +0000 UTC m=+1048.372633046" watchObservedRunningTime="2026-03-13 09:29:25.643472068 +0000 UTC m=+1048.373372279"
Mar 13 09:29:26 crc kubenswrapper[4841]: I0313 09:29:26.611335 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-8f2sj" event={"ID":"8b77ae90-8ef1-4e98-9d32-319dfdd55a6d","Type":"ContainerStarted","Data":"cd3685d76fe56abd6f6cdc43137dd361bbea20ce68d77b0ccb3160e76fc52df8"}
Mar 13 09:29:26 crc kubenswrapper[4841]: I0313 09:29:26.611649 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-8f2sj"
Mar 13 09:29:26 crc kubenswrapper[4841]: I0313 09:29:26.616606 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-gb4dz" event={"ID":"98cec4ba-d672-4627-8d37-46a0684fc284","Type":"ContainerStarted","Data":"4d25a9767607e21440ea06e4c444c35aee9b5a0a6a6d76b6fe471cee323b825d"}
Mar 13 09:29:26 crc kubenswrapper[4841]: I0313 09:29:26.633474 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-8f2sj" podStartSLOduration=2.597266276 podStartE2EDuration="30.633448794s" podCreationTimestamp="2026-03-13 09:28:56 +0000 UTC" firstStartedPulling="2026-03-13 09:28:58.368955691 +0000 UTC m=+1021.098855882" lastFinishedPulling="2026-03-13 09:29:26.405138219 +0000 UTC m=+1049.135038400" observedRunningTime="2026-03-13 09:29:26.628228182 +0000 UTC m=+1049.358128413" watchObservedRunningTime="2026-03-13 09:29:26.633448794 +0000 UTC m=+1049.363349005"
Mar 13 09:29:26 crc kubenswrapper[4841]: I0313 09:29:26.650637 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-gb4dz" podStartSLOduration=2.943803582 podStartE2EDuration="30.650615712s" podCreationTimestamp="2026-03-13 09:28:56 +0000 UTC" firstStartedPulling="2026-03-13 09:28:57.829434999 +0000 UTC m=+1020.559335190" lastFinishedPulling="2026-03-13 09:29:25.536247119 +0000 UTC m=+1048.266147320" observedRunningTime="2026-03-13 09:29:26.646773084 +0000 UTC m=+1049.376673275" watchObservedRunningTime="2026-03-13 09:29:26.650615712 +0000 UTC m=+1049.380515913"
Mar 13 09:29:26 crc kubenswrapper[4841]: I0313 09:29:26.768520 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-fkfjh"
Mar 13 09:29:26 crc kubenswrapper[4841]: I0313 09:29:26.774097 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-cgkfz"
Mar 13 09:29:26 crc kubenswrapper[4841]: I0313 09:29:26.837444 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-kc2zl"
Mar 13 09:29:26 crc kubenswrapper[4841]: I0313 09:29:26.867682 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-7lhjg"
Mar 13 09:29:26 crc kubenswrapper[4841]: I0313 09:29:26.923988 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-grn97"
Mar 13 09:29:26 crc kubenswrapper[4841]: I0313 09:29:26.939832 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-gb4dz"
Mar 13 09:29:27 crc kubenswrapper[4841]: I0313 09:29:27.047450 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9gcr6"
Mar 13 09:29:27 crc kubenswrapper[4841]: I0313 09:29:27.068766 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-rx4wh"
Mar 13 09:29:27 crc kubenswrapper[4841]: I0313 09:29:27.177357 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-8tg7w"
Mar 13 09:29:27 crc kubenswrapper[4841]: I0313 09:29:27.257126 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-j7v7h"
Mar 13 09:29:27 crc kubenswrapper[4841]: I0313 09:29:27.331392 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-bwvx6"
Mar 13 09:29:27 crc kubenswrapper[4841]: I0313 09:29:27.484516 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-677c674df7-r42dd"
Mar 13 09:29:27 crc kubenswrapper[4841]: I0313 09:29:27.519896 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-8485bdb9db-mf5lp"
Mar 13 09:29:27 crc kubenswrapper[4841]: I0313 09:29:27.559355 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8v6pq"
Mar 13 09:29:27 crc kubenswrapper[4841]: I0313 09:29:27.635432 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-pvtvx"
Mar 13 09:29:27 crc kubenswrapper[4841]: I0313 09:29:27.637121 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-pvtvx"
Mar 13 09:29:28 crc kubenswrapper[4841]: I0313 09:29:28.475710 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26236923-39c0-4b46-be0d-61f453533891-cert\") pod \"infra-operator-controller-manager-5995f4446f-vtz8l\" (UID: \"26236923-39c0-4b46-be0d-61f453533891\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vtz8l"
Mar 13 09:29:28 crc kubenswrapper[4841]: I0313 09:29:28.485454 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26236923-39c0-4b46-be0d-61f453533891-cert\") pod \"infra-operator-controller-manager-5995f4446f-vtz8l\" (UID: \"26236923-39c0-4b46-be0d-61f453533891\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vtz8l"
Mar 13 09:29:28 crc kubenswrapper[4841]: I0313 09:29:28.680126 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-2rtrk"
Mar 13 09:29:28 crc kubenswrapper[4841]: I0313 09:29:28.688629 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vtz8l"
Mar 13 09:29:29 crc kubenswrapper[4841]: I0313 09:29:29.130820 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-vtz8l"]
Mar 13 09:29:29 crc kubenswrapper[4841]: W0313 09:29:29.133548 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26236923_39c0_4b46_be0d_61f453533891.slice/crio-fc9462df730aca288ed2006c6044c50d472b3d32461f212a33d4f2f61e107cb5 WatchSource:0}: Error finding container fc9462df730aca288ed2006c6044c50d472b3d32461f212a33d4f2f61e107cb5: Status 404 returned error can't find the container with id fc9462df730aca288ed2006c6044c50d472b3d32461f212a33d4f2f61e107cb5
Mar 13 09:29:29 crc kubenswrapper[4841]: I0313 09:29:29.186283 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-webhook-certs\") pod \"openstack-operator-controller-manager-57ddc6f479-h7khw\" (UID: \"2c86df2d-15dc-45f2-aca7-4200fdf36a53\") " pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw"
Mar 13 09:29:29 crc kubenswrapper[4841]: I0313 09:29:29.186415 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-metrics-certs\") pod \"openstack-operator-controller-manager-57ddc6f479-h7khw\" (UID: \"2c86df2d-15dc-45f2-aca7-4200fdf36a53\") " pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw"
Mar 13 09:29:29 crc kubenswrapper[4841]: I0313 09:29:29.194751 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-metrics-certs\") pod \"openstack-operator-controller-manager-57ddc6f479-h7khw\" (UID: \"2c86df2d-15dc-45f2-aca7-4200fdf36a53\") " pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw"
Mar 13 09:29:29 crc kubenswrapper[4841]: I0313 09:29:29.195455 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c86df2d-15dc-45f2-aca7-4200fdf36a53-webhook-certs\") pod \"openstack-operator-controller-manager-57ddc6f479-h7khw\" (UID: \"2c86df2d-15dc-45f2-aca7-4200fdf36a53\") " pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw"
Mar 13 09:29:29 crc kubenswrapper[4841]: I0313 09:29:29.452989 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-7jsqx"
Mar 13 09:29:29 crc kubenswrapper[4841]: I0313 09:29:29.461393 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw"
Mar 13 09:29:29 crc kubenswrapper[4841]: I0313 09:29:29.645580 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-chf9m" event={"ID":"f34e0b2d-5c3c-4725-ae0c-760bf98e90d3","Type":"ContainerStarted","Data":"ec133a6eab3307b5355898dc6442ecf879fd359799b311fa8c69f6103e20953c"}
Mar 13 09:29:29 crc kubenswrapper[4841]: I0313 09:29:29.645818 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-chf9m"
Mar 13 09:29:29 crc kubenswrapper[4841]: I0313 09:29:29.647967 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vtz8l" event={"ID":"26236923-39c0-4b46-be0d-61f453533891","Type":"ContainerStarted","Data":"fc9462df730aca288ed2006c6044c50d472b3d32461f212a33d4f2f61e107cb5"}
Mar 13 09:29:29 crc kubenswrapper[4841]: I0313 09:29:29.667799 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-chf9m" podStartSLOduration=2.752286019 podStartE2EDuration="33.667781315s" podCreationTimestamp="2026-03-13 09:28:56 +0000 UTC" firstStartedPulling="2026-03-13 09:28:58.314383847 +0000 UTC m=+1021.044284038" lastFinishedPulling="2026-03-13 09:29:29.229879133 +0000 UTC m=+1051.959779334" observedRunningTime="2026-03-13 09:29:29.664808033 +0000 UTC m=+1052.394708244" watchObservedRunningTime="2026-03-13 09:29:29.667781315 +0000 UTC m=+1052.397681516"
Mar 13 09:29:29 crc kubenswrapper[4841]: I0313 09:29:29.749863 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw"]
Mar 13 09:29:29 crc kubenswrapper[4841]: W0313 09:29:29.756180 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c86df2d_15dc_45f2_aca7_4200fdf36a53.slice/crio-1da32ffc0b50b6e22570ec753bd77eb2550ba345b18b48838572c70d384398cb WatchSource:0}: Error finding container 1da32ffc0b50b6e22570ec753bd77eb2550ba345b18b48838572c70d384398cb: Status 404 returned error can't find the container with id 1da32ffc0b50b6e22570ec753bd77eb2550ba345b18b48838572c70d384398cb
Mar 13 09:29:30 crc kubenswrapper[4841]: I0313 09:29:30.655945 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw" event={"ID":"2c86df2d-15dc-45f2-aca7-4200fdf36a53","Type":"ContainerStarted","Data":"8aed9406484bab5fd16629e3949a4a9beb5ad3f32f6b562d14860a9e7544cbf1"}
Mar 13 09:29:30 crc kubenswrapper[4841]: I0313 09:29:30.656547 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw" event={"ID":"2c86df2d-15dc-45f2-aca7-4200fdf36a53","Type":"ContainerStarted","Data":"1da32ffc0b50b6e22570ec753bd77eb2550ba345b18b48838572c70d384398cb"}
Mar 13 09:29:30 crc kubenswrapper[4841]: I0313 09:29:30.656578 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw"
Mar 13 09:29:30 crc kubenswrapper[4841]: I0313 09:29:30.680250 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw" podStartSLOduration=33.680231142 podStartE2EDuration="33.680231142s" podCreationTimestamp="2026-03-13 09:28:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:29:30.675856128 +0000 UTC m=+1053.405756329" watchObservedRunningTime="2026-03-13 09:29:30.680231142 +0000 UTC m=+1053.410131333"
Mar 13 09:29:31 crc kubenswrapper[4841]: I0313 09:29:31.664686 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vtz8l" event={"ID":"26236923-39c0-4b46-be0d-61f453533891","Type":"ContainerStarted","Data":"eb8f486e4409e8b3eba559f10b2b9069fa653df2ea2ce5da76653a4edabd01e1"}
Mar 13 09:29:31 crc kubenswrapper[4841]: I0313 09:29:31.687556 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vtz8l" podStartSLOduration=33.829722781 podStartE2EDuration="35.687539582s" podCreationTimestamp="2026-03-13 09:28:56 +0000 UTC" firstStartedPulling="2026-03-13 09:29:29.136038598 +0000 UTC m=+1051.865938829" lastFinishedPulling="2026-03-13 09:29:30.993855439 +0000 UTC m=+1053.723755630" observedRunningTime="2026-03-13 09:29:31.682514857 +0000 UTC m=+1054.412415078" watchObservedRunningTime="2026-03-13 09:29:31.687539582 +0000 UTC m=+1054.417439783"
Mar 13 09:29:32 crc kubenswrapper[4841]: I0313 09:29:32.671102 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vtz8l"
Mar 13 09:29:32 crc kubenswrapper[4841]: I0313 09:29:32.818071 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7lccpq"
Mar 13 09:29:36 crc kubenswrapper[4841]: I0313 09:29:36.908581 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-hm824"
Mar 13 09:29:36 crc kubenswrapper[4841]: I0313 09:29:36.942571 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-gb4dz"
Mar 13 09:29:37 crc kubenswrapper[4841]: I0313 09:29:37.023257 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-56nwc"
Mar 13 09:29:37 crc kubenswrapper[4841]: I0313 09:29:37.150123 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-8f2sj"
Mar 13 09:29:37 crc kubenswrapper[4841]: I0313 09:29:37.173425 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-chf9m"
Mar 13 09:29:38 crc kubenswrapper[4841]: I0313 09:29:38.695474 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vtz8l"
Mar 13 09:29:39 crc kubenswrapper[4841]: I0313 09:29:39.473071 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-57ddc6f479-h7khw"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.121222 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bkp2w"]
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.123253 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bkp2w"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.125298 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.127627 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-xxdrh"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.127790 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.127923 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.135199 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bkp2w"]
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.167476 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fzt86"]
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.168683 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fzt86"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.173792 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fzt86"]
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.180064 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.309343 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/375ba94d-ec2a-4e94-96fb-2f205f7115b9-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fzt86\" (UID: \"375ba94d-ec2a-4e94-96fb-2f205f7115b9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fzt86"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.309412 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slftm\" (UniqueName: \"kubernetes.io/projected/375ba94d-ec2a-4e94-96fb-2f205f7115b9-kube-api-access-slftm\") pod \"dnsmasq-dns-78dd6ddcc-fzt86\" (UID: \"375ba94d-ec2a-4e94-96fb-2f205f7115b9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fzt86"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.309442 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/487af2ae-3b25-48b9-a77e-fa1ed59ed8a0-config\") pod \"dnsmasq-dns-675f4bcbfc-bkp2w\" (UID: \"487af2ae-3b25-48b9-a77e-fa1ed59ed8a0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bkp2w"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.309477 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84hrv\" (UniqueName: \"kubernetes.io/projected/487af2ae-3b25-48b9-a77e-fa1ed59ed8a0-kube-api-access-84hrv\") pod \"dnsmasq-dns-675f4bcbfc-bkp2w\" (UID: \"487af2ae-3b25-48b9-a77e-fa1ed59ed8a0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bkp2w"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.309508 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/375ba94d-ec2a-4e94-96fb-2f205f7115b9-config\") pod \"dnsmasq-dns-78dd6ddcc-fzt86\" (UID: \"375ba94d-ec2a-4e94-96fb-2f205f7115b9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fzt86"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.411053 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slftm\" (UniqueName: \"kubernetes.io/projected/375ba94d-ec2a-4e94-96fb-2f205f7115b9-kube-api-access-slftm\") pod \"dnsmasq-dns-78dd6ddcc-fzt86\" (UID: \"375ba94d-ec2a-4e94-96fb-2f205f7115b9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fzt86"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.411477 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/487af2ae-3b25-48b9-a77e-fa1ed59ed8a0-config\") pod \"dnsmasq-dns-675f4bcbfc-bkp2w\" (UID: \"487af2ae-3b25-48b9-a77e-fa1ed59ed8a0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bkp2w"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.411627 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84hrv\" (UniqueName: \"kubernetes.io/projected/487af2ae-3b25-48b9-a77e-fa1ed59ed8a0-kube-api-access-84hrv\") pod \"dnsmasq-dns-675f4bcbfc-bkp2w\" (UID: \"487af2ae-3b25-48b9-a77e-fa1ed59ed8a0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bkp2w"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.411779 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/375ba94d-ec2a-4e94-96fb-2f205f7115b9-config\") pod \"dnsmasq-dns-78dd6ddcc-fzt86\" (UID: \"375ba94d-ec2a-4e94-96fb-2f205f7115b9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fzt86"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.411962 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/375ba94d-ec2a-4e94-96fb-2f205f7115b9-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fzt86\" (UID: \"375ba94d-ec2a-4e94-96fb-2f205f7115b9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fzt86"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.412796 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/487af2ae-3b25-48b9-a77e-fa1ed59ed8a0-config\") pod \"dnsmasq-dns-675f4bcbfc-bkp2w\" (UID: \"487af2ae-3b25-48b9-a77e-fa1ed59ed8a0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bkp2w"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.412903 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/375ba94d-ec2a-4e94-96fb-2f205f7115b9-config\") pod \"dnsmasq-dns-78dd6ddcc-fzt86\" (UID: \"375ba94d-ec2a-4e94-96fb-2f205f7115b9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fzt86"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.413133 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/375ba94d-ec2a-4e94-96fb-2f205f7115b9-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fzt86\" (UID: \"375ba94d-ec2a-4e94-96fb-2f205f7115b9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fzt86"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.434697 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84hrv\" (UniqueName: \"kubernetes.io/projected/487af2ae-3b25-48b9-a77e-fa1ed59ed8a0-kube-api-access-84hrv\") pod \"dnsmasq-dns-675f4bcbfc-bkp2w\" (UID: \"487af2ae-3b25-48b9-a77e-fa1ed59ed8a0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bkp2w"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.434823 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slftm\" (UniqueName: \"kubernetes.io/projected/375ba94d-ec2a-4e94-96fb-2f205f7115b9-kube-api-access-slftm\") pod \"dnsmasq-dns-78dd6ddcc-fzt86\" (UID: \"375ba94d-ec2a-4e94-96fb-2f205f7115b9\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fzt86"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.452394 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bkp2w"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.509593 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fzt86"
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.777326 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fzt86"]
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.878715 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-fzt86" event={"ID":"375ba94d-ec2a-4e94-96fb-2f205f7115b9","Type":"ContainerStarted","Data":"da1356a59b2931ea703ef28c1e4e71db31b4ce33b6bb02d5aef208e56ffb6eff"}
Mar 13 09:29:56 crc kubenswrapper[4841]: W0313 09:29:56.913858 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod487af2ae_3b25_48b9_a77e_fa1ed59ed8a0.slice/crio-4ab943ed2d6ba707872d36ea47cfd86823cee254a0e8df14a92433f2a5cea1ce WatchSource:0}: Error finding container 4ab943ed2d6ba707872d36ea47cfd86823cee254a0e8df14a92433f2a5cea1ce: Status 404 returned error can't find the container with id 4ab943ed2d6ba707872d36ea47cfd86823cee254a0e8df14a92433f2a5cea1ce
Mar 13 09:29:56 crc kubenswrapper[4841]: I0313 09:29:56.915374 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bkp2w"]
Mar 13 09:29:57 crc kubenswrapper[4841]: I0313 09:29:57.898763 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-bkp2w" event={"ID":"487af2ae-3b25-48b9-a77e-fa1ed59ed8a0","Type":"ContainerStarted","Data":"4ab943ed2d6ba707872d36ea47cfd86823cee254a0e8df14a92433f2a5cea1ce"}
Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.001495 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bkp2w"]
Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.015313 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-h8cgb"]
Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.016386 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb"
Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.027908 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-h8cgb"]
Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.056170 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef6ab592-8f4f-4c5c-acad-a3420b519edc-config\") pod \"dnsmasq-dns-5ccc8479f9-h8cgb\" (UID: \"ef6ab592-8f4f-4c5c-acad-a3420b519edc\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb"
Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.056279 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p25t9\" (UniqueName: \"kubernetes.io/projected/ef6ab592-8f4f-4c5c-acad-a3420b519edc-kube-api-access-p25t9\") pod \"dnsmasq-dns-5ccc8479f9-h8cgb\" (UID: \"ef6ab592-8f4f-4c5c-acad-a3420b519edc\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb"
Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.056384 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef6ab592-8f4f-4c5c-acad-a3420b519edc-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-h8cgb\" (UID: \"ef6ab592-8f4f-4c5c-acad-a3420b519edc\") "
pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb" Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.158595 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef6ab592-8f4f-4c5c-acad-a3420b519edc-config\") pod \"dnsmasq-dns-5ccc8479f9-h8cgb\" (UID: \"ef6ab592-8f4f-4c5c-acad-a3420b519edc\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb" Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.158681 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p25t9\" (UniqueName: \"kubernetes.io/projected/ef6ab592-8f4f-4c5c-acad-a3420b519edc-kube-api-access-p25t9\") pod \"dnsmasq-dns-5ccc8479f9-h8cgb\" (UID: \"ef6ab592-8f4f-4c5c-acad-a3420b519edc\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb" Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.158765 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef6ab592-8f4f-4c5c-acad-a3420b519edc-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-h8cgb\" (UID: \"ef6ab592-8f4f-4c5c-acad-a3420b519edc\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb" Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.159795 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef6ab592-8f4f-4c5c-acad-a3420b519edc-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-h8cgb\" (UID: \"ef6ab592-8f4f-4c5c-acad-a3420b519edc\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb" Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.160117 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef6ab592-8f4f-4c5c-acad-a3420b519edc-config\") pod \"dnsmasq-dns-5ccc8479f9-h8cgb\" (UID: \"ef6ab592-8f4f-4c5c-acad-a3420b519edc\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb" Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.190739 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p25t9\" (UniqueName: \"kubernetes.io/projected/ef6ab592-8f4f-4c5c-acad-a3420b519edc-kube-api-access-p25t9\") pod \"dnsmasq-dns-5ccc8479f9-h8cgb\" (UID: \"ef6ab592-8f4f-4c5c-acad-a3420b519edc\") " pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb" Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.272390 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fzt86"] Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.299555 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-97qpq"] Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.302055 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-97qpq" Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.322159 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-97qpq"] Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.341783 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb" Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.361404 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c88febfd-c495-4b2e-a4fb-fca8f447ef9c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-97qpq\" (UID: \"c88febfd-c495-4b2e-a4fb-fca8f447ef9c\") " pod="openstack/dnsmasq-dns-57d769cc4f-97qpq" Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.361514 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x9sx\" (UniqueName: \"kubernetes.io/projected/c88febfd-c495-4b2e-a4fb-fca8f447ef9c-kube-api-access-5x9sx\") pod \"dnsmasq-dns-57d769cc4f-97qpq\" (UID: \"c88febfd-c495-4b2e-a4fb-fca8f447ef9c\") " pod="openstack/dnsmasq-dns-57d769cc4f-97qpq" Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.361588 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c88febfd-c495-4b2e-a4fb-fca8f447ef9c-config\") pod \"dnsmasq-dns-57d769cc4f-97qpq\" (UID: \"c88febfd-c495-4b2e-a4fb-fca8f447ef9c\") " pod="openstack/dnsmasq-dns-57d769cc4f-97qpq" Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.468902 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x9sx\" (UniqueName: \"kubernetes.io/projected/c88febfd-c495-4b2e-a4fb-fca8f447ef9c-kube-api-access-5x9sx\") pod \"dnsmasq-dns-57d769cc4f-97qpq\" (UID: \"c88febfd-c495-4b2e-a4fb-fca8f447ef9c\") " pod="openstack/dnsmasq-dns-57d769cc4f-97qpq" Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.469443 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c88febfd-c495-4b2e-a4fb-fca8f447ef9c-config\") pod \"dnsmasq-dns-57d769cc4f-97qpq\" (UID: 
\"c88febfd-c495-4b2e-a4fb-fca8f447ef9c\") " pod="openstack/dnsmasq-dns-57d769cc4f-97qpq" Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.469596 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c88febfd-c495-4b2e-a4fb-fca8f447ef9c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-97qpq\" (UID: \"c88febfd-c495-4b2e-a4fb-fca8f447ef9c\") " pod="openstack/dnsmasq-dns-57d769cc4f-97qpq" Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.471206 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c88febfd-c495-4b2e-a4fb-fca8f447ef9c-config\") pod \"dnsmasq-dns-57d769cc4f-97qpq\" (UID: \"c88febfd-c495-4b2e-a4fb-fca8f447ef9c\") " pod="openstack/dnsmasq-dns-57d769cc4f-97qpq" Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.471860 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c88febfd-c495-4b2e-a4fb-fca8f447ef9c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-97qpq\" (UID: \"c88febfd-c495-4b2e-a4fb-fca8f447ef9c\") " pod="openstack/dnsmasq-dns-57d769cc4f-97qpq" Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.527697 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x9sx\" (UniqueName: \"kubernetes.io/projected/c88febfd-c495-4b2e-a4fb-fca8f447ef9c-kube-api-access-5x9sx\") pod \"dnsmasq-dns-57d769cc4f-97qpq\" (UID: \"c88febfd-c495-4b2e-a4fb-fca8f447ef9c\") " pod="openstack/dnsmasq-dns-57d769cc4f-97qpq" Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.632698 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-97qpq" Mar 13 09:29:59 crc kubenswrapper[4841]: I0313 09:29:59.786921 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-h8cgb"] Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.117257 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-97qpq"] Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.128070 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556570-496sl"] Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.129051 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556570-496sl" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.140043 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.142383 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556570-cc6b9"] Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.143255 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556570-cc6b9" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.143622 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.146694 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.146947 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.147151 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556570-496sl"] Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.147282 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.152974 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556570-cc6b9"] Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.192337 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.194084 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.200817 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-s4lsp" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.201011 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.201143 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.201345 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.201557 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.201683 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.202333 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.206471 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.220153 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89099fdc-154e-4290-9dbf-31dad846ead6-config-volume\") pod \"collect-profiles-29556570-496sl\" (UID: \"89099fdc-154e-4290-9dbf-31dad846ead6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556570-496sl" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.322767 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.323092 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jksrs\" (UniqueName: \"kubernetes.io/projected/ea6882c8-841d-4ca7-90a9-3d16c4303a58-kube-api-access-jksrs\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.323133 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ea6882c8-841d-4ca7-90a9-3d16c4303a58-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.323155 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpglz\" (UniqueName: \"kubernetes.io/projected/89099fdc-154e-4290-9dbf-31dad846ead6-kube-api-access-qpglz\") pod \"collect-profiles-29556570-496sl\" (UID: \"89099fdc-154e-4290-9dbf-31dad846ead6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556570-496sl" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.323175 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89099fdc-154e-4290-9dbf-31dad846ead6-secret-volume\") pod \"collect-profiles-29556570-496sl\" (UID: \"89099fdc-154e-4290-9dbf-31dad846ead6\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29556570-496sl" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.323204 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ea6882c8-841d-4ca7-90a9-3d16c4303a58-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.323231 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks68g\" (UniqueName: \"kubernetes.io/projected/72dabbac-c073-43b6-a4fc-7c49b98138c3-kube-api-access-ks68g\") pod \"auto-csr-approver-29556570-cc6b9\" (UID: \"72dabbac-c073-43b6-a4fc-7c49b98138c3\") " pod="openshift-infra/auto-csr-approver-29556570-cc6b9" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.323280 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89099fdc-154e-4290-9dbf-31dad846ead6-config-volume\") pod \"collect-profiles-29556570-496sl\" (UID: \"89099fdc-154e-4290-9dbf-31dad846ead6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556570-496sl" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.323297 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.323311 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ea6882c8-841d-4ca7-90a9-3d16c4303a58-plugins-conf\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.323329 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ea6882c8-841d-4ca7-90a9-3d16c4303a58-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.323347 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea6882c8-841d-4ca7-90a9-3d16c4303a58-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.323360 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.323394 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.323413 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.324590 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89099fdc-154e-4290-9dbf-31dad846ead6-config-volume\") pod \"collect-profiles-29556570-496sl\" (UID: \"89099fdc-154e-4290-9dbf-31dad846ead6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556570-496sl" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.414299 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.415704 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.418108 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.418240 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.418323 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.418453 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.418573 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.418646 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-l6rdh" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.419402 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 
13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.424293 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.424336 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.424356 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.424387 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jksrs\" (UniqueName: \"kubernetes.io/projected/ea6882c8-841d-4ca7-90a9-3d16c4303a58-kube-api-access-jksrs\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.424406 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ea6882c8-841d-4ca7-90a9-3d16c4303a58-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.424429 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpglz\" (UniqueName: \"kubernetes.io/projected/89099fdc-154e-4290-9dbf-31dad846ead6-kube-api-access-qpglz\") pod \"collect-profiles-29556570-496sl\" (UID: \"89099fdc-154e-4290-9dbf-31dad846ead6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556570-496sl" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.424446 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89099fdc-154e-4290-9dbf-31dad846ead6-secret-volume\") pod \"collect-profiles-29556570-496sl\" (UID: \"89099fdc-154e-4290-9dbf-31dad846ead6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556570-496sl" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.424474 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ea6882c8-841d-4ca7-90a9-3d16c4303a58-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.424498 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks68g\" (UniqueName: \"kubernetes.io/projected/72dabbac-c073-43b6-a4fc-7c49b98138c3-kube-api-access-ks68g\") pod \"auto-csr-approver-29556570-cc6b9\" (UID: \"72dabbac-c073-43b6-a4fc-7c49b98138c3\") " pod="openshift-infra/auto-csr-approver-29556570-cc6b9" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.424525 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.424543 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ea6882c8-841d-4ca7-90a9-3d16c4303a58-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.424555 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ea6882c8-841d-4ca7-90a9-3d16c4303a58-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.424571 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea6882c8-841d-4ca7-90a9-3d16c4303a58-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.424585 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.425700 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.425945 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.426657 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.429611 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.429919 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.430375 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ea6882c8-841d-4ca7-90a9-3d16c4303a58-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.431108 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea6882c8-841d-4ca7-90a9-3d16c4303a58-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.431140 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ea6882c8-841d-4ca7-90a9-3d16c4303a58-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.431454 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.431825 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89099fdc-154e-4290-9dbf-31dad846ead6-secret-volume\") pod \"collect-profiles-29556570-496sl\" (UID: \"89099fdc-154e-4290-9dbf-31dad846ead6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556570-496sl" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.437880 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ea6882c8-841d-4ca7-90a9-3d16c4303a58-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.444528 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ea6882c8-841d-4ca7-90a9-3d16c4303a58-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.452707 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks68g\" (UniqueName: 
\"kubernetes.io/projected/72dabbac-c073-43b6-a4fc-7c49b98138c3-kube-api-access-ks68g\") pod \"auto-csr-approver-29556570-cc6b9\" (UID: \"72dabbac-c073-43b6-a4fc-7c49b98138c3\") " pod="openshift-infra/auto-csr-approver-29556570-cc6b9" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.456140 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpglz\" (UniqueName: \"kubernetes.io/projected/89099fdc-154e-4290-9dbf-31dad846ead6-kube-api-access-qpglz\") pod \"collect-profiles-29556570-496sl\" (UID: \"89099fdc-154e-4290-9dbf-31dad846ead6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556570-496sl" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.461221 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jksrs\" (UniqueName: \"kubernetes.io/projected/ea6882c8-841d-4ca7-90a9-3d16c4303a58-kube-api-access-jksrs\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.462301 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.463446 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556570-496sl" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.481087 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556570-cc6b9" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.525723 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.525790 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f270332-4a01-403b-8c06-0f8c0bff6527-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.525836 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26lqr\" (UniqueName: \"kubernetes.io/projected/6f270332-4a01-403b-8c06-0f8c0bff6527-kube-api-access-26lqr\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.525865 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.525888 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f270332-4a01-403b-8c06-0f8c0bff6527-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 
13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.525920 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f270332-4a01-403b-8c06-0f8c0bff6527-config-data\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.525955 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f270332-4a01-403b-8c06-0f8c0bff6527-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.525981 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.526033 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.526054 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.526074 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f270332-4a01-403b-8c06-0f8c0bff6527-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.561536 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.631525 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f270332-4a01-403b-8c06-0f8c0bff6527-config-data\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.631637 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f270332-4a01-403b-8c06-0f8c0bff6527-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.631693 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.631809 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc 
kubenswrapper[4841]: I0313 09:30:00.631857 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.631893 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f270332-4a01-403b-8c06-0f8c0bff6527-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.632011 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.632036 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f270332-4a01-403b-8c06-0f8c0bff6527-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.632081 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26lqr\" (UniqueName: \"kubernetes.io/projected/6f270332-4a01-403b-8c06-0f8c0bff6527-kube-api-access-26lqr\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.632110 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.632133 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f270332-4a01-403b-8c06-0f8c0bff6527-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.633167 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.633467 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.634051 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.634044 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f270332-4a01-403b-8c06-0f8c0bff6527-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " 
pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.634336 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f270332-4a01-403b-8c06-0f8c0bff6527-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.634355 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f270332-4a01-403b-8c06-0f8c0bff6527-config-data\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.641153 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.661111 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f270332-4a01-403b-8c06-0f8c0bff6527-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.661410 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.662372 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/6f270332-4a01-403b-8c06-0f8c0bff6527-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.664181 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26lqr\" (UniqueName: \"kubernetes.io/projected/6f270332-4a01-403b-8c06-0f8c0bff6527-kube-api-access-26lqr\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.691147 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") " pod="openstack/rabbitmq-server-0" Mar 13 09:30:00 crc kubenswrapper[4841]: I0313 09:30:00.821467 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.742129 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.744638 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.749423 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.750135 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.750349 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.750960 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-2xtr4" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.753323 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.762752 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.855155 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/125cd366-c483-4efa-a55f-85b888bf6266-config-data-generated\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.855206 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/125cd366-c483-4efa-a55f-85b888bf6266-kolla-config\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.855412 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/125cd366-c483-4efa-a55f-85b888bf6266-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.855472 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlwmn\" (UniqueName: \"kubernetes.io/projected/125cd366-c483-4efa-a55f-85b888bf6266-kube-api-access-vlwmn\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.855493 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/125cd366-c483-4efa-a55f-85b888bf6266-operator-scripts\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.855511 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125cd366-c483-4efa-a55f-85b888bf6266-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.855678 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/125cd366-c483-4efa-a55f-85b888bf6266-config-data-default\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.855729 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.957046 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/125cd366-c483-4efa-a55f-85b888bf6266-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.957126 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlwmn\" (UniqueName: \"kubernetes.io/projected/125cd366-c483-4efa-a55f-85b888bf6266-kube-api-access-vlwmn\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.957143 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/125cd366-c483-4efa-a55f-85b888bf6266-operator-scripts\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.957157 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125cd366-c483-4efa-a55f-85b888bf6266-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.957189 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/125cd366-c483-4efa-a55f-85b888bf6266-config-data-default\") pod \"openstack-galera-0\" 
(UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.957207 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.957231 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/125cd366-c483-4efa-a55f-85b888bf6266-config-data-generated\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.957245 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/125cd366-c483-4efa-a55f-85b888bf6266-kolla-config\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.957942 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/125cd366-c483-4efa-a55f-85b888bf6266-kolla-config\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.958646 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.958983 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/125cd366-c483-4efa-a55f-85b888bf6266-config-data-generated\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.959550 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/125cd366-c483-4efa-a55f-85b888bf6266-config-data-default\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.959898 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/125cd366-c483-4efa-a55f-85b888bf6266-operator-scripts\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.962739 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/125cd366-c483-4efa-a55f-85b888bf6266-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.975667 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125cd366-c483-4efa-a55f-85b888bf6266-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.979699 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlwmn\" (UniqueName: 
\"kubernetes.io/projected/125cd366-c483-4efa-a55f-85b888bf6266-kube-api-access-vlwmn\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:01 crc kubenswrapper[4841]: I0313 09:30:01.987472 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"125cd366-c483-4efa-a55f-85b888bf6266\") " pod="openstack/openstack-galera-0" Mar 13 09:30:02 crc kubenswrapper[4841]: I0313 09:30:02.070909 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.149223 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.150580 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.157817 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-g4c92" Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.157941 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.158032 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.159708 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.163360 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.178070 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/77aa8bf5-4386-4d85-8cca-75c90d5b2593-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0" Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.178173 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77aa8bf5-4386-4d85-8cca-75c90d5b2593-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0" Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.178211 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhnxl\" (UniqueName: \"kubernetes.io/projected/77aa8bf5-4386-4d85-8cca-75c90d5b2593-kube-api-access-hhnxl\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0" Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.178246 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/77aa8bf5-4386-4d85-8cca-75c90d5b2593-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0" Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.178408 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77aa8bf5-4386-4d85-8cca-75c90d5b2593-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0" Mar 13 09:30:03 crc 
kubenswrapper[4841]: I0313 09:30:03.178439 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/77aa8bf5-4386-4d85-8cca-75c90d5b2593-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.178488 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/77aa8bf5-4386-4d85-8cca-75c90d5b2593-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.178550 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.279806 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77aa8bf5-4386-4d85-8cca-75c90d5b2593-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.279863 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhnxl\" (UniqueName: \"kubernetes.io/projected/77aa8bf5-4386-4d85-8cca-75c90d5b2593-kube-api-access-hhnxl\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.279883 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/77aa8bf5-4386-4d85-8cca-75c90d5b2593-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.279958 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/77aa8bf5-4386-4d85-8cca-75c90d5b2593-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.279974 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77aa8bf5-4386-4d85-8cca-75c90d5b2593-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.280015 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/77aa8bf5-4386-4d85-8cca-75c90d5b2593-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.280051 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.280098 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/77aa8bf5-4386-4d85-8cca-75c90d5b2593-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.280972 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/77aa8bf5-4386-4d85-8cca-75c90d5b2593-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.282000 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77aa8bf5-4386-4d85-8cca-75c90d5b2593-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.284195 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0"
Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.286231 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/77aa8bf5-4386-4d85-8cca-75c90d5b2593-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.286863 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/77aa8bf5-4386-4d85-8cca-75c90d5b2593-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.289485 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77aa8bf5-4386-4d85-8cca-75c90d5b2593-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.308126 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/77aa8bf5-4386-4d85-8cca-75c90d5b2593-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.308452 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.329500 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhnxl\" (UniqueName: \"kubernetes.io/projected/77aa8bf5-4386-4d85-8cca-75c90d5b2593-kube-api-access-hhnxl\") pod \"openstack-cell1-galera-0\" (UID: \"77aa8bf5-4386-4d85-8cca-75c90d5b2593\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 09:30:03 crc kubenswrapper[4841]: I0313 09:30:03.478706 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 13 09:30:04 crc kubenswrapper[4841]: I0313 09:30:04.818606 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Mar 13 09:30:04 crc kubenswrapper[4841]: I0313 09:30:04.819636 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 13 09:30:04 crc kubenswrapper[4841]: W0313 09:30:04.834214 4841 reflector.go:561] object-"openstack"/"memcached-memcached-dockercfg-zwj47": failed to list *v1.Secret: secrets "memcached-memcached-dockercfg-zwj47" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object
Mar 13 09:30:04 crc kubenswrapper[4841]: E0313 09:30:04.834287 4841 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"memcached-memcached-dockercfg-zwj47\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"memcached-memcached-dockercfg-zwj47\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 13 09:30:04 crc kubenswrapper[4841]: W0313 09:30:04.834839 4841 reflector.go:561] object-"openstack"/"cert-memcached-svc": failed to list *v1.Secret: secrets "cert-memcached-svc" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object
Mar 13 09:30:04 crc kubenswrapper[4841]: E0313 09:30:04.834867 4841 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"cert-memcached-svc\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cert-memcached-svc\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 13 09:30:04 crc kubenswrapper[4841]: W0313 09:30:04.835024 4841 reflector.go:561] object-"openstack"/"memcached-config-data": failed to list *v1.ConfigMap: configmaps "memcached-config-data" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object
Mar 13 09:30:04 crc kubenswrapper[4841]: E0313 09:30:04.835051 4841 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"memcached-config-data\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"memcached-config-data\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 13 09:30:04 crc kubenswrapper[4841]: I0313 09:30:04.850143 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 13 09:30:04 crc kubenswrapper[4841]: I0313 09:30:04.909404 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/831e87d6-8c27-4e98-8b3e-e6be93a93e51-config-data\") pod \"memcached-0\" (UID: \"831e87d6-8c27-4e98-8b3e-e6be93a93e51\") " pod="openstack/memcached-0"
Mar 13 09:30:04 crc kubenswrapper[4841]: I0313 09:30:04.909483 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64pzj\" (UniqueName: \"kubernetes.io/projected/831e87d6-8c27-4e98-8b3e-e6be93a93e51-kube-api-access-64pzj\") pod \"memcached-0\" (UID: \"831e87d6-8c27-4e98-8b3e-e6be93a93e51\") " pod="openstack/memcached-0"
Mar 13 09:30:04 crc kubenswrapper[4841]: I0313 09:30:04.909498 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831e87d6-8c27-4e98-8b3e-e6be93a93e51-combined-ca-bundle\") pod \"memcached-0\" (UID: \"831e87d6-8c27-4e98-8b3e-e6be93a93e51\") " pod="openstack/memcached-0"
Mar 13 09:30:04 crc kubenswrapper[4841]: I0313 09:30:04.909528 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/831e87d6-8c27-4e98-8b3e-e6be93a93e51-memcached-tls-certs\") pod \"memcached-0\" (UID: \"831e87d6-8c27-4e98-8b3e-e6be93a93e51\") " pod="openstack/memcached-0"
Mar 13 09:30:04 crc kubenswrapper[4841]: I0313 09:30:04.909547 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/831e87d6-8c27-4e98-8b3e-e6be93a93e51-kolla-config\") pod \"memcached-0\" (UID: \"831e87d6-8c27-4e98-8b3e-e6be93a93e51\") " pod="openstack/memcached-0"
Mar 13 09:30:05 crc kubenswrapper[4841]: I0313 09:30:05.010700 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/831e87d6-8c27-4e98-8b3e-e6be93a93e51-config-data\") pod \"memcached-0\" (UID: \"831e87d6-8c27-4e98-8b3e-e6be93a93e51\") " pod="openstack/memcached-0"
Mar 13 09:30:05 crc kubenswrapper[4841]: I0313 09:30:05.010774 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64pzj\" (UniqueName: \"kubernetes.io/projected/831e87d6-8c27-4e98-8b3e-e6be93a93e51-kube-api-access-64pzj\") pod \"memcached-0\" (UID: \"831e87d6-8c27-4e98-8b3e-e6be93a93e51\") " pod="openstack/memcached-0"
Mar 13 09:30:05 crc kubenswrapper[4841]: I0313 09:30:05.010798 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831e87d6-8c27-4e98-8b3e-e6be93a93e51-combined-ca-bundle\") pod \"memcached-0\" (UID: \"831e87d6-8c27-4e98-8b3e-e6be93a93e51\") " pod="openstack/memcached-0"
Mar 13 09:30:05 crc kubenswrapper[4841]: I0313 09:30:05.010842 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/831e87d6-8c27-4e98-8b3e-e6be93a93e51-memcached-tls-certs\") pod \"memcached-0\" (UID: \"831e87d6-8c27-4e98-8b3e-e6be93a93e51\") " pod="openstack/memcached-0"
Mar 13 09:30:05 crc kubenswrapper[4841]: I0313 09:30:05.010872 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/831e87d6-8c27-4e98-8b3e-e6be93a93e51-kolla-config\") pod \"memcached-0\" (UID: \"831e87d6-8c27-4e98-8b3e-e6be93a93e51\") " pod="openstack/memcached-0"
Mar 13 09:30:05 crc kubenswrapper[4841]: I0313 09:30:05.014711 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831e87d6-8c27-4e98-8b3e-e6be93a93e51-combined-ca-bundle\") pod \"memcached-0\" (UID: \"831e87d6-8c27-4e98-8b3e-e6be93a93e51\") " pod="openstack/memcached-0"
Mar 13 09:30:05 crc kubenswrapper[4841]: W0313 09:30:05.019887 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef6ab592_8f4f_4c5c_acad_a3420b519edc.slice/crio-23a1b499f6eb29fc9ff33fb37b5b2fcd6ab2902542c99300e0a54f4a47de1b42 WatchSource:0}: Error finding container 23a1b499f6eb29fc9ff33fb37b5b2fcd6ab2902542c99300e0a54f4a47de1b42: Status 404 returned error can't find the container with id 23a1b499f6eb29fc9ff33fb37b5b2fcd6ab2902542c99300e0a54f4a47de1b42
Mar 13 09:30:05 crc kubenswrapper[4841]: I0313 09:30:05.032859 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64pzj\" (UniqueName: \"kubernetes.io/projected/831e87d6-8c27-4e98-8b3e-e6be93a93e51-kube-api-access-64pzj\") pod \"memcached-0\" (UID: \"831e87d6-8c27-4e98-8b3e-e6be93a93e51\") " pod="openstack/memcached-0"
Mar 13 09:30:05 crc kubenswrapper[4841]: I0313 09:30:05.984111 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-97qpq" event={"ID":"c88febfd-c495-4b2e-a4fb-fca8f447ef9c","Type":"ContainerStarted","Data":"dfc43adb517d40b32e8b29b87fa34e312b580eb12648859f929e4b27de778665"}
Mar 13 09:30:05 crc kubenswrapper[4841]: I0313 09:30:05.984973 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb" event={"ID":"ef6ab592-8f4f-4c5c-acad-a3420b519edc","Type":"ContainerStarted","Data":"23a1b499f6eb29fc9ff33fb37b5b2fcd6ab2902542c99300e0a54f4a47de1b42"}
Mar 13 09:30:06 crc kubenswrapper[4841]: E0313 09:30:06.011183 4841 configmap.go:193] Couldn't get configMap openstack/memcached-config-data: failed to sync configmap cache: timed out waiting for the condition
Mar 13 09:30:06 crc kubenswrapper[4841]: E0313 09:30:06.011237 4841 configmap.go:193] Couldn't get configMap openstack/memcached-config-data: failed to sync configmap cache: timed out waiting for the condition
Mar 13 09:30:06 crc kubenswrapper[4841]: E0313 09:30:06.011254 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/831e87d6-8c27-4e98-8b3e-e6be93a93e51-config-data podName:831e87d6-8c27-4e98-8b3e-e6be93a93e51 nodeName:}" failed. No retries permitted until 2026-03-13 09:30:06.511235762 +0000 UTC m=+1089.241135953 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/831e87d6-8c27-4e98-8b3e-e6be93a93e51-config-data") pod "memcached-0" (UID: "831e87d6-8c27-4e98-8b3e-e6be93a93e51") : failed to sync configmap cache: timed out waiting for the condition
Mar 13 09:30:06 crc kubenswrapper[4841]: E0313 09:30:06.011361 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/831e87d6-8c27-4e98-8b3e-e6be93a93e51-kolla-config podName:831e87d6-8c27-4e98-8b3e-e6be93a93e51 nodeName:}" failed. No retries permitted until 2026-03-13 09:30:06.511338735 +0000 UTC m=+1089.241238926 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kolla-config" (UniqueName: "kubernetes.io/configmap/831e87d6-8c27-4e98-8b3e-e6be93a93e51-kolla-config") pod "memcached-0" (UID: "831e87d6-8c27-4e98-8b3e-e6be93a93e51") : failed to sync configmap cache: timed out waiting for the condition
Mar 13 09:30:06 crc kubenswrapper[4841]: E0313 09:30:06.014307 4841 secret.go:188] Couldn't get secret openstack/cert-memcached-svc: failed to sync secret cache: timed out waiting for the condition
Mar 13 09:30:06 crc kubenswrapper[4841]: E0313 09:30:06.014372 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/831e87d6-8c27-4e98-8b3e-e6be93a93e51-memcached-tls-certs podName:831e87d6-8c27-4e98-8b3e-e6be93a93e51 nodeName:}" failed. No retries permitted until 2026-03-13 09:30:06.514359838 +0000 UTC m=+1089.244260029 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memcached-tls-certs" (UniqueName: "kubernetes.io/secret/831e87d6-8c27-4e98-8b3e-e6be93a93e51-memcached-tls-certs") pod "memcached-0" (UID: "831e87d6-8c27-4e98-8b3e-e6be93a93e51") : failed to sync secret cache: timed out waiting for the condition
Mar 13 09:30:06 crc kubenswrapper[4841]: I0313 09:30:06.065653 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Mar 13 09:30:06 crc kubenswrapper[4841]: I0313 09:30:06.153014 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Mar 13 09:30:06 crc kubenswrapper[4841]: I0313 09:30:06.182620 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 09:30:06 crc kubenswrapper[4841]: I0313 09:30:06.183585 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 09:30:06 crc kubenswrapper[4841]: I0313 09:30:06.186151 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-7z44d"
Mar 13 09:30:06 crc kubenswrapper[4841]: I0313 09:30:06.208694 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 09:30:06 crc kubenswrapper[4841]: I0313 09:30:06.233763 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmdb8\" (UniqueName: \"kubernetes.io/projected/a7d41010-f9ab-45b0-9d21-e05037f26651-kube-api-access-xmdb8\") pod \"kube-state-metrics-0\" (UID: \"a7d41010-f9ab-45b0-9d21-e05037f26651\") " pod="openstack/kube-state-metrics-0"
Mar 13 09:30:06 crc kubenswrapper[4841]: I0313 09:30:06.314140 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-zwj47"
Mar 13 09:30:06 crc kubenswrapper[4841]: I0313 09:30:06.335170 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmdb8\" (UniqueName: \"kubernetes.io/projected/a7d41010-f9ab-45b0-9d21-e05037f26651-kube-api-access-xmdb8\") pod \"kube-state-metrics-0\" (UID: \"a7d41010-f9ab-45b0-9d21-e05037f26651\") " pod="openstack/kube-state-metrics-0"
Mar 13 09:30:06 crc kubenswrapper[4841]: I0313 09:30:06.358463 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmdb8\" (UniqueName: \"kubernetes.io/projected/a7d41010-f9ab-45b0-9d21-e05037f26651-kube-api-access-xmdb8\") pod \"kube-state-metrics-0\" (UID: \"a7d41010-f9ab-45b0-9d21-e05037f26651\") " pod="openstack/kube-state-metrics-0"
Mar 13 09:30:06 crc kubenswrapper[4841]: I0313 09:30:06.500168 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 09:30:06 crc kubenswrapper[4841]: I0313 09:30:06.539082 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/831e87d6-8c27-4e98-8b3e-e6be93a93e51-memcached-tls-certs\") pod \"memcached-0\" (UID: \"831e87d6-8c27-4e98-8b3e-e6be93a93e51\") " pod="openstack/memcached-0"
Mar 13 09:30:06 crc kubenswrapper[4841]: I0313 09:30:06.539146 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/831e87d6-8c27-4e98-8b3e-e6be93a93e51-kolla-config\") pod \"memcached-0\" (UID: \"831e87d6-8c27-4e98-8b3e-e6be93a93e51\") " pod="openstack/memcached-0"
Mar 13 09:30:06 crc kubenswrapper[4841]: I0313 09:30:06.539230 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/831e87d6-8c27-4e98-8b3e-e6be93a93e51-config-data\") pod \"memcached-0\" (UID: \"831e87d6-8c27-4e98-8b3e-e6be93a93e51\") " pod="openstack/memcached-0"
Mar 13 09:30:06 crc kubenswrapper[4841]: I0313 09:30:06.540175 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/831e87d6-8c27-4e98-8b3e-e6be93a93e51-config-data\") pod \"memcached-0\" (UID: \"831e87d6-8c27-4e98-8b3e-e6be93a93e51\") " pod="openstack/memcached-0"
Mar 13 09:30:06 crc kubenswrapper[4841]: I0313 09:30:06.540330 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/831e87d6-8c27-4e98-8b3e-e6be93a93e51-kolla-config\") pod \"memcached-0\" (UID: \"831e87d6-8c27-4e98-8b3e-e6be93a93e51\") " pod="openstack/memcached-0"
Mar 13 09:30:06 crc kubenswrapper[4841]: I0313 09:30:06.543428 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/831e87d6-8c27-4e98-8b3e-e6be93a93e51-memcached-tls-certs\") pod \"memcached-0\" (UID: \"831e87d6-8c27-4e98-8b3e-e6be93a93e51\") " pod="openstack/memcached-0"
Mar 13 09:30:06 crc kubenswrapper[4841]: I0313 09:30:06.638389 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.378171 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.622645 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-bqlfl"]
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.630141 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.639474 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.639819 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-7kjtx"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.654498 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bqlfl"]
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.664228 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-b2w62"]
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.666878 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-b2w62"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.670218 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b2bf634d-aa4f-4773-91ee-99616e217c82-var-run-ovn\") pod \"ovn-controller-bqlfl\" (UID: \"b2bf634d-aa4f-4773-91ee-99616e217c82\") " pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.670277 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b2bf634d-aa4f-4773-91ee-99616e217c82-var-run\") pod \"ovn-controller-bqlfl\" (UID: \"b2bf634d-aa4f-4773-91ee-99616e217c82\") " pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.670302 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2bf634d-aa4f-4773-91ee-99616e217c82-ovn-controller-tls-certs\") pod \"ovn-controller-bqlfl\" (UID: \"b2bf634d-aa4f-4773-91ee-99616e217c82\") " pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.670359 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2bf634d-aa4f-4773-91ee-99616e217c82-scripts\") pod \"ovn-controller-bqlfl\" (UID: \"b2bf634d-aa4f-4773-91ee-99616e217c82\") " pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.670388 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b2bf634d-aa4f-4773-91ee-99616e217c82-var-log-ovn\") pod \"ovn-controller-bqlfl\" (UID: \"b2bf634d-aa4f-4773-91ee-99616e217c82\") " pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.670408 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2bf634d-aa4f-4773-91ee-99616e217c82-combined-ca-bundle\") pod \"ovn-controller-bqlfl\" (UID: \"b2bf634d-aa4f-4773-91ee-99616e217c82\") " pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.670466 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmdt2\" (UniqueName: \"kubernetes.io/projected/b2bf634d-aa4f-4773-91ee-99616e217c82-kube-api-access-qmdt2\") pod \"ovn-controller-bqlfl\" (UID: \"b2bf634d-aa4f-4773-91ee-99616e217c82\") " pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.672631 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.677816 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-b2w62"]
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.772982 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2bf634d-aa4f-4773-91ee-99616e217c82-scripts\") pod \"ovn-controller-bqlfl\" (UID: \"b2bf634d-aa4f-4773-91ee-99616e217c82\") " pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.773035 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/84f12283-3c15-408e-a1a2-691c257434ca-var-log\") pod \"ovn-controller-ovs-b2w62\" (UID: \"84f12283-3c15-408e-a1a2-691c257434ca\") " pod="openstack/ovn-controller-ovs-b2w62"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.773055 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b2bf634d-aa4f-4773-91ee-99616e217c82-var-log-ovn\") pod \"ovn-controller-bqlfl\" (UID: \"b2bf634d-aa4f-4773-91ee-99616e217c82\") " pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.773072 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2bf634d-aa4f-4773-91ee-99616e217c82-combined-ca-bundle\") pod \"ovn-controller-bqlfl\" (UID: \"b2bf634d-aa4f-4773-91ee-99616e217c82\") " pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.773088 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/84f12283-3c15-408e-a1a2-691c257434ca-var-lib\") pod \"ovn-controller-ovs-b2w62\" (UID: \"84f12283-3c15-408e-a1a2-691c257434ca\") " pod="openstack/ovn-controller-ovs-b2w62"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.773116 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42rpn\" (UniqueName: \"kubernetes.io/projected/84f12283-3c15-408e-a1a2-691c257434ca-kube-api-access-42rpn\") pod \"ovn-controller-ovs-b2w62\" (UID: \"84f12283-3c15-408e-a1a2-691c257434ca\") " pod="openstack/ovn-controller-ovs-b2w62"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.773159 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmdt2\" (UniqueName: \"kubernetes.io/projected/b2bf634d-aa4f-4773-91ee-99616e217c82-kube-api-access-qmdt2\") pod \"ovn-controller-bqlfl\" (UID: \"b2bf634d-aa4f-4773-91ee-99616e217c82\") " pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.773200 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b2bf634d-aa4f-4773-91ee-99616e217c82-var-run-ovn\") pod \"ovn-controller-bqlfl\" (UID: \"b2bf634d-aa4f-4773-91ee-99616e217c82\") " pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.773214 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84f12283-3c15-408e-a1a2-691c257434ca-scripts\") pod \"ovn-controller-ovs-b2w62\" (UID: \"84f12283-3c15-408e-a1a2-691c257434ca\") " pod="openstack/ovn-controller-ovs-b2w62"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.773232 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b2bf634d-aa4f-4773-91ee-99616e217c82-var-run\") pod \"ovn-controller-bqlfl\" (UID: \"b2bf634d-aa4f-4773-91ee-99616e217c82\") " pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.773254 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2bf634d-aa4f-4773-91ee-99616e217c82-ovn-controller-tls-certs\") pod \"ovn-controller-bqlfl\" (UID: \"b2bf634d-aa4f-4773-91ee-99616e217c82\") " pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.773287 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/84f12283-3c15-408e-a1a2-691c257434ca-etc-ovs\") pod \"ovn-controller-ovs-b2w62\" (UID: \"84f12283-3c15-408e-a1a2-691c257434ca\") " pod="openstack/ovn-controller-ovs-b2w62"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.773314 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/84f12283-3c15-408e-a1a2-691c257434ca-var-run\") pod \"ovn-controller-ovs-b2w62\" (UID: \"84f12283-3c15-408e-a1a2-691c257434ca\") " pod="openstack/ovn-controller-ovs-b2w62"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.774956 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b2bf634d-aa4f-4773-91ee-99616e217c82-var-log-ovn\") pod \"ovn-controller-bqlfl\" (UID: \"b2bf634d-aa4f-4773-91ee-99616e217c82\") " pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.775140 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b2bf634d-aa4f-4773-91ee-99616e217c82-var-run\") pod \"ovn-controller-bqlfl\" (UID: \"b2bf634d-aa4f-4773-91ee-99616e217c82\") " pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.775238 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b2bf634d-aa4f-4773-91ee-99616e217c82-var-run-ovn\") pod \"ovn-controller-bqlfl\" (UID: \"b2bf634d-aa4f-4773-91ee-99616e217c82\") " pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.775452 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2bf634d-aa4f-4773-91ee-99616e217c82-scripts\") pod \"ovn-controller-bqlfl\" (UID: \"b2bf634d-aa4f-4773-91ee-99616e217c82\") " pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.788713 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2bf634d-aa4f-4773-91ee-99616e217c82-combined-ca-bundle\") pod \"ovn-controller-bqlfl\" (UID: \"b2bf634d-aa4f-4773-91ee-99616e217c82\") " pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.790932 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmdt2\" (UniqueName: \"kubernetes.io/projected/b2bf634d-aa4f-4773-91ee-99616e217c82-kube-api-access-qmdt2\") pod \"ovn-controller-bqlfl\" (UID: \"b2bf634d-aa4f-4773-91ee-99616e217c82\") " pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.794870 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2bf634d-aa4f-4773-91ee-99616e217c82-ovn-controller-tls-certs\") pod \"ovn-controller-bqlfl\" (UID: \"b2bf634d-aa4f-4773-91ee-99616e217c82\") " pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.875096 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/84f12283-3c15-408e-a1a2-691c257434ca-etc-ovs\") pod \"ovn-controller-ovs-b2w62\" (UID: \"84f12283-3c15-408e-a1a2-691c257434ca\") " pod="openstack/ovn-controller-ovs-b2w62"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.875158 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/84f12283-3c15-408e-a1a2-691c257434ca-var-run\") pod \"ovn-controller-ovs-b2w62\" (UID: \"84f12283-3c15-408e-a1a2-691c257434ca\") " pod="openstack/ovn-controller-ovs-b2w62"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.875498 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/84f12283-3c15-408e-a1a2-691c257434ca-etc-ovs\") pod \"ovn-controller-ovs-b2w62\" (UID: \"84f12283-3c15-408e-a1a2-691c257434ca\") " pod="openstack/ovn-controller-ovs-b2w62"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.875522 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/84f12283-3c15-408e-a1a2-691c257434ca-var-log\") pod \"ovn-controller-ovs-b2w62\" (UID: \"84f12283-3c15-408e-a1a2-691c257434ca\") " pod="openstack/ovn-controller-ovs-b2w62"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.875553 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/84f12283-3c15-408e-a1a2-691c257434ca-var-lib\") pod \"ovn-controller-ovs-b2w62\" (UID: \"84f12283-3c15-408e-a1a2-691c257434ca\") " pod="openstack/ovn-controller-ovs-b2w62"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.875581 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/84f12283-3c15-408e-a1a2-691c257434ca-var-run\") pod \"ovn-controller-ovs-b2w62\" (UID: \"84f12283-3c15-408e-a1a2-691c257434ca\") " pod="openstack/ovn-controller-ovs-b2w62"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.875598 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42rpn\" (UniqueName: \"kubernetes.io/projected/84f12283-3c15-408e-a1a2-691c257434ca-kube-api-access-42rpn\") pod \"ovn-controller-ovs-b2w62\" (UID: \"84f12283-3c15-408e-a1a2-691c257434ca\") " pod="openstack/ovn-controller-ovs-b2w62"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.875708 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84f12283-3c15-408e-a1a2-691c257434ca-scripts\") pod \"ovn-controller-ovs-b2w62\" (UID: \"84f12283-3c15-408e-a1a2-691c257434ca\") " pod="openstack/ovn-controller-ovs-b2w62"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.875714 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/84f12283-3c15-408e-a1a2-691c257434ca-var-lib\") pod \"ovn-controller-ovs-b2w62\" (UID: \"84f12283-3c15-408e-a1a2-691c257434ca\") " pod="openstack/ovn-controller-ovs-b2w62"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.875979 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/84f12283-3c15-408e-a1a2-691c257434ca-var-log\") pod \"ovn-controller-ovs-b2w62\" (UID: \"84f12283-3c15-408e-a1a2-691c257434ca\") " pod="openstack/ovn-controller-ovs-b2w62"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.878021 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84f12283-3c15-408e-a1a2-691c257434ca-scripts\") pod \"ovn-controller-ovs-b2w62\" (UID: \"84f12283-3c15-408e-a1a2-691c257434ca\") " pod="openstack/ovn-controller-ovs-b2w62"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.895316 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42rpn\" (UniqueName: \"kubernetes.io/projected/84f12283-3c15-408e-a1a2-691c257434ca-kube-api-access-42rpn\") pod \"ovn-controller-ovs-b2w62\" (UID: \"84f12283-3c15-408e-a1a2-691c257434ca\") " pod="openstack/ovn-controller-ovs-b2w62"
Mar 13 09:30:08 crc kubenswrapper[4841]: I0313 09:30:08.985002 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bqlfl"
Mar 13 09:30:09 crc kubenswrapper[4841]: I0313 09:30:09.003946 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-b2w62"
Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.298317 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.302906 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.304883 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.305329 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.305549 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.305733 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-786fv" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.305956 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.307661 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.405467 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f3149e4-fc32-4773-ac07-785c8d11888e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.405529 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f3149e4-fc32-4773-ac07-785c8d11888e-config\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.405588 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/5f3149e4-fc32-4773-ac07-785c8d11888e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.405613 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlc6r\" (UniqueName: \"kubernetes.io/projected/5f3149e4-fc32-4773-ac07-785c8d11888e-kube-api-access-jlc6r\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.405658 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.405687 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f3149e4-fc32-4773-ac07-785c8d11888e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.408767 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f3149e4-fc32-4773-ac07-785c8d11888e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.408832 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5f3149e4-fc32-4773-ac07-785c8d11888e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.510192 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f3149e4-fc32-4773-ac07-785c8d11888e-config\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.510307 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5f3149e4-fc32-4773-ac07-785c8d11888e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.510339 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlc6r\" (UniqueName: \"kubernetes.io/projected/5f3149e4-fc32-4773-ac07-785c8d11888e-kube-api-access-jlc6r\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.510390 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.510424 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f3149e4-fc32-4773-ac07-785c8d11888e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " 
pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.510472 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f3149e4-fc32-4773-ac07-785c8d11888e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.510509 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f3149e4-fc32-4773-ac07-785c8d11888e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.510553 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f3149e4-fc32-4773-ac07-785c8d11888e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.510764 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.511219 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5f3149e4-fc32-4773-ac07-785c8d11888e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.514233 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f3149e4-fc32-4773-ac07-785c8d11888e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.514430 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f3149e4-fc32-4773-ac07-785c8d11888e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.514685 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f3149e4-fc32-4773-ac07-785c8d11888e-config\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.523675 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f3149e4-fc32-4773-ac07-785c8d11888e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.525192 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f3149e4-fc32-4773-ac07-785c8d11888e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.530013 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlc6r\" (UniqueName: \"kubernetes.io/projected/5f3149e4-fc32-4773-ac07-785c8d11888e-kube-api-access-jlc6r\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") 
" pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.531459 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"5f3149e4-fc32-4773-ac07-785c8d11888e\") " pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:10 crc kubenswrapper[4841]: I0313 09:30:10.636105 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:12 crc kubenswrapper[4841]: I0313 09:30:12.864009 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 09:30:12 crc kubenswrapper[4841]: I0313 09:30:12.865315 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:12 crc kubenswrapper[4841]: I0313 09:30:12.867137 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 13 09:30:12 crc kubenswrapper[4841]: I0313 09:30:12.867519 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 13 09:30:12 crc kubenswrapper[4841]: I0313 09:30:12.867606 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 13 09:30:12 crc kubenswrapper[4841]: I0313 09:30:12.869201 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-g2nbf" Mar 13 09:30:12 crc kubenswrapper[4841]: I0313 09:30:12.880795 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 09:30:12 crc kubenswrapper[4841]: I0313 09:30:12.981567 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/984ac552-8ac1-4cbf-ada9-10a9dc02acd9-config\") pod \"ovsdbserver-nb-0\" 
(UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:12 crc kubenswrapper[4841]: I0313 09:30:12.981652 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/984ac552-8ac1-4cbf-ada9-10a9dc02acd9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:12 crc kubenswrapper[4841]: I0313 09:30:12.981678 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/984ac552-8ac1-4cbf-ada9-10a9dc02acd9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:12 crc kubenswrapper[4841]: I0313 09:30:12.981694 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/984ac552-8ac1-4cbf-ada9-10a9dc02acd9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:12 crc kubenswrapper[4841]: I0313 09:30:12.981736 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wwtj\" (UniqueName: \"kubernetes.io/projected/984ac552-8ac1-4cbf-ada9-10a9dc02acd9-kube-api-access-5wwtj\") pod \"ovsdbserver-nb-0\" (UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:12 crc kubenswrapper[4841]: I0313 09:30:12.981758 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/984ac552-8ac1-4cbf-ada9-10a9dc02acd9-scripts\") pod \"ovsdbserver-nb-0\" (UID: 
\"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:12 crc kubenswrapper[4841]: I0313 09:30:12.981774 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:12 crc kubenswrapper[4841]: I0313 09:30:12.981800 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/984ac552-8ac1-4cbf-ada9-10a9dc02acd9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:13 crc kubenswrapper[4841]: I0313 09:30:13.042765 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"125cd366-c483-4efa-a55f-85b888bf6266","Type":"ContainerStarted","Data":"c6436ab027fee8b5da97d5f385472080b0050f69694d2ec9935a9dbef19a8071"} Mar 13 09:30:13 crc kubenswrapper[4841]: I0313 09:30:13.085045 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/984ac552-8ac1-4cbf-ada9-10a9dc02acd9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:13 crc kubenswrapper[4841]: I0313 09:30:13.085090 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/984ac552-8ac1-4cbf-ada9-10a9dc02acd9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:13 crc kubenswrapper[4841]: I0313 09:30:13.085110 4841 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/984ac552-8ac1-4cbf-ada9-10a9dc02acd9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:13 crc kubenswrapper[4841]: I0313 09:30:13.085152 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wwtj\" (UniqueName: \"kubernetes.io/projected/984ac552-8ac1-4cbf-ada9-10a9dc02acd9-kube-api-access-5wwtj\") pod \"ovsdbserver-nb-0\" (UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:13 crc kubenswrapper[4841]: I0313 09:30:13.085182 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/984ac552-8ac1-4cbf-ada9-10a9dc02acd9-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:13 crc kubenswrapper[4841]: I0313 09:30:13.085208 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:13 crc kubenswrapper[4841]: I0313 09:30:13.085248 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/984ac552-8ac1-4cbf-ada9-10a9dc02acd9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:13 crc kubenswrapper[4841]: I0313 09:30:13.085326 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/984ac552-8ac1-4cbf-ada9-10a9dc02acd9-config\") pod \"ovsdbserver-nb-0\" (UID: 
\"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:13 crc kubenswrapper[4841]: I0313 09:30:13.085584 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:13 crc kubenswrapper[4841]: I0313 09:30:13.085929 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/984ac552-8ac1-4cbf-ada9-10a9dc02acd9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:13 crc kubenswrapper[4841]: I0313 09:30:13.086451 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/984ac552-8ac1-4cbf-ada9-10a9dc02acd9-config\") pod \"ovsdbserver-nb-0\" (UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:13 crc kubenswrapper[4841]: I0313 09:30:13.086484 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/984ac552-8ac1-4cbf-ada9-10a9dc02acd9-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:13 crc kubenswrapper[4841]: I0313 09:30:13.091208 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/984ac552-8ac1-4cbf-ada9-10a9dc02acd9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:13 crc kubenswrapper[4841]: I0313 09:30:13.101902 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5wwtj\" (UniqueName: \"kubernetes.io/projected/984ac552-8ac1-4cbf-ada9-10a9dc02acd9-kube-api-access-5wwtj\") pod \"ovsdbserver-nb-0\" (UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:13 crc kubenswrapper[4841]: I0313 09:30:13.108081 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/984ac552-8ac1-4cbf-ada9-10a9dc02acd9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:13 crc kubenswrapper[4841]: I0313 09:30:13.110778 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/984ac552-8ac1-4cbf-ada9-10a9dc02acd9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:13 crc kubenswrapper[4841]: I0313 09:30:13.114030 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"984ac552-8ac1-4cbf-ada9-10a9dc02acd9\") " pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:13 crc kubenswrapper[4841]: I0313 09:30:13.189163 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:13 crc kubenswrapper[4841]: I0313 09:30:13.373889 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556570-496sl"] Mar 13 09:30:13 crc kubenswrapper[4841]: W0313 09:30:13.796839 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89099fdc_154e_4290_9dbf_31dad846ead6.slice/crio-19f96a0eac91349a71434d111133f2519bef651e993f98f455ab00c2764dd377 WatchSource:0}: Error finding container 19f96a0eac91349a71434d111133f2519bef651e993f98f455ab00c2764dd377: Status 404 returned error can't find the container with id 19f96a0eac91349a71434d111133f2519bef651e993f98f455ab00c2764dd377 Mar 13 09:30:13 crc kubenswrapper[4841]: E0313 09:30:13.819573 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 13 09:30:13 crc kubenswrapper[4841]: E0313 09:30:13.819784 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-84hrv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-bkp2w_openstack(487af2ae-3b25-48b9-a77e-fa1ed59ed8a0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 09:30:13 crc kubenswrapper[4841]: E0313 09:30:13.820954 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-bkp2w" podUID="487af2ae-3b25-48b9-a77e-fa1ed59ed8a0" Mar 13 09:30:13 crc kubenswrapper[4841]: E0313 09:30:13.826706 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 13 09:30:13 crc kubenswrapper[4841]: E0313 09:30:13.826889 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-slftm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-fzt86_openstack(375ba94d-ec2a-4e94-96fb-2f205f7115b9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 09:30:13 crc kubenswrapper[4841]: E0313 09:30:13.828213 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-fzt86" podUID="375ba94d-ec2a-4e94-96fb-2f205f7115b9" Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.071939 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556570-496sl" event={"ID":"89099fdc-154e-4290-9dbf-31dad846ead6","Type":"ContainerStarted","Data":"19f96a0eac91349a71434d111133f2519bef651e993f98f455ab00c2764dd377"} Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.274463 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556570-cc6b9"] Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.285278 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.355681 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 09:30:14 crc 
kubenswrapper[4841]: I0313 09:30:14.451258 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.569784 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bkp2w" Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.581518 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fzt86" Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.627827 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84hrv\" (UniqueName: \"kubernetes.io/projected/487af2ae-3b25-48b9-a77e-fa1ed59ed8a0-kube-api-access-84hrv\") pod \"487af2ae-3b25-48b9-a77e-fa1ed59ed8a0\" (UID: \"487af2ae-3b25-48b9-a77e-fa1ed59ed8a0\") " Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.627909 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/487af2ae-3b25-48b9-a77e-fa1ed59ed8a0-config\") pod \"487af2ae-3b25-48b9-a77e-fa1ed59ed8a0\" (UID: \"487af2ae-3b25-48b9-a77e-fa1ed59ed8a0\") " Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.627925 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/375ba94d-ec2a-4e94-96fb-2f205f7115b9-dns-svc\") pod \"375ba94d-ec2a-4e94-96fb-2f205f7115b9\" (UID: \"375ba94d-ec2a-4e94-96fb-2f205f7115b9\") " Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.627951 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/375ba94d-ec2a-4e94-96fb-2f205f7115b9-config\") pod \"375ba94d-ec2a-4e94-96fb-2f205f7115b9\" (UID: \"375ba94d-ec2a-4e94-96fb-2f205f7115b9\") " Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.628210 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-slftm\" (UniqueName: \"kubernetes.io/projected/375ba94d-ec2a-4e94-96fb-2f205f7115b9-kube-api-access-slftm\") pod \"375ba94d-ec2a-4e94-96fb-2f205f7115b9\" (UID: \"375ba94d-ec2a-4e94-96fb-2f205f7115b9\") " Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.629113 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.630023 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/375ba94d-ec2a-4e94-96fb-2f205f7115b9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "375ba94d-ec2a-4e94-96fb-2f205f7115b9" (UID: "375ba94d-ec2a-4e94-96fb-2f205f7115b9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.631413 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/375ba94d-ec2a-4e94-96fb-2f205f7115b9-config" (OuterVolumeSpecName: "config") pod "375ba94d-ec2a-4e94-96fb-2f205f7115b9" (UID: "375ba94d-ec2a-4e94-96fb-2f205f7115b9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.635217 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/375ba94d-ec2a-4e94-96fb-2f205f7115b9-kube-api-access-slftm" (OuterVolumeSpecName: "kube-api-access-slftm") pod "375ba94d-ec2a-4e94-96fb-2f205f7115b9" (UID: "375ba94d-ec2a-4e94-96fb-2f205f7115b9"). InnerVolumeSpecName "kube-api-access-slftm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.636301 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/487af2ae-3b25-48b9-a77e-fa1ed59ed8a0-config" (OuterVolumeSpecName: "config") pod "487af2ae-3b25-48b9-a77e-fa1ed59ed8a0" (UID: "487af2ae-3b25-48b9-a77e-fa1ed59ed8a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.638225 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bqlfl"] Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.639457 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/487af2ae-3b25-48b9-a77e-fa1ed59ed8a0-kube-api-access-84hrv" (OuterVolumeSpecName: "kube-api-access-84hrv") pod "487af2ae-3b25-48b9-a77e-fa1ed59ed8a0" (UID: "487af2ae-3b25-48b9-a77e-fa1ed59ed8a0"). InnerVolumeSpecName "kube-api-access-84hrv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:30:14 crc kubenswrapper[4841]: W0313 09:30:14.646920 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2bf634d_aa4f_4773_91ee_99616e217c82.slice/crio-d0334e2088054c50a930818e74150bb4e33571b298c5d768187fd47cc547738c WatchSource:0}: Error finding container d0334e2088054c50a930818e74150bb4e33571b298c5d768187fd47cc547738c: Status 404 returned error can't find the container with id d0334e2088054c50a930818e74150bb4e33571b298c5d768187fd47cc547738c Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.650547 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 09:30:14 crc kubenswrapper[4841]: W0313 09:30:14.656230 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f270332_4a01_403b_8c06_0f8c0bff6527.slice/crio-1190f6327df593c8b43d5c515c3a5002bbe6ab8ca9a6f88326c69354f0aefef9 WatchSource:0}: Error finding container 1190f6327df593c8b43d5c515c3a5002bbe6ab8ca9a6f88326c69354f0aefef9: Status 404 returned error can't find the container with id 1190f6327df593c8b43d5c515c3a5002bbe6ab8ca9a6f88326c69354f0aefef9 Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.730158 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slftm\" (UniqueName: \"kubernetes.io/projected/375ba94d-ec2a-4e94-96fb-2f205f7115b9-kube-api-access-slftm\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.730565 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84hrv\" (UniqueName: \"kubernetes.io/projected/487af2ae-3b25-48b9-a77e-fa1ed59ed8a0-kube-api-access-84hrv\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.730623 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/375ba94d-ec2a-4e94-96fb-2f205f7115b9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.730699 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/487af2ae-3b25-48b9-a77e-fa1ed59ed8a0-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.730782 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/375ba94d-ec2a-4e94-96fb-2f205f7115b9-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.827819 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 09:30:14 crc kubenswrapper[4841]: W0313 09:30:14.837760 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod984ac552_8ac1_4cbf_ada9_10a9dc02acd9.slice/crio-e9453a3f200860588d4f0a954acf0fa740fb1661b04426b9c065669e20326c9c WatchSource:0}: Error finding container e9453a3f200860588d4f0a954acf0fa740fb1661b04426b9c065669e20326c9c: Status 404 returned error can't find the container with id e9453a3f200860588d4f0a954acf0fa740fb1661b04426b9c065669e20326c9c Mar 13 09:30:14 crc kubenswrapper[4841]: I0313 09:30:14.933918 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-b2w62"] Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.080347 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a7d41010-f9ab-45b0-9d21-e05037f26651","Type":"ContainerStarted","Data":"97e0820920a2cf9844da5b1f3d09daca9fec0705dd1e7ff19cfebbe11fbb2cb0"} Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.081610 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-bkp2w" 
event={"ID":"487af2ae-3b25-48b9-a77e-fa1ed59ed8a0","Type":"ContainerDied","Data":"4ab943ed2d6ba707872d36ea47cfd86823cee254a0e8df14a92433f2a5cea1ce"} Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.081676 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bkp2w" Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.094660 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"831e87d6-8c27-4e98-8b3e-e6be93a93e51","Type":"ContainerStarted","Data":"ddabad38c754914b8a59538f3011392a280117d2b445fc3161bba14b3fb2805f"} Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.096073 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6f270332-4a01-403b-8c06-0f8c0bff6527","Type":"ContainerStarted","Data":"1190f6327df593c8b43d5c515c3a5002bbe6ab8ca9a6f88326c69354f0aefef9"} Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.107965 4841 generic.go:334] "Generic (PLEG): container finished" podID="c88febfd-c495-4b2e-a4fb-fca8f447ef9c" containerID="f84151706634bc1d83dcfde710c5f1d8e05733d891fcd2d9b9059fc582b1eca0" exitCode=0 Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.108074 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-97qpq" event={"ID":"c88febfd-c495-4b2e-a4fb-fca8f447ef9c","Type":"ContainerDied","Data":"f84151706634bc1d83dcfde710c5f1d8e05733d891fcd2d9b9059fc582b1eca0"} Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.112702 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ea6882c8-841d-4ca7-90a9-3d16c4303a58","Type":"ContainerStarted","Data":"a8bf9f4a1d7c5ce6f2b2f79889b5a705194970fffad32e026ce632065f9bc6f3"} Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.117426 4841 generic.go:334] "Generic (PLEG): container finished" podID="ef6ab592-8f4f-4c5c-acad-a3420b519edc" 
containerID="cd2d819e6a838489ab7e71b3b47261c36e918de5315d6930b5efabbf50ef6c2e" exitCode=0 Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.117532 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb" event={"ID":"ef6ab592-8f4f-4c5c-acad-a3420b519edc","Type":"ContainerDied","Data":"cd2d819e6a838489ab7e71b3b47261c36e918de5315d6930b5efabbf50ef6c2e"} Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.128934 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"984ac552-8ac1-4cbf-ada9-10a9dc02acd9","Type":"ContainerStarted","Data":"e9453a3f200860588d4f0a954acf0fa740fb1661b04426b9c065669e20326c9c"} Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.136744 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bkp2w"] Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.145015 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bkp2w"] Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.148543 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"77aa8bf5-4386-4d85-8cca-75c90d5b2593","Type":"ContainerStarted","Data":"cb3da84ad2229fa2342833ef3e52cd1cb9f69a639af6e4481bef3207ad69be5c"} Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.171424 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556570-cc6b9" event={"ID":"72dabbac-c073-43b6-a4fc-7c49b98138c3","Type":"ContainerStarted","Data":"349946f011b65721953c3fc83f439c7790fb16b1d0e16ca7903caa2a31e8b8c2"} Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.173838 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-b2w62" event={"ID":"84f12283-3c15-408e-a1a2-691c257434ca","Type":"ContainerStarted","Data":"c363cbbe03729297c63939b6b175e01fbc97becf9074f265f6abadbd096697cf"} Mar 13 09:30:15 crc 
kubenswrapper[4841]: I0313 09:30:15.182297 4841 generic.go:334] "Generic (PLEG): container finished" podID="89099fdc-154e-4290-9dbf-31dad846ead6" containerID="1ea0d6b4991f52ac8bb1bd8ef6f5149bd7286c2405f8fa75aeb65885e492d4e3" exitCode=0 Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.182358 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556570-496sl" event={"ID":"89099fdc-154e-4290-9dbf-31dad846ead6","Type":"ContainerDied","Data":"1ea0d6b4991f52ac8bb1bd8ef6f5149bd7286c2405f8fa75aeb65885e492d4e3"} Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.202839 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bqlfl" event={"ID":"b2bf634d-aa4f-4773-91ee-99616e217c82","Type":"ContainerStarted","Data":"d0334e2088054c50a930818e74150bb4e33571b298c5d768187fd47cc547738c"} Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.204930 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-fzt86" event={"ID":"375ba94d-ec2a-4e94-96fb-2f205f7115b9","Type":"ContainerDied","Data":"da1356a59b2931ea703ef28c1e4e71db31b4ce33b6bb02d5aef208e56ffb6eff"} Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.205034 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fzt86" Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.271129 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fzt86"] Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.277216 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fzt86"] Mar 13 09:30:15 crc kubenswrapper[4841]: E0313 09:30:15.415454 4841 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 13 09:30:15 crc kubenswrapper[4841]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/ef6ab592-8f4f-4c5c-acad-a3420b519edc/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 13 09:30:15 crc kubenswrapper[4841]: > podSandboxID="23a1b499f6eb29fc9ff33fb37b5b2fcd6ab2902542c99300e0a54f4a47de1b42" Mar 13 09:30:15 crc kubenswrapper[4841]: E0313 09:30:15.415835 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 09:30:15 crc kubenswrapper[4841]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p25t9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-h8cgb_openstack(ef6ab592-8f4f-4c5c-acad-a3420b519edc): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/ef6ab592-8f4f-4c5c-acad-a3420b519edc/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 13 09:30:15 crc kubenswrapper[4841]: > logger="UnhandledError" Mar 13 09:30:15 crc kubenswrapper[4841]: E0313 09:30:15.416951 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/ef6ab592-8f4f-4c5c-acad-a3420b519edc/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb" podUID="ef6ab592-8f4f-4c5c-acad-a3420b519edc" Mar 13 09:30:15 crc kubenswrapper[4841]: I0313 09:30:15.866049 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 09:30:16 crc kubenswrapper[4841]: I0313 09:30:16.007058 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="375ba94d-ec2a-4e94-96fb-2f205f7115b9" 
path="/var/lib/kubelet/pods/375ba94d-ec2a-4e94-96fb-2f205f7115b9/volumes" Mar 13 09:30:16 crc kubenswrapper[4841]: I0313 09:30:16.007501 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="487af2ae-3b25-48b9-a77e-fa1ed59ed8a0" path="/var/lib/kubelet/pods/487af2ae-3b25-48b9-a77e-fa1ed59ed8a0/volumes" Mar 13 09:30:16 crc kubenswrapper[4841]: W0313 09:30:16.170555 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f3149e4_fc32_4773_ac07_785c8d11888e.slice/crio-df3329158549ab8f9faeb3b9b31b1d22677134b5833cf740588857fe5438f644 WatchSource:0}: Error finding container df3329158549ab8f9faeb3b9b31b1d22677134b5833cf740588857fe5438f644: Status 404 returned error can't find the container with id df3329158549ab8f9faeb3b9b31b1d22677134b5833cf740588857fe5438f644 Mar 13 09:30:16 crc kubenswrapper[4841]: I0313 09:30:16.214287 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-97qpq" event={"ID":"c88febfd-c495-4b2e-a4fb-fca8f447ef9c","Type":"ContainerStarted","Data":"324964d9fa4a4b07c58725cbb2e6c1ded9ffd6fff9771209e6d7d078c1ab118e"} Mar 13 09:30:16 crc kubenswrapper[4841]: I0313 09:30:16.214770 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-97qpq" Mar 13 09:30:16 crc kubenswrapper[4841]: I0313 09:30:16.240592 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-97qpq" podStartSLOduration=8.208023836 podStartE2EDuration="17.240573606s" podCreationTimestamp="2026-03-13 09:29:59 +0000 UTC" firstStartedPulling="2026-03-13 09:30:05.019677728 +0000 UTC m=+1087.749577919" lastFinishedPulling="2026-03-13 09:30:14.052227498 +0000 UTC m=+1096.782127689" observedRunningTime="2026-03-13 09:30:16.231078173 +0000 UTC m=+1098.960978364" watchObservedRunningTime="2026-03-13 09:30:16.240573606 +0000 UTC m=+1098.970473797" Mar 13 09:30:16 
crc kubenswrapper[4841]: I0313 09:30:16.240992 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5f3149e4-fc32-4773-ac07-785c8d11888e","Type":"ContainerStarted","Data":"df3329158549ab8f9faeb3b9b31b1d22677134b5833cf740588857fe5438f644"} Mar 13 09:30:16 crc kubenswrapper[4841]: I0313 09:30:16.245802 4841 generic.go:334] "Generic (PLEG): container finished" podID="72dabbac-c073-43b6-a4fc-7c49b98138c3" containerID="6d2d0eef62732abc02d17c8a69f28ec235599c13005aeaa02128dde3b7345457" exitCode=0 Mar 13 09:30:16 crc kubenswrapper[4841]: I0313 09:30:16.245932 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556570-cc6b9" event={"ID":"72dabbac-c073-43b6-a4fc-7c49b98138c3","Type":"ContainerDied","Data":"6d2d0eef62732abc02d17c8a69f28ec235599c13005aeaa02128dde3b7345457"} Mar 13 09:30:18 crc kubenswrapper[4841]: I0313 09:30:18.775148 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556570-496sl" Mar 13 09:30:18 crc kubenswrapper[4841]: I0313 09:30:18.787923 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556570-cc6b9" Mar 13 09:30:18 crc kubenswrapper[4841]: I0313 09:30:18.916838 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89099fdc-154e-4290-9dbf-31dad846ead6-config-volume\") pod \"89099fdc-154e-4290-9dbf-31dad846ead6\" (UID: \"89099fdc-154e-4290-9dbf-31dad846ead6\") " Mar 13 09:30:18 crc kubenswrapper[4841]: I0313 09:30:18.916967 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89099fdc-154e-4290-9dbf-31dad846ead6-secret-volume\") pod \"89099fdc-154e-4290-9dbf-31dad846ead6\" (UID: \"89099fdc-154e-4290-9dbf-31dad846ead6\") " Mar 13 09:30:18 crc kubenswrapper[4841]: I0313 09:30:18.917014 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpglz\" (UniqueName: \"kubernetes.io/projected/89099fdc-154e-4290-9dbf-31dad846ead6-kube-api-access-qpglz\") pod \"89099fdc-154e-4290-9dbf-31dad846ead6\" (UID: \"89099fdc-154e-4290-9dbf-31dad846ead6\") " Mar 13 09:30:18 crc kubenswrapper[4841]: I0313 09:30:18.917058 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks68g\" (UniqueName: \"kubernetes.io/projected/72dabbac-c073-43b6-a4fc-7c49b98138c3-kube-api-access-ks68g\") pod \"72dabbac-c073-43b6-a4fc-7c49b98138c3\" (UID: \"72dabbac-c073-43b6-a4fc-7c49b98138c3\") " Mar 13 09:30:18 crc kubenswrapper[4841]: I0313 09:30:18.918415 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89099fdc-154e-4290-9dbf-31dad846ead6-config-volume" (OuterVolumeSpecName: "config-volume") pod "89099fdc-154e-4290-9dbf-31dad846ead6" (UID: "89099fdc-154e-4290-9dbf-31dad846ead6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:30:18 crc kubenswrapper[4841]: I0313 09:30:18.923687 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72dabbac-c073-43b6-a4fc-7c49b98138c3-kube-api-access-ks68g" (OuterVolumeSpecName: "kube-api-access-ks68g") pod "72dabbac-c073-43b6-a4fc-7c49b98138c3" (UID: "72dabbac-c073-43b6-a4fc-7c49b98138c3"). InnerVolumeSpecName "kube-api-access-ks68g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:30:18 crc kubenswrapper[4841]: I0313 09:30:18.929319 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89099fdc-154e-4290-9dbf-31dad846ead6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "89099fdc-154e-4290-9dbf-31dad846ead6" (UID: "89099fdc-154e-4290-9dbf-31dad846ead6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:30:18 crc kubenswrapper[4841]: I0313 09:30:18.930587 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89099fdc-154e-4290-9dbf-31dad846ead6-kube-api-access-qpglz" (OuterVolumeSpecName: "kube-api-access-qpglz") pod "89099fdc-154e-4290-9dbf-31dad846ead6" (UID: "89099fdc-154e-4290-9dbf-31dad846ead6"). InnerVolumeSpecName "kube-api-access-qpglz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:30:19 crc kubenswrapper[4841]: I0313 09:30:19.019618 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpglz\" (UniqueName: \"kubernetes.io/projected/89099fdc-154e-4290-9dbf-31dad846ead6-kube-api-access-qpglz\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:19 crc kubenswrapper[4841]: I0313 09:30:19.019654 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks68g\" (UniqueName: \"kubernetes.io/projected/72dabbac-c073-43b6-a4fc-7c49b98138c3-kube-api-access-ks68g\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:19 crc kubenswrapper[4841]: I0313 09:30:19.019666 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89099fdc-154e-4290-9dbf-31dad846ead6-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:19 crc kubenswrapper[4841]: I0313 09:30:19.019674 4841 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89099fdc-154e-4290-9dbf-31dad846ead6-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:19 crc kubenswrapper[4841]: I0313 09:30:19.265305 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556570-cc6b9" event={"ID":"72dabbac-c073-43b6-a4fc-7c49b98138c3","Type":"ContainerDied","Data":"349946f011b65721953c3fc83f439c7790fb16b1d0e16ca7903caa2a31e8b8c2"} Mar 13 09:30:19 crc kubenswrapper[4841]: I0313 09:30:19.265605 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="349946f011b65721953c3fc83f439c7790fb16b1d0e16ca7903caa2a31e8b8c2" Mar 13 09:30:19 crc kubenswrapper[4841]: I0313 09:30:19.265325 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556570-cc6b9" Mar 13 09:30:19 crc kubenswrapper[4841]: I0313 09:30:19.275920 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556570-496sl" event={"ID":"89099fdc-154e-4290-9dbf-31dad846ead6","Type":"ContainerDied","Data":"19f96a0eac91349a71434d111133f2519bef651e993f98f455ab00c2764dd377"} Mar 13 09:30:19 crc kubenswrapper[4841]: I0313 09:30:19.275947 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19f96a0eac91349a71434d111133f2519bef651e993f98f455ab00c2764dd377" Mar 13 09:30:19 crc kubenswrapper[4841]: I0313 09:30:19.275953 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556570-496sl" Mar 13 09:30:19 crc kubenswrapper[4841]: I0313 09:30:19.857426 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556564-szmt2"] Mar 13 09:30:19 crc kubenswrapper[4841]: I0313 09:30:19.864565 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556564-szmt2"] Mar 13 09:30:20 crc kubenswrapper[4841]: I0313 09:30:20.008597 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c7f9fc4-2398-4146-8d0a-056715def92f" path="/var/lib/kubelet/pods/8c7f9fc4-2398-4146-8d0a-056715def92f/volumes" Mar 13 09:30:24 crc kubenswrapper[4841]: I0313 09:30:24.325326 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"77aa8bf5-4386-4d85-8cca-75c90d5b2593","Type":"ContainerStarted","Data":"34d77494a6ef34bbb24575fc0220442bb1dd63035b7102255b4105f2ded3963a"} Mar 13 09:30:24 crc kubenswrapper[4841]: I0313 09:30:24.334367 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"5f3149e4-fc32-4773-ac07-785c8d11888e","Type":"ContainerStarted","Data":"36f37966f4df123fe378e4f264a4715038366b3d9a4ca7051860daca3fab601b"} Mar 13 09:30:24 crc kubenswrapper[4841]: I0313 09:30:24.339436 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"125cd366-c483-4efa-a55f-85b888bf6266","Type":"ContainerStarted","Data":"1774e05f3182aa6dd2cdbf4141160441da46054b4b5b7687543147e56a209bef"} Mar 13 09:30:24 crc kubenswrapper[4841]: I0313 09:30:24.348468 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"831e87d6-8c27-4e98-8b3e-e6be93a93e51","Type":"ContainerStarted","Data":"310e94228faa4e97630a545eb07e1aa263dbd23d4fd17f22a4ca9d15cc4b110d"} Mar 13 09:30:24 crc kubenswrapper[4841]: I0313 09:30:24.349458 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 13 09:30:24 crc kubenswrapper[4841]: I0313 09:30:24.407719 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=12.162781263 podStartE2EDuration="20.407700491s" podCreationTimestamp="2026-03-13 09:30:04 +0000 UTC" firstStartedPulling="2026-03-13 09:30:14.480647786 +0000 UTC m=+1097.210547967" lastFinishedPulling="2026-03-13 09:30:22.725566974 +0000 UTC m=+1105.455467195" observedRunningTime="2026-03-13 09:30:24.402950555 +0000 UTC m=+1107.132850756" watchObservedRunningTime="2026-03-13 09:30:24.407700491 +0000 UTC m=+1107.137600692" Mar 13 09:30:24 crc kubenswrapper[4841]: I0313 09:30:24.634303 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-97qpq" Mar 13 09:30:24 crc kubenswrapper[4841]: I0313 09:30:24.720380 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-h8cgb"] Mar 13 09:30:25 crc kubenswrapper[4841]: I0313 09:30:25.360818 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-bqlfl" event={"ID":"b2bf634d-aa4f-4773-91ee-99616e217c82","Type":"ContainerStarted","Data":"d5ee2008d83e3311082a663e30aad53401ffe63cb5417548b3f8403faaa8b2b7"} Mar 13 09:30:25 crc kubenswrapper[4841]: I0313 09:30:25.361072 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-bqlfl" Mar 13 09:30:25 crc kubenswrapper[4841]: I0313 09:30:25.365412 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6f270332-4a01-403b-8c06-0f8c0bff6527","Type":"ContainerStarted","Data":"18b82d7d667bf5f2cb4a38afb34a41c960cfe5ff6e84964c33a38c6b1d742611"} Mar 13 09:30:25 crc kubenswrapper[4841]: I0313 09:30:25.368804 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a7d41010-f9ab-45b0-9d21-e05037f26651","Type":"ContainerStarted","Data":"3cad8f83bc789a196e75f2e23f0bb2a62806a9b90fc469a158b6c4565079b82f"} Mar 13 09:30:25 crc kubenswrapper[4841]: I0313 09:30:25.368867 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 13 09:30:25 crc kubenswrapper[4841]: I0313 09:30:25.381382 4841 generic.go:334] "Generic (PLEG): container finished" podID="84f12283-3c15-408e-a1a2-691c257434ca" containerID="34b25d2c8b17203a099bf589b4c181977e72546e147b9f2418a7628eba62bde5" exitCode=0 Mar 13 09:30:25 crc kubenswrapper[4841]: I0313 09:30:25.381471 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-b2w62" event={"ID":"84f12283-3c15-408e-a1a2-691c257434ca","Type":"ContainerDied","Data":"34b25d2c8b17203a099bf589b4c181977e72546e147b9f2418a7628eba62bde5"} Mar 13 09:30:25 crc kubenswrapper[4841]: I0313 09:30:25.382239 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-bqlfl" podStartSLOduration=8.742786832 podStartE2EDuration="17.382221969s" podCreationTimestamp="2026-03-13 09:30:08 +0000 UTC" 
firstStartedPulling="2026-03-13 09:30:14.652205469 +0000 UTC m=+1097.382105660" lastFinishedPulling="2026-03-13 09:30:23.291640576 +0000 UTC m=+1106.021540797" observedRunningTime="2026-03-13 09:30:25.375887795 +0000 UTC m=+1108.105788026" watchObservedRunningTime="2026-03-13 09:30:25.382221969 +0000 UTC m=+1108.112122160" Mar 13 09:30:25 crc kubenswrapper[4841]: I0313 09:30:25.384335 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ea6882c8-841d-4ca7-90a9-3d16c4303a58","Type":"ContainerStarted","Data":"723c75ed37f7f534e4700be3916499abdbad4a5a30193aa4fdacace8cd9cf2e3"} Mar 13 09:30:25 crc kubenswrapper[4841]: I0313 09:30:25.386687 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb" event={"ID":"ef6ab592-8f4f-4c5c-acad-a3420b519edc","Type":"ContainerStarted","Data":"e03dce9e826c71abbb0411000fc6dfd70092214d86c167963c032639e26566b4"} Mar 13 09:30:25 crc kubenswrapper[4841]: I0313 09:30:25.386790 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb" Mar 13 09:30:25 crc kubenswrapper[4841]: I0313 09:30:25.386782 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb" podUID="ef6ab592-8f4f-4c5c-acad-a3420b519edc" containerName="dnsmasq-dns" containerID="cri-o://e03dce9e826c71abbb0411000fc6dfd70092214d86c167963c032639e26566b4" gracePeriod=10 Mar 13 09:30:25 crc kubenswrapper[4841]: I0313 09:30:25.398335 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"984ac552-8ac1-4cbf-ada9-10a9dc02acd9","Type":"ContainerStarted","Data":"b4f21ad6ebdb033b78978f8df86585ae63e4e6a9051d4f3e0bb8b1e98e3bc2fa"} Mar 13 09:30:25 crc kubenswrapper[4841]: I0313 09:30:25.420895 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.296746804 
podStartE2EDuration="19.42087589s" podCreationTimestamp="2026-03-13 09:30:06 +0000 UTC" firstStartedPulling="2026-03-13 09:30:14.640370204 +0000 UTC m=+1097.370270385" lastFinishedPulling="2026-03-13 09:30:23.76449924 +0000 UTC m=+1106.494399471" observedRunningTime="2026-03-13 09:30:25.391243974 +0000 UTC m=+1108.121144165" watchObservedRunningTime="2026-03-13 09:30:25.42087589 +0000 UTC m=+1108.150776081" Mar 13 09:30:25 crc kubenswrapper[4841]: I0313 09:30:25.471086 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb" podStartSLOduration=18.485050211 podStartE2EDuration="27.471066154s" podCreationTimestamp="2026-03-13 09:29:58 +0000 UTC" firstStartedPulling="2026-03-13 09:30:05.032299558 +0000 UTC m=+1087.762199749" lastFinishedPulling="2026-03-13 09:30:14.018315501 +0000 UTC m=+1096.748215692" observedRunningTime="2026-03-13 09:30:25.470836387 +0000 UTC m=+1108.200736598" watchObservedRunningTime="2026-03-13 09:30:25.471066154 +0000 UTC m=+1108.200966345" Mar 13 09:30:26 crc kubenswrapper[4841]: I0313 09:30:26.407370 4841 generic.go:334] "Generic (PLEG): container finished" podID="ef6ab592-8f4f-4c5c-acad-a3420b519edc" containerID="e03dce9e826c71abbb0411000fc6dfd70092214d86c167963c032639e26566b4" exitCode=0 Mar 13 09:30:26 crc kubenswrapper[4841]: I0313 09:30:26.407423 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb" event={"ID":"ef6ab592-8f4f-4c5c-acad-a3420b519edc","Type":"ContainerDied","Data":"e03dce9e826c71abbb0411000fc6dfd70092214d86c167963c032639e26566b4"} Mar 13 09:30:29 crc kubenswrapper[4841]: I0313 09:30:29.345568 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb" podUID="ef6ab592-8f4f-4c5c-acad-a3420b519edc" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.98:5353: connect: connection refused" Mar 13 09:30:30 crc kubenswrapper[4841]: I0313 09:30:30.448972 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-b2w62" event={"ID":"84f12283-3c15-408e-a1a2-691c257434ca","Type":"ContainerStarted","Data":"980eb41e69de30df7101b0914ed66e570a192410e3fc30a9674207efdc113811"} Mar 13 09:30:30 crc kubenswrapper[4841]: I0313 09:30:30.449772 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-b2w62" event={"ID":"84f12283-3c15-408e-a1a2-691c257434ca","Type":"ContainerStarted","Data":"14c2a7b6ae1cd00985312d291a85ea068465a34a1e461bd188405615738ba941"} Mar 13 09:30:30 crc kubenswrapper[4841]: I0313 09:30:30.449849 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-b2w62" Mar 13 09:30:30 crc kubenswrapper[4841]: I0313 09:30:30.476583 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-b2w62" podStartSLOduration=14.566976308 podStartE2EDuration="22.476565094s" podCreationTimestamp="2026-03-13 09:30:08 +0000 UTC" firstStartedPulling="2026-03-13 09:30:14.943237938 +0000 UTC m=+1097.673138129" lastFinishedPulling="2026-03-13 09:30:22.852826714 +0000 UTC m=+1105.582726915" observedRunningTime="2026-03-13 09:30:30.46988383 +0000 UTC m=+1113.199784051" watchObservedRunningTime="2026-03-13 09:30:30.476565094 +0000 UTC m=+1113.206465285" Mar 13 09:30:31 crc kubenswrapper[4841]: I0313 09:30:31.217805 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb" Mar 13 09:30:31 crc kubenswrapper[4841]: I0313 09:30:31.343055 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef6ab592-8f4f-4c5c-acad-a3420b519edc-config\") pod \"ef6ab592-8f4f-4c5c-acad-a3420b519edc\" (UID: \"ef6ab592-8f4f-4c5c-acad-a3420b519edc\") " Mar 13 09:30:31 crc kubenswrapper[4841]: I0313 09:30:31.343316 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p25t9\" (UniqueName: \"kubernetes.io/projected/ef6ab592-8f4f-4c5c-acad-a3420b519edc-kube-api-access-p25t9\") pod \"ef6ab592-8f4f-4c5c-acad-a3420b519edc\" (UID: \"ef6ab592-8f4f-4c5c-acad-a3420b519edc\") " Mar 13 09:30:31 crc kubenswrapper[4841]: I0313 09:30:31.343362 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef6ab592-8f4f-4c5c-acad-a3420b519edc-dns-svc\") pod \"ef6ab592-8f4f-4c5c-acad-a3420b519edc\" (UID: \"ef6ab592-8f4f-4c5c-acad-a3420b519edc\") " Mar 13 09:30:31 crc kubenswrapper[4841]: I0313 09:30:31.349339 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef6ab592-8f4f-4c5c-acad-a3420b519edc-kube-api-access-p25t9" (OuterVolumeSpecName: "kube-api-access-p25t9") pod "ef6ab592-8f4f-4c5c-acad-a3420b519edc" (UID: "ef6ab592-8f4f-4c5c-acad-a3420b519edc"). InnerVolumeSpecName "kube-api-access-p25t9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:30:31 crc kubenswrapper[4841]: I0313 09:30:31.387159 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef6ab592-8f4f-4c5c-acad-a3420b519edc-config" (OuterVolumeSpecName: "config") pod "ef6ab592-8f4f-4c5c-acad-a3420b519edc" (UID: "ef6ab592-8f4f-4c5c-acad-a3420b519edc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:30:31 crc kubenswrapper[4841]: I0313 09:30:31.394532 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef6ab592-8f4f-4c5c-acad-a3420b519edc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ef6ab592-8f4f-4c5c-acad-a3420b519edc" (UID: "ef6ab592-8f4f-4c5c-acad-a3420b519edc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:30:31 crc kubenswrapper[4841]: I0313 09:30:31.444935 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef6ab592-8f4f-4c5c-acad-a3420b519edc-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:31 crc kubenswrapper[4841]: I0313 09:30:31.444972 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p25t9\" (UniqueName: \"kubernetes.io/projected/ef6ab592-8f4f-4c5c-acad-a3420b519edc-kube-api-access-p25t9\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:31 crc kubenswrapper[4841]: I0313 09:30:31.444985 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef6ab592-8f4f-4c5c-acad-a3420b519edc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:31 crc kubenswrapper[4841]: I0313 09:30:31.460082 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb" Mar 13 09:30:31 crc kubenswrapper[4841]: I0313 09:30:31.460082 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-h8cgb" event={"ID":"ef6ab592-8f4f-4c5c-acad-a3420b519edc","Type":"ContainerDied","Data":"23a1b499f6eb29fc9ff33fb37b5b2fcd6ab2902542c99300e0a54f4a47de1b42"} Mar 13 09:30:31 crc kubenswrapper[4841]: I0313 09:30:31.460208 4841 scope.go:117] "RemoveContainer" containerID="e03dce9e826c71abbb0411000fc6dfd70092214d86c167963c032639e26566b4" Mar 13 09:30:31 crc kubenswrapper[4841]: I0313 09:30:31.462575 4841 generic.go:334] "Generic (PLEG): container finished" podID="77aa8bf5-4386-4d85-8cca-75c90d5b2593" containerID="34d77494a6ef34bbb24575fc0220442bb1dd63035b7102255b4105f2ded3963a" exitCode=0 Mar 13 09:30:31 crc kubenswrapper[4841]: I0313 09:30:31.462642 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"77aa8bf5-4386-4d85-8cca-75c90d5b2593","Type":"ContainerDied","Data":"34d77494a6ef34bbb24575fc0220442bb1dd63035b7102255b4105f2ded3963a"} Mar 13 09:30:31 crc kubenswrapper[4841]: I0313 09:30:31.462831 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-b2w62" Mar 13 09:30:31 crc kubenswrapper[4841]: I0313 09:30:31.496897 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-h8cgb"] Mar 13 09:30:31 crc kubenswrapper[4841]: I0313 09:30:31.504720 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-h8cgb"] Mar 13 09:30:31 crc kubenswrapper[4841]: E0313 09:30:31.607176 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef6ab592_8f4f_4c5c_acad_a3420b519edc.slice\": RecentStats: unable to find data in memory cache]" Mar 13 09:30:31 crc 
kubenswrapper[4841]: I0313 09:30:31.639726 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 13 09:30:32 crc kubenswrapper[4841]: I0313 09:30:32.007580 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef6ab592-8f4f-4c5c-acad-a3420b519edc" path="/var/lib/kubelet/pods/ef6ab592-8f4f-4c5c-acad-a3420b519edc/volumes" Mar 13 09:30:32 crc kubenswrapper[4841]: I0313 09:30:32.054565 4841 scope.go:117] "RemoveContainer" containerID="cd2d819e6a838489ab7e71b3b47261c36e918de5315d6930b5efabbf50ef6c2e" Mar 13 09:30:32 crc kubenswrapper[4841]: I0313 09:30:32.470828 4841 generic.go:334] "Generic (PLEG): container finished" podID="125cd366-c483-4efa-a55f-85b888bf6266" containerID="1774e05f3182aa6dd2cdbf4141160441da46054b4b5b7687543147e56a209bef" exitCode=0 Mar 13 09:30:32 crc kubenswrapper[4841]: I0313 09:30:32.470904 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"125cd366-c483-4efa-a55f-85b888bf6266","Type":"ContainerDied","Data":"1774e05f3182aa6dd2cdbf4141160441da46054b4b5b7687543147e56a209bef"} Mar 13 09:30:32 crc kubenswrapper[4841]: I0313 09:30:32.473846 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"984ac552-8ac1-4cbf-ada9-10a9dc02acd9","Type":"ContainerStarted","Data":"cb87799fc29bd06fc9dc502edd609042b27561cdb99c46b1d750ec86aab7c185"} Mar 13 09:30:32 crc kubenswrapper[4841]: I0313 09:30:32.475731 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"77aa8bf5-4386-4d85-8cca-75c90d5b2593","Type":"ContainerStarted","Data":"7f498906f31c227a10b7443644a7ce852daa9ce13dbcd2ec600361a8509322f8"} Mar 13 09:30:32 crc kubenswrapper[4841]: I0313 09:30:32.477973 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"5f3149e4-fc32-4773-ac07-785c8d11888e","Type":"ContainerStarted","Data":"6ca01eaef2b652605f52e3deb7cfff9ce82db0a6f2f7b6d325d7c4d163f375ab"} Mar 13 09:30:32 crc kubenswrapper[4841]: I0313 09:30:32.543524 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.24066477 podStartE2EDuration="21.543509063s" podCreationTimestamp="2026-03-13 09:30:11 +0000 UTC" firstStartedPulling="2026-03-13 09:30:14.840911181 +0000 UTC m=+1097.570811362" lastFinishedPulling="2026-03-13 09:30:32.143755464 +0000 UTC m=+1114.873655655" observedRunningTime="2026-03-13 09:30:32.524830452 +0000 UTC m=+1115.254730663" watchObservedRunningTime="2026-03-13 09:30:32.543509063 +0000 UTC m=+1115.273409254" Mar 13 09:30:32 crc kubenswrapper[4841]: I0313 09:30:32.546202 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=7.599164218 podStartE2EDuration="23.546183445s" podCreationTimestamp="2026-03-13 09:30:09 +0000 UTC" firstStartedPulling="2026-03-13 09:30:16.173817877 +0000 UTC m=+1098.903718068" lastFinishedPulling="2026-03-13 09:30:32.120837104 +0000 UTC m=+1114.850737295" observedRunningTime="2026-03-13 09:30:32.541106919 +0000 UTC m=+1115.271007110" watchObservedRunningTime="2026-03-13 09:30:32.546183445 +0000 UTC m=+1115.276083636" Mar 13 09:30:32 crc kubenswrapper[4841]: I0313 09:30:32.563608 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.060728507 podStartE2EDuration="30.563590127s" podCreationTimestamp="2026-03-13 09:30:02 +0000 UTC" firstStartedPulling="2026-03-13 09:30:14.350107558 +0000 UTC m=+1097.080007749" lastFinishedPulling="2026-03-13 09:30:22.852969178 +0000 UTC m=+1105.582869369" observedRunningTime="2026-03-13 09:30:32.5584661 +0000 UTC m=+1115.288366291" watchObservedRunningTime="2026-03-13 09:30:32.563590127 +0000 UTC 
m=+1115.293490318" Mar 13 09:30:33 crc kubenswrapper[4841]: I0313 09:30:33.190237 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:33 crc kubenswrapper[4841]: I0313 09:30:33.479773 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 13 09:30:33 crc kubenswrapper[4841]: I0313 09:30:33.479916 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 13 09:30:33 crc kubenswrapper[4841]: I0313 09:30:33.488166 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"125cd366-c483-4efa-a55f-85b888bf6266","Type":"ContainerStarted","Data":"2f011c7114a10f38583a2d8469b78f4619904cc68081c45f38fc6a019a34bd08"} Mar 13 09:30:33 crc kubenswrapper[4841]: I0313 09:30:33.511965 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=23.354283421 podStartE2EDuration="33.511943555s" podCreationTimestamp="2026-03-13 09:30:00 +0000 UTC" firstStartedPulling="2026-03-13 09:30:12.983534624 +0000 UTC m=+1095.713434815" lastFinishedPulling="2026-03-13 09:30:23.141194718 +0000 UTC m=+1105.871094949" observedRunningTime="2026-03-13 09:30:33.50916042 +0000 UTC m=+1116.239060621" watchObservedRunningTime="2026-03-13 09:30:33.511943555 +0000 UTC m=+1116.241843746" Mar 13 09:30:34 crc kubenswrapper[4841]: I0313 09:30:34.189439 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:34 crc kubenswrapper[4841]: I0313 09:30:34.237884 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:34 crc kubenswrapper[4841]: I0313 09:30:34.568508 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 13 09:30:34 crc 
kubenswrapper[4841]: I0313 09:30:34.637354 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:34 crc kubenswrapper[4841]: I0313 09:30:34.698312 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:34 crc kubenswrapper[4841]: I0313 09:30:34.834729 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-l2k2q"] Mar 13 09:30:34 crc kubenswrapper[4841]: E0313 09:30:34.835005 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef6ab592-8f4f-4c5c-acad-a3420b519edc" containerName="init" Mar 13 09:30:34 crc kubenswrapper[4841]: I0313 09:30:34.835018 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef6ab592-8f4f-4c5c-acad-a3420b519edc" containerName="init" Mar 13 09:30:34 crc kubenswrapper[4841]: E0313 09:30:34.835055 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72dabbac-c073-43b6-a4fc-7c49b98138c3" containerName="oc" Mar 13 09:30:34 crc kubenswrapper[4841]: I0313 09:30:34.835061 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="72dabbac-c073-43b6-a4fc-7c49b98138c3" containerName="oc" Mar 13 09:30:34 crc kubenswrapper[4841]: E0313 09:30:34.835073 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef6ab592-8f4f-4c5c-acad-a3420b519edc" containerName="dnsmasq-dns" Mar 13 09:30:34 crc kubenswrapper[4841]: I0313 09:30:34.835079 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef6ab592-8f4f-4c5c-acad-a3420b519edc" containerName="dnsmasq-dns" Mar 13 09:30:34 crc kubenswrapper[4841]: E0313 09:30:34.835095 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89099fdc-154e-4290-9dbf-31dad846ead6" containerName="collect-profiles" Mar 13 09:30:34 crc kubenswrapper[4841]: I0313 09:30:34.835101 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="89099fdc-154e-4290-9dbf-31dad846ead6" 
containerName="collect-profiles" Mar 13 09:30:34 crc kubenswrapper[4841]: I0313 09:30:34.835256 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="89099fdc-154e-4290-9dbf-31dad846ead6" containerName="collect-profiles" Mar 13 09:30:34 crc kubenswrapper[4841]: I0313 09:30:34.835298 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef6ab592-8f4f-4c5c-acad-a3420b519edc" containerName="dnsmasq-dns" Mar 13 09:30:34 crc kubenswrapper[4841]: I0313 09:30:34.835311 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="72dabbac-c073-43b6-a4fc-7c49b98138c3" containerName="oc" Mar 13 09:30:34 crc kubenswrapper[4841]: I0313 09:30:34.836040 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-l2k2q" Mar 13 09:30:34 crc kubenswrapper[4841]: I0313 09:30:34.839584 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 13 09:30:34 crc kubenswrapper[4841]: I0313 09:30:34.900160 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-l2k2q"] Mar 13 09:30:34 crc kubenswrapper[4841]: I0313 09:30:34.901047 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b139e10-24dd-44b3-b0f1-f8206907b529-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-l2k2q\" (UID: \"6b139e10-24dd-44b3-b0f1-f8206907b529\") " pod="openstack/dnsmasq-dns-7fd796d7df-l2k2q" Mar 13 09:30:34 crc kubenswrapper[4841]: I0313 09:30:34.901093 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b139e10-24dd-44b3-b0f1-f8206907b529-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-l2k2q\" (UID: \"6b139e10-24dd-44b3-b0f1-f8206907b529\") " pod="openstack/dnsmasq-dns-7fd796d7df-l2k2q" Mar 13 09:30:34 crc kubenswrapper[4841]: I0313 
09:30:34.901111 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssxq7\" (UniqueName: \"kubernetes.io/projected/6b139e10-24dd-44b3-b0f1-f8206907b529-kube-api-access-ssxq7\") pod \"dnsmasq-dns-7fd796d7df-l2k2q\" (UID: \"6b139e10-24dd-44b3-b0f1-f8206907b529\") " pod="openstack/dnsmasq-dns-7fd796d7df-l2k2q" Mar 13 09:30:34 crc kubenswrapper[4841]: I0313 09:30:34.901133 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b139e10-24dd-44b3-b0f1-f8206907b529-config\") pod \"dnsmasq-dns-7fd796d7df-l2k2q\" (UID: \"6b139e10-24dd-44b3-b0f1-f8206907b529\") " pod="openstack/dnsmasq-dns-7fd796d7df-l2k2q" Mar 13 09:30:34 crc kubenswrapper[4841]: I0313 09:30:34.959134 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-66hkf"] Mar 13 09:30:34 crc kubenswrapper[4841]: I0313 09:30:34.960128 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-66hkf" Mar 13 09:30:34 crc kubenswrapper[4841]: W0313 09:30:34.963458 4841 reflector.go:561] object-"openstack"/"ovncontroller-metrics-config": failed to list *v1.ConfigMap: configmaps "ovncontroller-metrics-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Mar 13 09:30:34 crc kubenswrapper[4841]: E0313 09:30:34.963511 4841 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"ovncontroller-metrics-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovncontroller-metrics-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.005712 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb25dd49-bdd9-46c0-816f-5de963506142-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-66hkf\" (UID: \"fb25dd49-bdd9-46c0-816f-5de963506142\") " pod="openstack/ovn-controller-metrics-66hkf" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.005756 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb25dd49-bdd9-46c0-816f-5de963506142-combined-ca-bundle\") pod \"ovn-controller-metrics-66hkf\" (UID: \"fb25dd49-bdd9-46c0-816f-5de963506142\") " pod="openstack/ovn-controller-metrics-66hkf" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.005787 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8m25\" (UniqueName: 
\"kubernetes.io/projected/fb25dd49-bdd9-46c0-816f-5de963506142-kube-api-access-k8m25\") pod \"ovn-controller-metrics-66hkf\" (UID: \"fb25dd49-bdd9-46c0-816f-5de963506142\") " pod="openstack/ovn-controller-metrics-66hkf" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.005812 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/fb25dd49-bdd9-46c0-816f-5de963506142-ovn-rundir\") pod \"ovn-controller-metrics-66hkf\" (UID: \"fb25dd49-bdd9-46c0-816f-5de963506142\") " pod="openstack/ovn-controller-metrics-66hkf" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.005881 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b139e10-24dd-44b3-b0f1-f8206907b529-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-l2k2q\" (UID: \"6b139e10-24dd-44b3-b0f1-f8206907b529\") " pod="openstack/dnsmasq-dns-7fd796d7df-l2k2q" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.005905 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b139e10-24dd-44b3-b0f1-f8206907b529-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-l2k2q\" (UID: \"6b139e10-24dd-44b3-b0f1-f8206907b529\") " pod="openstack/dnsmasq-dns-7fd796d7df-l2k2q" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.005919 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssxq7\" (UniqueName: \"kubernetes.io/projected/6b139e10-24dd-44b3-b0f1-f8206907b529-kube-api-access-ssxq7\") pod \"dnsmasq-dns-7fd796d7df-l2k2q\" (UID: \"6b139e10-24dd-44b3-b0f1-f8206907b529\") " pod="openstack/dnsmasq-dns-7fd796d7df-l2k2q" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.005941 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6b139e10-24dd-44b3-b0f1-f8206907b529-config\") pod \"dnsmasq-dns-7fd796d7df-l2k2q\" (UID: \"6b139e10-24dd-44b3-b0f1-f8206907b529\") " pod="openstack/dnsmasq-dns-7fd796d7df-l2k2q" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.005971 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/fb25dd49-bdd9-46c0-816f-5de963506142-ovs-rundir\") pod \"ovn-controller-metrics-66hkf\" (UID: \"fb25dd49-bdd9-46c0-816f-5de963506142\") " pod="openstack/ovn-controller-metrics-66hkf" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.006000 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb25dd49-bdd9-46c0-816f-5de963506142-config\") pod \"ovn-controller-metrics-66hkf\" (UID: \"fb25dd49-bdd9-46c0-816f-5de963506142\") " pod="openstack/ovn-controller-metrics-66hkf" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.006848 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b139e10-24dd-44b3-b0f1-f8206907b529-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-l2k2q\" (UID: \"6b139e10-24dd-44b3-b0f1-f8206907b529\") " pod="openstack/dnsmasq-dns-7fd796d7df-l2k2q" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.007237 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b139e10-24dd-44b3-b0f1-f8206907b529-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-l2k2q\" (UID: \"6b139e10-24dd-44b3-b0f1-f8206907b529\") " pod="openstack/dnsmasq-dns-7fd796d7df-l2k2q" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.007320 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b139e10-24dd-44b3-b0f1-f8206907b529-config\") pod 
\"dnsmasq-dns-7fd796d7df-l2k2q\" (UID: \"6b139e10-24dd-44b3-b0f1-f8206907b529\") " pod="openstack/dnsmasq-dns-7fd796d7df-l2k2q" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.011478 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-66hkf"] Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.049025 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssxq7\" (UniqueName: \"kubernetes.io/projected/6b139e10-24dd-44b3-b0f1-f8206907b529-kube-api-access-ssxq7\") pod \"dnsmasq-dns-7fd796d7df-l2k2q\" (UID: \"6b139e10-24dd-44b3-b0f1-f8206907b529\") " pod="openstack/dnsmasq-dns-7fd796d7df-l2k2q" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.104010 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-l2k2q"] Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.104705 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-l2k2q" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.107127 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/fb25dd49-bdd9-46c0-816f-5de963506142-ovs-rundir\") pod \"ovn-controller-metrics-66hkf\" (UID: \"fb25dd49-bdd9-46c0-816f-5de963506142\") " pod="openstack/ovn-controller-metrics-66hkf" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.107192 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb25dd49-bdd9-46c0-816f-5de963506142-config\") pod \"ovn-controller-metrics-66hkf\" (UID: \"fb25dd49-bdd9-46c0-816f-5de963506142\") " pod="openstack/ovn-controller-metrics-66hkf" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.107259 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fb25dd49-bdd9-46c0-816f-5de963506142-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-66hkf\" (UID: \"fb25dd49-bdd9-46c0-816f-5de963506142\") " pod="openstack/ovn-controller-metrics-66hkf" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.107305 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb25dd49-bdd9-46c0-816f-5de963506142-combined-ca-bundle\") pod \"ovn-controller-metrics-66hkf\" (UID: \"fb25dd49-bdd9-46c0-816f-5de963506142\") " pod="openstack/ovn-controller-metrics-66hkf" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.107331 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8m25\" (UniqueName: \"kubernetes.io/projected/fb25dd49-bdd9-46c0-816f-5de963506142-kube-api-access-k8m25\") pod \"ovn-controller-metrics-66hkf\" (UID: \"fb25dd49-bdd9-46c0-816f-5de963506142\") " pod="openstack/ovn-controller-metrics-66hkf" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.107350 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/fb25dd49-bdd9-46c0-816f-5de963506142-ovn-rundir\") pod \"ovn-controller-metrics-66hkf\" (UID: \"fb25dd49-bdd9-46c0-816f-5de963506142\") " pod="openstack/ovn-controller-metrics-66hkf" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.107645 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/fb25dd49-bdd9-46c0-816f-5de963506142-ovn-rundir\") pod \"ovn-controller-metrics-66hkf\" (UID: \"fb25dd49-bdd9-46c0-816f-5de963506142\") " pod="openstack/ovn-controller-metrics-66hkf" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.108519 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/fb25dd49-bdd9-46c0-816f-5de963506142-ovs-rundir\") pod \"ovn-controller-metrics-66hkf\" (UID: \"fb25dd49-bdd9-46c0-816f-5de963506142\") " pod="openstack/ovn-controller-metrics-66hkf" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.123800 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb25dd49-bdd9-46c0-816f-5de963506142-combined-ca-bundle\") pod \"ovn-controller-metrics-66hkf\" (UID: \"fb25dd49-bdd9-46c0-816f-5de963506142\") " pod="openstack/ovn-controller-metrics-66hkf" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.128500 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-64ds6"] Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.128751 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb25dd49-bdd9-46c0-816f-5de963506142-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-66hkf\" (UID: \"fb25dd49-bdd9-46c0-816f-5de963506142\") " pod="openstack/ovn-controller-metrics-66hkf" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.133961 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8m25\" (UniqueName: \"kubernetes.io/projected/fb25dd49-bdd9-46c0-816f-5de963506142-kube-api-access-k8m25\") pod \"ovn-controller-metrics-66hkf\" (UID: \"fb25dd49-bdd9-46c0-816f-5de963506142\") " pod="openstack/ovn-controller-metrics-66hkf" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.139165 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.142561 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.155982 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-64ds6"] Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.209138 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-64ds6\" (UID: \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\") " pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.209514 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-64ds6\" (UID: \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\") " pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.209650 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-884v7\" (UniqueName: \"kubernetes.io/projected/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-kube-api-access-884v7\") pod \"dnsmasq-dns-86db49b7ff-64ds6\" (UID: \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\") " pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.209760 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-config\") pod \"dnsmasq-dns-86db49b7ff-64ds6\" (UID: \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.209874 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-64ds6\" (UID: \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\") " pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.313087 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-64ds6\" (UID: \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\") " pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.313139 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-884v7\" (UniqueName: \"kubernetes.io/projected/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-kube-api-access-884v7\") pod \"dnsmasq-dns-86db49b7ff-64ds6\" (UID: \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\") " pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.313165 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-config\") pod \"dnsmasq-dns-86db49b7ff-64ds6\" (UID: \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\") " pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.313185 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-64ds6\" (UID: \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\") " pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" Mar 13 09:30:35 crc 
kubenswrapper[4841]: I0313 09:30:35.313379 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-64ds6\" (UID: \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\") " pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.314409 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-64ds6\" (UID: \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\") " pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.314999 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-config\") pod \"dnsmasq-dns-86db49b7ff-64ds6\" (UID: \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\") " pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.315510 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-64ds6\" (UID: \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\") " pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.317405 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-64ds6\" (UID: \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\") " pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.337865 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-884v7\" (UniqueName: \"kubernetes.io/projected/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-kube-api-access-884v7\") pod \"dnsmasq-dns-86db49b7ff-64ds6\" (UID: \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\") " pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.504658 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.543602 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.545872 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.623860 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-l2k2q"] Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.735824 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.741863 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.763442 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.763703 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.763836 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.764591 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-m4szq" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.769036 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.825602 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/32dcacb9-78d3-4dd5-95e4-6d069bddc9e3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3\") " pod="openstack/ovn-northd-0" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.825663 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dcacb9-78d3-4dd5-95e4-6d069bddc9e3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3\") " pod="openstack/ovn-northd-0" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.825702 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dcacb9-78d3-4dd5-95e4-6d069bddc9e3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3\") " 
pod="openstack/ovn-northd-0" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.825721 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32dcacb9-78d3-4dd5-95e4-6d069bddc9e3-scripts\") pod \"ovn-northd-0\" (UID: \"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3\") " pod="openstack/ovn-northd-0" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.825745 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dcacb9-78d3-4dd5-95e4-6d069bddc9e3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3\") " pod="openstack/ovn-northd-0" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.825770 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32dcacb9-78d3-4dd5-95e4-6d069bddc9e3-config\") pod \"ovn-northd-0\" (UID: \"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3\") " pod="openstack/ovn-northd-0" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.825812 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvhqv\" (UniqueName: \"kubernetes.io/projected/32dcacb9-78d3-4dd5-95e4-6d069bddc9e3-kube-api-access-mvhqv\") pod \"ovn-northd-0\" (UID: \"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3\") " pod="openstack/ovn-northd-0" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.918782 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.927328 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32dcacb9-78d3-4dd5-95e4-6d069bddc9e3-config\") pod \"ovn-northd-0\" (UID: 
\"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3\") " pod="openstack/ovn-northd-0" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.927410 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvhqv\" (UniqueName: \"kubernetes.io/projected/32dcacb9-78d3-4dd5-95e4-6d069bddc9e3-kube-api-access-mvhqv\") pod \"ovn-northd-0\" (UID: \"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3\") " pod="openstack/ovn-northd-0" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.927478 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/32dcacb9-78d3-4dd5-95e4-6d069bddc9e3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3\") " pod="openstack/ovn-northd-0" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.927521 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dcacb9-78d3-4dd5-95e4-6d069bddc9e3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3\") " pod="openstack/ovn-northd-0" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.927571 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dcacb9-78d3-4dd5-95e4-6d069bddc9e3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3\") " pod="openstack/ovn-northd-0" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.927598 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32dcacb9-78d3-4dd5-95e4-6d069bddc9e3-scripts\") pod \"ovn-northd-0\" (UID: \"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3\") " pod="openstack/ovn-northd-0" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.927629 4841 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dcacb9-78d3-4dd5-95e4-6d069bddc9e3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3\") " pod="openstack/ovn-northd-0" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.928484 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32dcacb9-78d3-4dd5-95e4-6d069bddc9e3-config\") pod \"ovn-northd-0\" (UID: \"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3\") " pod="openstack/ovn-northd-0" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.928828 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/32dcacb9-78d3-4dd5-95e4-6d069bddc9e3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3\") " pod="openstack/ovn-northd-0" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.928889 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32dcacb9-78d3-4dd5-95e4-6d069bddc9e3-scripts\") pod \"ovn-northd-0\" (UID: \"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3\") " pod="openstack/ovn-northd-0" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.929181 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb25dd49-bdd9-46c0-816f-5de963506142-config\") pod \"ovn-controller-metrics-66hkf\" (UID: \"fb25dd49-bdd9-46c0-816f-5de963506142\") " pod="openstack/ovn-controller-metrics-66hkf" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.931542 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dcacb9-78d3-4dd5-95e4-6d069bddc9e3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3\") " pod="openstack/ovn-northd-0" Mar 13 09:30:35 crc 
kubenswrapper[4841]: I0313 09:30:35.931835 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dcacb9-78d3-4dd5-95e4-6d069bddc9e3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3\") " pod="openstack/ovn-northd-0" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.937299 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dcacb9-78d3-4dd5-95e4-6d069bddc9e3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3\") " pod="openstack/ovn-northd-0" Mar 13 09:30:35 crc kubenswrapper[4841]: I0313 09:30:35.943648 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvhqv\" (UniqueName: \"kubernetes.io/projected/32dcacb9-78d3-4dd5-95e4-6d069bddc9e3-kube-api-access-mvhqv\") pod \"ovn-northd-0\" (UID: \"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3\") " pod="openstack/ovn-northd-0" Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.096159 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.179529 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-66hkf" Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.199013 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-64ds6"] Mar 13 09:30:36 crc kubenswrapper[4841]: W0313 09:30:36.215815 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae9cfa21_7eab_4c3a_a1ed_a0961f50b96e.slice/crio-058b7c2a8c0279f31646915c15960eb156c406756587a3804e02a1cd6b14cf6e WatchSource:0}: Error finding container 058b7c2a8c0279f31646915c15960eb156c406756587a3804e02a1cd6b14cf6e: Status 404 returned error can't find the container with id 058b7c2a8c0279f31646915c15960eb156c406756587a3804e02a1cd6b14cf6e Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.504793 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.515726 4841 generic.go:334] "Generic (PLEG): container finished" podID="6b139e10-24dd-44b3-b0f1-f8206907b529" containerID="53d8b2cd85d7d3d55f76e936566ce6187b6dc913b757b2712c362b9937189a23" exitCode=0 Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.515784 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-l2k2q" event={"ID":"6b139e10-24dd-44b3-b0f1-f8206907b529","Type":"ContainerDied","Data":"53d8b2cd85d7d3d55f76e936566ce6187b6dc913b757b2712c362b9937189a23"} Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.515808 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-l2k2q" event={"ID":"6b139e10-24dd-44b3-b0f1-f8206907b529","Type":"ContainerStarted","Data":"78241ab1a38fea870a8cbae23d2260ab455abca6a092b84b1a0ba23aa95ad039"} Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.517448 4841 generic.go:334] "Generic (PLEG): container finished" podID="ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e" 
containerID="4ef21be765542894d9eff33a97013abfb5187e51c4ec40d704394d8e26c1ee3c" exitCode=0 Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.518413 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" event={"ID":"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e","Type":"ContainerDied","Data":"4ef21be765542894d9eff33a97013abfb5187e51c4ec40d704394d8e26c1ee3c"} Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.518477 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" event={"ID":"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e","Type":"ContainerStarted","Data":"058b7c2a8c0279f31646915c15960eb156c406756587a3804e02a1cd6b14cf6e"} Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.520219 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-64ds6"] Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.579087 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.628353 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-2g4hc"] Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.630730 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2g4hc" Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.678051 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2g4hc"] Mar 13 09:30:36 crc kubenswrapper[4841]: W0313 09:30:36.678289 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb25dd49_bdd9_46c0_816f_5de963506142.slice/crio-e86334fb18a1aacca4b547a452fe4be29b0b92eaa68b0cbabb828bb6ad88921b WatchSource:0}: Error finding container e86334fb18a1aacca4b547a452fe4be29b0b92eaa68b0cbabb828bb6ad88921b: Status 404 returned error can't find the container with id e86334fb18a1aacca4b547a452fe4be29b0b92eaa68b0cbabb828bb6ad88921b Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.687979 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-66hkf"] Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.761826 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-2g4hc\" (UID: \"900f1d10-4a56-461f-81cc-caea5f1b88c8\") " pod="openstack/dnsmasq-dns-698758b865-2g4hc" Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.761870 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-config\") pod \"dnsmasq-dns-698758b865-2g4hc\" (UID: \"900f1d10-4a56-461f-81cc-caea5f1b88c8\") " pod="openstack/dnsmasq-dns-698758b865-2g4hc" Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.761890 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7j9v\" (UniqueName: 
\"kubernetes.io/projected/900f1d10-4a56-461f-81cc-caea5f1b88c8-kube-api-access-p7j9v\") pod \"dnsmasq-dns-698758b865-2g4hc\" (UID: \"900f1d10-4a56-461f-81cc-caea5f1b88c8\") " pod="openstack/dnsmasq-dns-698758b865-2g4hc" Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.761963 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-2g4hc\" (UID: \"900f1d10-4a56-461f-81cc-caea5f1b88c8\") " pod="openstack/dnsmasq-dns-698758b865-2g4hc" Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.762010 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-dns-svc\") pod \"dnsmasq-dns-698758b865-2g4hc\" (UID: \"900f1d10-4a56-461f-81cc-caea5f1b88c8\") " pod="openstack/dnsmasq-dns-698758b865-2g4hc" Mar 13 09:30:36 crc kubenswrapper[4841]: E0313 09:30:36.832678 4841 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 13 09:30:36 crc kubenswrapper[4841]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 13 09:30:36 crc kubenswrapper[4841]: > podSandboxID="058b7c2a8c0279f31646915c15960eb156c406756587a3804e02a1cd6b14cf6e" Mar 13 09:30:36 crc kubenswrapper[4841]: E0313 09:30:36.833083 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 09:30:36 crc kubenswrapper[4841]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces 
--listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n599h5cbh7ch5d4h66fh676hdbh546h95h88h5ffh55ch7fhch57ch687hddhc7h5fdh57dh674h56fh64ch98h9bh557h55dh646h54ch54fh5c4h597q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-884v7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86db49b7ff-64ds6_openstack(ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 13 09:30:36 crc kubenswrapper[4841]: > logger="UnhandledError" Mar 13 09:30:36 crc kubenswrapper[4841]: E0313 09:30:36.834172 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" podUID="ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e" Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.863183 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-dns-svc\") pod \"dnsmasq-dns-698758b865-2g4hc\" (UID: \"900f1d10-4a56-461f-81cc-caea5f1b88c8\") " 
pod="openstack/dnsmasq-dns-698758b865-2g4hc" Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.863320 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-2g4hc\" (UID: \"900f1d10-4a56-461f-81cc-caea5f1b88c8\") " pod="openstack/dnsmasq-dns-698758b865-2g4hc" Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.863351 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-config\") pod \"dnsmasq-dns-698758b865-2g4hc\" (UID: \"900f1d10-4a56-461f-81cc-caea5f1b88c8\") " pod="openstack/dnsmasq-dns-698758b865-2g4hc" Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.863372 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7j9v\" (UniqueName: \"kubernetes.io/projected/900f1d10-4a56-461f-81cc-caea5f1b88c8-kube-api-access-p7j9v\") pod \"dnsmasq-dns-698758b865-2g4hc\" (UID: \"900f1d10-4a56-461f-81cc-caea5f1b88c8\") " pod="openstack/dnsmasq-dns-698758b865-2g4hc" Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.863435 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-2g4hc\" (UID: \"900f1d10-4a56-461f-81cc-caea5f1b88c8\") " pod="openstack/dnsmasq-dns-698758b865-2g4hc" Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.864315 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-2g4hc\" (UID: \"900f1d10-4a56-461f-81cc-caea5f1b88c8\") " pod="openstack/dnsmasq-dns-698758b865-2g4hc" Mar 13 09:30:36 crc 
kubenswrapper[4841]: I0313 09:30:36.864960 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-dns-svc\") pod \"dnsmasq-dns-698758b865-2g4hc\" (UID: \"900f1d10-4a56-461f-81cc-caea5f1b88c8\") " pod="openstack/dnsmasq-dns-698758b865-2g4hc" Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.865402 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-config\") pod \"dnsmasq-dns-698758b865-2g4hc\" (UID: \"900f1d10-4a56-461f-81cc-caea5f1b88c8\") " pod="openstack/dnsmasq-dns-698758b865-2g4hc" Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.866010 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-2g4hc\" (UID: \"900f1d10-4a56-461f-81cc-caea5f1b88c8\") " pod="openstack/dnsmasq-dns-698758b865-2g4hc" Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.882767 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7j9v\" (UniqueName: \"kubernetes.io/projected/900f1d10-4a56-461f-81cc-caea5f1b88c8-kube-api-access-p7j9v\") pod \"dnsmasq-dns-698758b865-2g4hc\" (UID: \"900f1d10-4a56-461f-81cc-caea5f1b88c8\") " pod="openstack/dnsmasq-dns-698758b865-2g4hc" Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.901752 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-l2k2q" Mar 13 09:30:36 crc kubenswrapper[4841]: I0313 09:30:36.982147 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2g4hc" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.067821 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b139e10-24dd-44b3-b0f1-f8206907b529-ovsdbserver-nb\") pod \"6b139e10-24dd-44b3-b0f1-f8206907b529\" (UID: \"6b139e10-24dd-44b3-b0f1-f8206907b529\") " Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.067888 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b139e10-24dd-44b3-b0f1-f8206907b529-config\") pod \"6b139e10-24dd-44b3-b0f1-f8206907b529\" (UID: \"6b139e10-24dd-44b3-b0f1-f8206907b529\") " Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.068037 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b139e10-24dd-44b3-b0f1-f8206907b529-dns-svc\") pod \"6b139e10-24dd-44b3-b0f1-f8206907b529\" (UID: \"6b139e10-24dd-44b3-b0f1-f8206907b529\") " Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.068124 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssxq7\" (UniqueName: \"kubernetes.io/projected/6b139e10-24dd-44b3-b0f1-f8206907b529-kube-api-access-ssxq7\") pod \"6b139e10-24dd-44b3-b0f1-f8206907b529\" (UID: \"6b139e10-24dd-44b3-b0f1-f8206907b529\") " Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.075004 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b139e10-24dd-44b3-b0f1-f8206907b529-kube-api-access-ssxq7" (OuterVolumeSpecName: "kube-api-access-ssxq7") pod "6b139e10-24dd-44b3-b0f1-f8206907b529" (UID: "6b139e10-24dd-44b3-b0f1-f8206907b529"). InnerVolumeSpecName "kube-api-access-ssxq7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.088917 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b139e10-24dd-44b3-b0f1-f8206907b529-config" (OuterVolumeSpecName: "config") pod "6b139e10-24dd-44b3-b0f1-f8206907b529" (UID: "6b139e10-24dd-44b3-b0f1-f8206907b529"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.089772 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b139e10-24dd-44b3-b0f1-f8206907b529-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6b139e10-24dd-44b3-b0f1-f8206907b529" (UID: "6b139e10-24dd-44b3-b0f1-f8206907b529"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.095499 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b139e10-24dd-44b3-b0f1-f8206907b529-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6b139e10-24dd-44b3-b0f1-f8206907b529" (UID: "6b139e10-24dd-44b3-b0f1-f8206907b529"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.170944 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b139e10-24dd-44b3-b0f1-f8206907b529-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.170978 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b139e10-24dd-44b3-b0f1-f8206907b529-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.170991 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b139e10-24dd-44b3-b0f1-f8206907b529-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.171005 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssxq7\" (UniqueName: \"kubernetes.io/projected/6b139e10-24dd-44b3-b0f1-f8206907b529-kube-api-access-ssxq7\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.400025 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2g4hc"] Mar 13 09:30:37 crc kubenswrapper[4841]: W0313 09:30:37.408592 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod900f1d10_4a56_461f_81cc_caea5f1b88c8.slice/crio-a6dedda5a1785c6e443bedeb940f4cd9369de74476c9d7095912967fd5410328 WatchSource:0}: Error finding container a6dedda5a1785c6e443bedeb940f4cd9369de74476c9d7095912967fd5410328: Status 404 returned error can't find the container with id a6dedda5a1785c6e443bedeb940f4cd9369de74476c9d7095912967fd5410328 Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.531870 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2g4hc" 
event={"ID":"900f1d10-4a56-461f-81cc-caea5f1b88c8","Type":"ContainerStarted","Data":"a6dedda5a1785c6e443bedeb940f4cd9369de74476c9d7095912967fd5410328"} Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.535160 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-66hkf" event={"ID":"fb25dd49-bdd9-46c0-816f-5de963506142","Type":"ContainerStarted","Data":"618e4679c3b51e983097ee2062d050358fe7929d1120be57c24a5bd80cfaa368"} Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.535710 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-66hkf" event={"ID":"fb25dd49-bdd9-46c0-816f-5de963506142","Type":"ContainerStarted","Data":"e86334fb18a1aacca4b547a452fe4be29b0b92eaa68b0cbabb828bb6ad88921b"} Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.537058 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3","Type":"ContainerStarted","Data":"a6e347201c6c6f8f11f480e4ceecab8ff2fbf4ad34fc5de9c6612f844c3634e8"} Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.546431 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-l2k2q" event={"ID":"6b139e10-24dd-44b3-b0f1-f8206907b529","Type":"ContainerDied","Data":"78241ab1a38fea870a8cbae23d2260ab455abca6a092b84b1a0ba23aa95ad039"} Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.546499 4841 scope.go:117] "RemoveContainer" containerID="53d8b2cd85d7d3d55f76e936566ce6187b6dc913b757b2712c362b9937189a23" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.547164 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-l2k2q" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.621574 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-66hkf" podStartSLOduration=3.6215532809999997 podStartE2EDuration="3.621553281s" podCreationTimestamp="2026-03-13 09:30:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:30:37.549694114 +0000 UTC m=+1120.279594305" watchObservedRunningTime="2026-03-13 09:30:37.621553281 +0000 UTC m=+1120.351453472" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.670166 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-l2k2q"] Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.679438 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-l2k2q"] Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.686917 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 13 09:30:37 crc kubenswrapper[4841]: E0313 09:30:37.687378 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b139e10-24dd-44b3-b0f1-f8206907b529" containerName="init" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.687403 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b139e10-24dd-44b3-b0f1-f8206907b529" containerName="init" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.687609 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b139e10-24dd-44b3-b0f1-f8206907b529" containerName="init" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.692479 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.695939 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-2qq9v" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.697138 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.697608 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.699309 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.721458 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.807809 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.807862 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64f62ec7-7a91-458c-86cb-7658544e4a51-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.807892 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-etc-swift\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 
09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.807915 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/64f62ec7-7a91-458c-86cb-7658544e4a51-cache\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.807937 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/64f62ec7-7a91-458c-86cb-7658544e4a51-lock\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.807981 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h2x2\" (UniqueName: \"kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-kube-api-access-4h2x2\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.909768 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h2x2\" (UniqueName: \"kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-kube-api-access-4h2x2\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.909888 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.909928 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/64f62ec7-7a91-458c-86cb-7658544e4a51-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.909955 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-etc-swift\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.909984 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/64f62ec7-7a91-458c-86cb-7658544e4a51-cache\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.910013 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/64f62ec7-7a91-458c-86cb-7658544e4a51-lock\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.910568 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/64f62ec7-7a91-458c-86cb-7658544e4a51-lock\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:30:37 crc kubenswrapper[4841]: E0313 09:30:37.910608 4841 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 09:30:37 crc kubenswrapper[4841]: E0313 09:30:37.910626 4841 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 09:30:37 crc 
kubenswrapper[4841]: E0313 09:30:37.910664 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-etc-swift podName:64f62ec7-7a91-458c-86cb-7658544e4a51 nodeName:}" failed. No retries permitted until 2026-03-13 09:30:38.410650807 +0000 UTC m=+1121.140550998 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-etc-swift") pod "swift-storage-0" (UID: "64f62ec7-7a91-458c-86cb-7658544e4a51") : configmap "swift-ring-files" not found Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.910975 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/swift-storage-0" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.912390 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/64f62ec7-7a91-458c-86cb-7658544e4a51-cache\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.918049 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64f62ec7-7a91-458c-86cb-7658544e4a51-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.926077 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h2x2\" (UniqueName: \"kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-kube-api-access-4h2x2\") pod \"swift-storage-0\" (UID: 
\"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:30:37 crc kubenswrapper[4841]: I0313 09:30:37.932570 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.012397 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b139e10-24dd-44b3-b0f1-f8206907b529" path="/var/lib/kubelet/pods/6b139e10-24dd-44b3-b0f1-f8206907b529/volumes" Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.421134 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-etc-swift\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:30:38 crc kubenswrapper[4841]: E0313 09:30:38.421313 4841 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 09:30:38 crc kubenswrapper[4841]: E0313 09:30:38.421342 4841 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 09:30:38 crc kubenswrapper[4841]: E0313 09:30:38.421407 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-etc-swift podName:64f62ec7-7a91-458c-86cb-7658544e4a51 nodeName:}" failed. No retries permitted until 2026-03-13 09:30:39.421384578 +0000 UTC m=+1122.151284769 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-etc-swift") pod "swift-storage-0" (UID: "64f62ec7-7a91-458c-86cb-7658544e4a51") : configmap "swift-ring-files" not found Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.479655 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.554746 4841 generic.go:334] "Generic (PLEG): container finished" podID="900f1d10-4a56-461f-81cc-caea5f1b88c8" containerID="0cfaa38915f28a779a65cadb53ed4cc3bfa3be02817bb08c8e2a4adf625aee83" exitCode=0 Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.554802 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2g4hc" event={"ID":"900f1d10-4a56-461f-81cc-caea5f1b88c8","Type":"ContainerDied","Data":"0cfaa38915f28a779a65cadb53ed4cc3bfa3be02817bb08c8e2a4adf625aee83"} Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.557745 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" event={"ID":"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e","Type":"ContainerDied","Data":"058b7c2a8c0279f31646915c15960eb156c406756587a3804e02a1cd6b14cf6e"} Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.557788 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-64ds6" Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.557810 4841 scope.go:117] "RemoveContainer" containerID="4ef21be765542894d9eff33a97013abfb5187e51c4ec40d704394d8e26c1ee3c" Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.624361 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-884v7\" (UniqueName: \"kubernetes.io/projected/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-kube-api-access-884v7\") pod \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\" (UID: \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\") " Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.625136 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-ovsdbserver-sb\") pod \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\" (UID: \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\") " Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.625431 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-ovsdbserver-nb\") pod \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\" (UID: \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\") " Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.625505 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-dns-svc\") pod \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\" (UID: \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\") " Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.625568 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-config\") pod \"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\" (UID: 
\"ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e\") " Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.629781 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-kube-api-access-884v7" (OuterVolumeSpecName: "kube-api-access-884v7") pod "ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e" (UID: "ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e"). InnerVolumeSpecName "kube-api-access-884v7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.665323 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e" (UID: "ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.671047 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e" (UID: "ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.673071 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e" (UID: "ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.687150 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-config" (OuterVolumeSpecName: "config") pod "ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e" (UID: "ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.727884 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.727919 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.727932 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-884v7\" (UniqueName: \"kubernetes.io/projected/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-kube-api-access-884v7\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.727945 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.727958 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:38 crc kubenswrapper[4841]: I0313 09:30:38.912942 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-64ds6"] Mar 13 09:30:38 crc 
kubenswrapper[4841]: I0313 09:30:38.920753 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-64ds6"] Mar 13 09:30:39 crc kubenswrapper[4841]: I0313 09:30:39.439142 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-etc-swift\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:30:39 crc kubenswrapper[4841]: E0313 09:30:39.439444 4841 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 09:30:39 crc kubenswrapper[4841]: E0313 09:30:39.439483 4841 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 09:30:39 crc kubenswrapper[4841]: E0313 09:30:39.439563 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-etc-swift podName:64f62ec7-7a91-458c-86cb-7658544e4a51 nodeName:}" failed. No retries permitted until 2026-03-13 09:30:41.439530289 +0000 UTC m=+1124.169430500 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-etc-swift") pod "swift-storage-0" (UID: "64f62ec7-7a91-458c-86cb-7658544e4a51") : configmap "swift-ring-files" not found Mar 13 09:30:39 crc kubenswrapper[4841]: I0313 09:30:39.567505 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2g4hc" event={"ID":"900f1d10-4a56-461f-81cc-caea5f1b88c8","Type":"ContainerStarted","Data":"cee69fe8c28d2b81682e9ba0d48495a9116e0d5a3d1c6c11623b5e353e338ddb"} Mar 13 09:30:39 crc kubenswrapper[4841]: I0313 09:30:39.567579 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-2g4hc" Mar 13 09:30:39 crc kubenswrapper[4841]: I0313 09:30:39.570887 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3","Type":"ContainerStarted","Data":"06b7fd8b825b6e08fcaee0e66e75002f7c2342192df381d7c7b3da34f30327ea"} Mar 13 09:30:39 crc kubenswrapper[4841]: I0313 09:30:39.570926 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"32dcacb9-78d3-4dd5-95e4-6d069bddc9e3","Type":"ContainerStarted","Data":"6ccfbff0e63ff1a7dd5ef25907aea16679ffdd82bcb5059aee44b403cb4be0ca"} Mar 13 09:30:39 crc kubenswrapper[4841]: I0313 09:30:39.571187 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 13 09:30:39 crc kubenswrapper[4841]: I0313 09:30:39.586548 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-2g4hc" podStartSLOduration=3.586532403 podStartE2EDuration="3.586532403s" podCreationTimestamp="2026-03-13 09:30:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:30:39.584937904 +0000 UTC m=+1122.314838105" 
watchObservedRunningTime="2026-03-13 09:30:39.586532403 +0000 UTC m=+1122.316432594" Mar 13 09:30:39 crc kubenswrapper[4841]: I0313 09:30:39.607396 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.859676178 podStartE2EDuration="4.607373609s" podCreationTimestamp="2026-03-13 09:30:35 +0000 UTC" firstStartedPulling="2026-03-13 09:30:36.632375925 +0000 UTC m=+1119.362276116" lastFinishedPulling="2026-03-13 09:30:38.380073356 +0000 UTC m=+1121.109973547" observedRunningTime="2026-03-13 09:30:39.602940654 +0000 UTC m=+1122.332840865" watchObservedRunningTime="2026-03-13 09:30:39.607373609 +0000 UTC m=+1122.337273810" Mar 13 09:30:40 crc kubenswrapper[4841]: I0313 09:30:40.004873 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e" path="/var/lib/kubelet/pods/ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e/volumes" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.474483 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-etc-swift\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:30:41 crc kubenswrapper[4841]: E0313 09:30:41.474692 4841 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 09:30:41 crc kubenswrapper[4841]: E0313 09:30:41.475587 4841 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 09:30:41 crc kubenswrapper[4841]: E0313 09:30:41.475657 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-etc-swift podName:64f62ec7-7a91-458c-86cb-7658544e4a51 nodeName:}" failed. 
No retries permitted until 2026-03-13 09:30:45.475637676 +0000 UTC m=+1128.205537867 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-etc-swift") pod "swift-storage-0" (UID: "64f62ec7-7a91-458c-86cb-7658544e4a51") : configmap "swift-ring-files" not found Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.625521 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.692313 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-sqrfd"] Mar 13 09:30:41 crc kubenswrapper[4841]: E0313 09:30:41.695178 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e" containerName="init" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.695208 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e" containerName="init" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.695408 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae9cfa21-7eab-4c3a-a1ed-a0961f50b96e" containerName="init" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.696064 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.698671 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.699024 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.701786 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.708055 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-sqrfd"] Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.775916 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.780525 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f66d8c2c-71a2-4927-a708-4b1412d0243c-swiftconf\") pod \"swift-ring-rebalance-sqrfd\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.780605 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f66d8c2c-71a2-4927-a708-4b1412d0243c-ring-data-devices\") pod \"swift-ring-rebalance-sqrfd\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.780681 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl6zh\" (UniqueName: 
\"kubernetes.io/projected/f66d8c2c-71a2-4927-a708-4b1412d0243c-kube-api-access-pl6zh\") pod \"swift-ring-rebalance-sqrfd\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.780717 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f66d8c2c-71a2-4927-a708-4b1412d0243c-scripts\") pod \"swift-ring-rebalance-sqrfd\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.780749 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f66d8c2c-71a2-4927-a708-4b1412d0243c-combined-ca-bundle\") pod \"swift-ring-rebalance-sqrfd\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.780795 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f66d8c2c-71a2-4927-a708-4b1412d0243c-etc-swift\") pod \"swift-ring-rebalance-sqrfd\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.780818 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f66d8c2c-71a2-4927-a708-4b1412d0243c-dispersionconf\") pod \"swift-ring-rebalance-sqrfd\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.888285 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl6zh\" (UniqueName: 
\"kubernetes.io/projected/f66d8c2c-71a2-4927-a708-4b1412d0243c-kube-api-access-pl6zh\") pod \"swift-ring-rebalance-sqrfd\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.888361 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f66d8c2c-71a2-4927-a708-4b1412d0243c-scripts\") pod \"swift-ring-rebalance-sqrfd\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.888395 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f66d8c2c-71a2-4927-a708-4b1412d0243c-combined-ca-bundle\") pod \"swift-ring-rebalance-sqrfd\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.888447 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f66d8c2c-71a2-4927-a708-4b1412d0243c-etc-swift\") pod \"swift-ring-rebalance-sqrfd\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.888480 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f66d8c2c-71a2-4927-a708-4b1412d0243c-dispersionconf\") pod \"swift-ring-rebalance-sqrfd\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.888523 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f66d8c2c-71a2-4927-a708-4b1412d0243c-swiftconf\") pod 
\"swift-ring-rebalance-sqrfd\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.888575 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f66d8c2c-71a2-4927-a708-4b1412d0243c-ring-data-devices\") pod \"swift-ring-rebalance-sqrfd\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.889227 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f66d8c2c-71a2-4927-a708-4b1412d0243c-scripts\") pod \"swift-ring-rebalance-sqrfd\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.889244 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f66d8c2c-71a2-4927-a708-4b1412d0243c-ring-data-devices\") pod \"swift-ring-rebalance-sqrfd\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.889477 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f66d8c2c-71a2-4927-a708-4b1412d0243c-etc-swift\") pod \"swift-ring-rebalance-sqrfd\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.893975 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f66d8c2c-71a2-4927-a708-4b1412d0243c-swiftconf\") pod \"swift-ring-rebalance-sqrfd\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 
09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.906013 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl6zh\" (UniqueName: \"kubernetes.io/projected/f66d8c2c-71a2-4927-a708-4b1412d0243c-kube-api-access-pl6zh\") pod \"swift-ring-rebalance-sqrfd\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.908081 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f66d8c2c-71a2-4927-a708-4b1412d0243c-combined-ca-bundle\") pod \"swift-ring-rebalance-sqrfd\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:41 crc kubenswrapper[4841]: I0313 09:30:41.912516 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f66d8c2c-71a2-4927-a708-4b1412d0243c-dispersionconf\") pod \"swift-ring-rebalance-sqrfd\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:42 crc kubenswrapper[4841]: I0313 09:30:42.015439 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:42 crc kubenswrapper[4841]: I0313 09:30:42.072459 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 13 09:30:42 crc kubenswrapper[4841]: I0313 09:30:42.072748 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 13 09:30:42 crc kubenswrapper[4841]: I0313 09:30:42.180392 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 13 09:30:42 crc kubenswrapper[4841]: I0313 09:30:42.212253 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-p7rfs"] Mar 13 09:30:42 crc kubenswrapper[4841]: I0313 09:30:42.213696 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-p7rfs" Mar 13 09:30:42 crc kubenswrapper[4841]: I0313 09:30:42.217860 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 13 09:30:42 crc kubenswrapper[4841]: I0313 09:30:42.236669 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-p7rfs"] Mar 13 09:30:42 crc kubenswrapper[4841]: I0313 09:30:42.295419 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k967\" (UniqueName: \"kubernetes.io/projected/127599cf-60ed-4009-a303-d1071188fec4-kube-api-access-9k967\") pod \"root-account-create-update-p7rfs\" (UID: \"127599cf-60ed-4009-a303-d1071188fec4\") " pod="openstack/root-account-create-update-p7rfs" Mar 13 09:30:42 crc kubenswrapper[4841]: I0313 09:30:42.295462 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/127599cf-60ed-4009-a303-d1071188fec4-operator-scripts\") pod 
\"root-account-create-update-p7rfs\" (UID: \"127599cf-60ed-4009-a303-d1071188fec4\") " pod="openstack/root-account-create-update-p7rfs" Mar 13 09:30:42 crc kubenswrapper[4841]: I0313 09:30:42.397448 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k967\" (UniqueName: \"kubernetes.io/projected/127599cf-60ed-4009-a303-d1071188fec4-kube-api-access-9k967\") pod \"root-account-create-update-p7rfs\" (UID: \"127599cf-60ed-4009-a303-d1071188fec4\") " pod="openstack/root-account-create-update-p7rfs" Mar 13 09:30:42 crc kubenswrapper[4841]: I0313 09:30:42.397507 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/127599cf-60ed-4009-a303-d1071188fec4-operator-scripts\") pod \"root-account-create-update-p7rfs\" (UID: \"127599cf-60ed-4009-a303-d1071188fec4\") " pod="openstack/root-account-create-update-p7rfs" Mar 13 09:30:42 crc kubenswrapper[4841]: I0313 09:30:42.398357 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/127599cf-60ed-4009-a303-d1071188fec4-operator-scripts\") pod \"root-account-create-update-p7rfs\" (UID: \"127599cf-60ed-4009-a303-d1071188fec4\") " pod="openstack/root-account-create-update-p7rfs" Mar 13 09:30:42 crc kubenswrapper[4841]: I0313 09:30:42.419442 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k967\" (UniqueName: \"kubernetes.io/projected/127599cf-60ed-4009-a303-d1071188fec4-kube-api-access-9k967\") pod \"root-account-create-update-p7rfs\" (UID: \"127599cf-60ed-4009-a303-d1071188fec4\") " pod="openstack/root-account-create-update-p7rfs" Mar 13 09:30:42 crc kubenswrapper[4841]: W0313 09:30:42.487763 4841 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf66d8c2c_71a2_4927_a708_4b1412d0243c.slice/crio-0426ee793b4d9c6ff6c8d7390e3559713c9bd3cadcf72fb0446951bb98b6e7aa WatchSource:0}: Error finding container 0426ee793b4d9c6ff6c8d7390e3559713c9bd3cadcf72fb0446951bb98b6e7aa: Status 404 returned error can't find the container with id 0426ee793b4d9c6ff6c8d7390e3559713c9bd3cadcf72fb0446951bb98b6e7aa Mar 13 09:30:42 crc kubenswrapper[4841]: I0313 09:30:42.491012 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-sqrfd"] Mar 13 09:30:42 crc kubenswrapper[4841]: I0313 09:30:42.537795 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-p7rfs" Mar 13 09:30:42 crc kubenswrapper[4841]: I0313 09:30:42.596182 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-sqrfd" event={"ID":"f66d8c2c-71a2-4927-a708-4b1412d0243c","Type":"ContainerStarted","Data":"0426ee793b4d9c6ff6c8d7390e3559713c9bd3cadcf72fb0446951bb98b6e7aa"} Mar 13 09:30:42 crc kubenswrapper[4841]: I0313 09:30:42.696355 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 13 09:30:42 crc kubenswrapper[4841]: I0313 09:30:42.971746 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-p7rfs"] Mar 13 09:30:42 crc kubenswrapper[4841]: W0313 09:30:42.978507 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod127599cf_60ed_4009_a303_d1071188fec4.slice/crio-dbbfee5514528139bd483e2235ff7bac31abb5f4ab04aed30770e6528c1fa716 WatchSource:0}: Error finding container dbbfee5514528139bd483e2235ff7bac31abb5f4ab04aed30770e6528c1fa716: Status 404 returned error can't find the container with id dbbfee5514528139bd483e2235ff7bac31abb5f4ab04aed30770e6528c1fa716 Mar 13 09:30:43 crc kubenswrapper[4841]: 
I0313 09:30:43.607009 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-p7rfs" event={"ID":"127599cf-60ed-4009-a303-d1071188fec4","Type":"ContainerDied","Data":"1d18fee47ef8a12f0ed4e312e138d69c819036bfe89803414f2e9ebb14fb7861"} Mar 13 09:30:43 crc kubenswrapper[4841]: I0313 09:30:43.607743 4841 generic.go:334] "Generic (PLEG): container finished" podID="127599cf-60ed-4009-a303-d1071188fec4" containerID="1d18fee47ef8a12f0ed4e312e138d69c819036bfe89803414f2e9ebb14fb7861" exitCode=0 Mar 13 09:30:43 crc kubenswrapper[4841]: I0313 09:30:43.607901 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-p7rfs" event={"ID":"127599cf-60ed-4009-a303-d1071188fec4","Type":"ContainerStarted","Data":"dbbfee5514528139bd483e2235ff7bac31abb5f4ab04aed30770e6528c1fa716"} Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.278210 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-ptsql"] Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.279888 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-ptsql" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.286341 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ptsql"] Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.348366 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bc49445-1a38-4ac6-91c8-822d298116a3-operator-scripts\") pod \"keystone-db-create-ptsql\" (UID: \"7bc49445-1a38-4ac6-91c8-822d298116a3\") " pod="openstack/keystone-db-create-ptsql" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.348830 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfjzg\" (UniqueName: \"kubernetes.io/projected/7bc49445-1a38-4ac6-91c8-822d298116a3-kube-api-access-rfjzg\") pod \"keystone-db-create-ptsql\" (UID: \"7bc49445-1a38-4ac6-91c8-822d298116a3\") " pod="openstack/keystone-db-create-ptsql" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.388635 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-9d81-account-create-update-kddpx"] Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.390072 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9d81-account-create-update-kddpx" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.393437 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.397085 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9d81-account-create-update-kddpx"] Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.450260 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfjzg\" (UniqueName: \"kubernetes.io/projected/7bc49445-1a38-4ac6-91c8-822d298116a3-kube-api-access-rfjzg\") pod \"keystone-db-create-ptsql\" (UID: \"7bc49445-1a38-4ac6-91c8-822d298116a3\") " pod="openstack/keystone-db-create-ptsql" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.450381 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bc49445-1a38-4ac6-91c8-822d298116a3-operator-scripts\") pod \"keystone-db-create-ptsql\" (UID: \"7bc49445-1a38-4ac6-91c8-822d298116a3\") " pod="openstack/keystone-db-create-ptsql" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.450493 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-959k2\" (UniqueName: \"kubernetes.io/projected/bc60d2f3-4d25-4e39-b7eb-70fb4e750774-kube-api-access-959k2\") pod \"keystone-9d81-account-create-update-kddpx\" (UID: \"bc60d2f3-4d25-4e39-b7eb-70fb4e750774\") " pod="openstack/keystone-9d81-account-create-update-kddpx" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.450726 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc60d2f3-4d25-4e39-b7eb-70fb4e750774-operator-scripts\") pod \"keystone-9d81-account-create-update-kddpx\" (UID: 
\"bc60d2f3-4d25-4e39-b7eb-70fb4e750774\") " pod="openstack/keystone-9d81-account-create-update-kddpx" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.451584 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bc49445-1a38-4ac6-91c8-822d298116a3-operator-scripts\") pod \"keystone-db-create-ptsql\" (UID: \"7bc49445-1a38-4ac6-91c8-822d298116a3\") " pod="openstack/keystone-db-create-ptsql" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.475132 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfjzg\" (UniqueName: \"kubernetes.io/projected/7bc49445-1a38-4ac6-91c8-822d298116a3-kube-api-access-rfjzg\") pod \"keystone-db-create-ptsql\" (UID: \"7bc49445-1a38-4ac6-91c8-822d298116a3\") " pod="openstack/keystone-db-create-ptsql" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.484829 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-x4tz7"] Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.485779 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-x4tz7" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.496136 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-x4tz7"] Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.552623 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-959k2\" (UniqueName: \"kubernetes.io/projected/bc60d2f3-4d25-4e39-b7eb-70fb4e750774-kube-api-access-959k2\") pod \"keystone-9d81-account-create-update-kddpx\" (UID: \"bc60d2f3-4d25-4e39-b7eb-70fb4e750774\") " pod="openstack/keystone-9d81-account-create-update-kddpx" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.552694 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-etc-swift\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.552740 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxdzv\" (UniqueName: \"kubernetes.io/projected/3e1eddcd-d966-4a33-9b1e-639b3f5ebca7-kube-api-access-pxdzv\") pod \"placement-db-create-x4tz7\" (UID: \"3e1eddcd-d966-4a33-9b1e-639b3f5ebca7\") " pod="openstack/placement-db-create-x4tz7" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.552774 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc60d2f3-4d25-4e39-b7eb-70fb4e750774-operator-scripts\") pod \"keystone-9d81-account-create-update-kddpx\" (UID: \"bc60d2f3-4d25-4e39-b7eb-70fb4e750774\") " pod="openstack/keystone-9d81-account-create-update-kddpx" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.552798 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e1eddcd-d966-4a33-9b1e-639b3f5ebca7-operator-scripts\") pod \"placement-db-create-x4tz7\" (UID: \"3e1eddcd-d966-4a33-9b1e-639b3f5ebca7\") " pod="openstack/placement-db-create-x4tz7" Mar 13 09:30:45 crc kubenswrapper[4841]: E0313 09:30:45.552949 4841 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 09:30:45 crc kubenswrapper[4841]: E0313 09:30:45.552965 4841 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 09:30:45 crc kubenswrapper[4841]: E0313 09:30:45.553005 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-etc-swift podName:64f62ec7-7a91-458c-86cb-7658544e4a51 nodeName:}" failed. No retries permitted until 2026-03-13 09:30:53.552990026 +0000 UTC m=+1136.282890227 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-etc-swift") pod "swift-storage-0" (UID: "64f62ec7-7a91-458c-86cb-7658544e4a51") : configmap "swift-ring-files" not found Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.553819 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc60d2f3-4d25-4e39-b7eb-70fb4e750774-operator-scripts\") pod \"keystone-9d81-account-create-update-kddpx\" (UID: \"bc60d2f3-4d25-4e39-b7eb-70fb4e750774\") " pod="openstack/keystone-9d81-account-create-update-kddpx" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.571157 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-959k2\" (UniqueName: \"kubernetes.io/projected/bc60d2f3-4d25-4e39-b7eb-70fb4e750774-kube-api-access-959k2\") pod \"keystone-9d81-account-create-update-kddpx\" (UID: \"bc60d2f3-4d25-4e39-b7eb-70fb4e750774\") " pod="openstack/keystone-9d81-account-create-update-kddpx" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.607134 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3db9-account-create-update-8psnc"] Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.608338 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3db9-account-create-update-8psnc" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.609283 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-ptsql" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.612732 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.624728 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3db9-account-create-update-8psnc"] Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.654130 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxdzv\" (UniqueName: \"kubernetes.io/projected/3e1eddcd-d966-4a33-9b1e-639b3f5ebca7-kube-api-access-pxdzv\") pod \"placement-db-create-x4tz7\" (UID: \"3e1eddcd-d966-4a33-9b1e-639b3f5ebca7\") " pod="openstack/placement-db-create-x4tz7" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.654195 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e1eddcd-d966-4a33-9b1e-639b3f5ebca7-operator-scripts\") pod \"placement-db-create-x4tz7\" (UID: \"3e1eddcd-d966-4a33-9b1e-639b3f5ebca7\") " pod="openstack/placement-db-create-x4tz7" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.654335 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjq9m\" (UniqueName: \"kubernetes.io/projected/f59d95da-19a9-42b3-b2f7-ca255806fabb-kube-api-access-fjq9m\") pod \"placement-3db9-account-create-update-8psnc\" (UID: \"f59d95da-19a9-42b3-b2f7-ca255806fabb\") " pod="openstack/placement-3db9-account-create-update-8psnc" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.654463 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f59d95da-19a9-42b3-b2f7-ca255806fabb-operator-scripts\") pod \"placement-3db9-account-create-update-8psnc\" (UID: 
\"f59d95da-19a9-42b3-b2f7-ca255806fabb\") " pod="openstack/placement-3db9-account-create-update-8psnc" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.655572 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e1eddcd-d966-4a33-9b1e-639b3f5ebca7-operator-scripts\") pod \"placement-db-create-x4tz7\" (UID: \"3e1eddcd-d966-4a33-9b1e-639b3f5ebca7\") " pod="openstack/placement-db-create-x4tz7" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.678933 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxdzv\" (UniqueName: \"kubernetes.io/projected/3e1eddcd-d966-4a33-9b1e-639b3f5ebca7-kube-api-access-pxdzv\") pod \"placement-db-create-x4tz7\" (UID: \"3e1eddcd-d966-4a33-9b1e-639b3f5ebca7\") " pod="openstack/placement-db-create-x4tz7" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.703526 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9d81-account-create-update-kddpx" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.757335 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjq9m\" (UniqueName: \"kubernetes.io/projected/f59d95da-19a9-42b3-b2f7-ca255806fabb-kube-api-access-fjq9m\") pod \"placement-3db9-account-create-update-8psnc\" (UID: \"f59d95da-19a9-42b3-b2f7-ca255806fabb\") " pod="openstack/placement-3db9-account-create-update-8psnc" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.757440 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f59d95da-19a9-42b3-b2f7-ca255806fabb-operator-scripts\") pod \"placement-3db9-account-create-update-8psnc\" (UID: \"f59d95da-19a9-42b3-b2f7-ca255806fabb\") " pod="openstack/placement-3db9-account-create-update-8psnc" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.758260 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f59d95da-19a9-42b3-b2f7-ca255806fabb-operator-scripts\") pod \"placement-3db9-account-create-update-8psnc\" (UID: \"f59d95da-19a9-42b3-b2f7-ca255806fabb\") " pod="openstack/placement-3db9-account-create-update-8psnc" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.773564 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjq9m\" (UniqueName: \"kubernetes.io/projected/f59d95da-19a9-42b3-b2f7-ca255806fabb-kube-api-access-fjq9m\") pod \"placement-3db9-account-create-update-8psnc\" (UID: \"f59d95da-19a9-42b3-b2f7-ca255806fabb\") " pod="openstack/placement-3db9-account-create-update-8psnc" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.820448 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-x4tz7" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.904790 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-p7rfs" Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.964703 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3db9-account-create-update-8psnc"
Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.964930 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k967\" (UniqueName: \"kubernetes.io/projected/127599cf-60ed-4009-a303-d1071188fec4-kube-api-access-9k967\") pod \"127599cf-60ed-4009-a303-d1071188fec4\" (UID: \"127599cf-60ed-4009-a303-d1071188fec4\") "
Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.965036 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/127599cf-60ed-4009-a303-d1071188fec4-operator-scripts\") pod \"127599cf-60ed-4009-a303-d1071188fec4\" (UID: \"127599cf-60ed-4009-a303-d1071188fec4\") "
Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.965452 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/127599cf-60ed-4009-a303-d1071188fec4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "127599cf-60ed-4009-a303-d1071188fec4" (UID: "127599cf-60ed-4009-a303-d1071188fec4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.965597 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/127599cf-60ed-4009-a303-d1071188fec4-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 09:30:45 crc kubenswrapper[4841]: I0313 09:30:45.968035 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/127599cf-60ed-4009-a303-d1071188fec4-kube-api-access-9k967" (OuterVolumeSpecName: "kube-api-access-9k967") pod "127599cf-60ed-4009-a303-d1071188fec4" (UID: "127599cf-60ed-4009-a303-d1071188fec4"). InnerVolumeSpecName "kube-api-access-9k967". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 09:30:46 crc kubenswrapper[4841]: I0313 09:30:46.069954 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k967\" (UniqueName: \"kubernetes.io/projected/127599cf-60ed-4009-a303-d1071188fec4-kube-api-access-9k967\") on node \"crc\" DevicePath \"\""
Mar 13 09:30:46 crc kubenswrapper[4841]: I0313 09:30:46.449224 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9d81-account-create-update-kddpx"]
Mar 13 09:30:46 crc kubenswrapper[4841]: I0313 09:30:46.469126 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-x4tz7"]
Mar 13 09:30:46 crc kubenswrapper[4841]: W0313 09:30:46.473574 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e1eddcd_d966_4a33_9b1e_639b3f5ebca7.slice/crio-a4f09307184536a88e6730525afc1d715c51a814768d730e648ff48ac97f0245 WatchSource:0}: Error finding container a4f09307184536a88e6730525afc1d715c51a814768d730e648ff48ac97f0245: Status 404 returned error can't find the container with id a4f09307184536a88e6730525afc1d715c51a814768d730e648ff48ac97f0245
Mar 13 09:30:46 crc kubenswrapper[4841]: I0313 09:30:46.570221 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ptsql"]
Mar 13 09:30:46 crc kubenswrapper[4841]: W0313 09:30:46.584737 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bc49445_1a38_4ac6_91c8_822d298116a3.slice/crio-4b306760c800bfeaf30a6a487557c58a4d2ea206ea673f10c152ec024cfcf13f WatchSource:0}: Error finding container 4b306760c800bfeaf30a6a487557c58a4d2ea206ea673f10c152ec024cfcf13f: Status 404 returned error can't find the container with id 4b306760c800bfeaf30a6a487557c58a4d2ea206ea673f10c152ec024cfcf13f
Mar 13 09:30:46 crc kubenswrapper[4841]: I0313 09:30:46.612483 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3db9-account-create-update-8psnc"]
Mar 13 09:30:46 crc kubenswrapper[4841]: W0313 09:30:46.624475 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf59d95da_19a9_42b3_b2f7_ca255806fabb.slice/crio-ab071d2d0c5bacb6016dd13a9c17286a92746ebedf6a633f8f2be0cf210ba64c WatchSource:0}: Error finding container ab071d2d0c5bacb6016dd13a9c17286a92746ebedf6a633f8f2be0cf210ba64c: Status 404 returned error can't find the container with id ab071d2d0c5bacb6016dd13a9c17286a92746ebedf6a633f8f2be0cf210ba64c
Mar 13 09:30:46 crc kubenswrapper[4841]: I0313 09:30:46.645763 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-x4tz7" event={"ID":"3e1eddcd-d966-4a33-9b1e-639b3f5ebca7","Type":"ContainerStarted","Data":"a4f09307184536a88e6730525afc1d715c51a814768d730e648ff48ac97f0245"}
Mar 13 09:30:46 crc kubenswrapper[4841]: I0313 09:30:46.648531 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-p7rfs" event={"ID":"127599cf-60ed-4009-a303-d1071188fec4","Type":"ContainerDied","Data":"dbbfee5514528139bd483e2235ff7bac31abb5f4ab04aed30770e6528c1fa716"}
Mar 13 09:30:46 crc kubenswrapper[4841]: I0313 09:30:46.648575 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbbfee5514528139bd483e2235ff7bac31abb5f4ab04aed30770e6528c1fa716"
Mar 13 09:30:46 crc kubenswrapper[4841]: I0313 09:30:46.648648 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-p7rfs"
Mar 13 09:30:46 crc kubenswrapper[4841]: I0313 09:30:46.658619 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9d81-account-create-update-kddpx" event={"ID":"bc60d2f3-4d25-4e39-b7eb-70fb4e750774","Type":"ContainerStarted","Data":"04e740b6bbe274ea9f35e08a86dabb718b79549d44f609b8996b1280afa21cbc"}
Mar 13 09:30:46 crc kubenswrapper[4841]: I0313 09:30:46.662943 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-sqrfd" event={"ID":"f66d8c2c-71a2-4927-a708-4b1412d0243c","Type":"ContainerStarted","Data":"33df1d339a29a75cba5fe181ef4c7c0ce8f0f77964c2ebdf65e3c46e8853ea14"}
Mar 13 09:30:46 crc kubenswrapper[4841]: I0313 09:30:46.664133 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3db9-account-create-update-8psnc" event={"ID":"f59d95da-19a9-42b3-b2f7-ca255806fabb","Type":"ContainerStarted","Data":"ab071d2d0c5bacb6016dd13a9c17286a92746ebedf6a633f8f2be0cf210ba64c"}
Mar 13 09:30:46 crc kubenswrapper[4841]: I0313 09:30:46.666376 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ptsql" event={"ID":"7bc49445-1a38-4ac6-91c8-822d298116a3","Type":"ContainerStarted","Data":"4b306760c800bfeaf30a6a487557c58a4d2ea206ea673f10c152ec024cfcf13f"}
Mar 13 09:30:46 crc kubenswrapper[4841]: I0313 09:30:46.698929 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-sqrfd" podStartSLOduration=2.14394863 podStartE2EDuration="5.698910002s" podCreationTimestamp="2026-03-13 09:30:41 +0000 UTC" firstStartedPulling="2026-03-13 09:30:42.489701822 +0000 UTC m=+1125.219602013" lastFinishedPulling="2026-03-13 09:30:46.044663194 +0000 UTC m=+1128.774563385" observedRunningTime="2026-03-13 09:30:46.688492214 +0000 UTC m=+1129.418392405" watchObservedRunningTime="2026-03-13 09:30:46.698910002 +0000 UTC m=+1129.428810193"
Mar 13 09:30:46 crc kubenswrapper[4841]: I0313 09:30:46.983479 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-2g4hc"
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.059561 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-97qpq"]
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.059836 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-97qpq" podUID="c88febfd-c495-4b2e-a4fb-fca8f447ef9c" containerName="dnsmasq-dns" containerID="cri-o://324964d9fa4a4b07c58725cbb2e6c1ded9ffd6fff9771209e6d7d078c1ab118e" gracePeriod=10
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.663974 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-97qpq"
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.677192 4841 generic.go:334] "Generic (PLEG): container finished" podID="f59d95da-19a9-42b3-b2f7-ca255806fabb" containerID="36f959254d7c4fccd4f4c8a169a57252925769952d5640801c2617d5d961f03a" exitCode=0
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.677276 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3db9-account-create-update-8psnc" event={"ID":"f59d95da-19a9-42b3-b2f7-ca255806fabb","Type":"ContainerDied","Data":"36f959254d7c4fccd4f4c8a169a57252925769952d5640801c2617d5d961f03a"}
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.679218 4841 generic.go:334] "Generic (PLEG): container finished" podID="7bc49445-1a38-4ac6-91c8-822d298116a3" containerID="0c7ab363a1a3ee30521c07bd56f5e8a1ed231772685b2c16cf0ba4d2c4e08f46" exitCode=0
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.679303 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ptsql" event={"ID":"7bc49445-1a38-4ac6-91c8-822d298116a3","Type":"ContainerDied","Data":"0c7ab363a1a3ee30521c07bd56f5e8a1ed231772685b2c16cf0ba4d2c4e08f46"}
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.683934 4841 generic.go:334] "Generic (PLEG): container finished" podID="3e1eddcd-d966-4a33-9b1e-639b3f5ebca7" containerID="12a87739800af3c8c6623ffe0a1a97e415d15d40e1938f65fa3b03c2da0012ea" exitCode=0
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.684006 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-x4tz7" event={"ID":"3e1eddcd-d966-4a33-9b1e-639b3f5ebca7","Type":"ContainerDied","Data":"12a87739800af3c8c6623ffe0a1a97e415d15d40e1938f65fa3b03c2da0012ea"}
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.691079 4841 generic.go:334] "Generic (PLEG): container finished" podID="bc60d2f3-4d25-4e39-b7eb-70fb4e750774" containerID="0d995a6d118ccc6959f81ebfa6bca3ac21b379ee754eab4360c291f96d1c8f9c" exitCode=0
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.691179 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9d81-account-create-update-kddpx" event={"ID":"bc60d2f3-4d25-4e39-b7eb-70fb4e750774","Type":"ContainerDied","Data":"0d995a6d118ccc6959f81ebfa6bca3ac21b379ee754eab4360c291f96d1c8f9c"}
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.694780 4841 generic.go:334] "Generic (PLEG): container finished" podID="c88febfd-c495-4b2e-a4fb-fca8f447ef9c" containerID="324964d9fa4a4b07c58725cbb2e6c1ded9ffd6fff9771209e6d7d078c1ab118e" exitCode=0
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.695590 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-97qpq"
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.695734 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-97qpq" event={"ID":"c88febfd-c495-4b2e-a4fb-fca8f447ef9c","Type":"ContainerDied","Data":"324964d9fa4a4b07c58725cbb2e6c1ded9ffd6fff9771209e6d7d078c1ab118e"}
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.695760 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-97qpq" event={"ID":"c88febfd-c495-4b2e-a4fb-fca8f447ef9c","Type":"ContainerDied","Data":"dfc43adb517d40b32e8b29b87fa34e312b580eb12648859f929e4b27de778665"}
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.695780 4841 scope.go:117] "RemoveContainer" containerID="324964d9fa4a4b07c58725cbb2e6c1ded9ffd6fff9771209e6d7d078c1ab118e"
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.730902 4841 scope.go:117] "RemoveContainer" containerID="f84151706634bc1d83dcfde710c5f1d8e05733d891fcd2d9b9059fc582b1eca0"
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.773876 4841 scope.go:117] "RemoveContainer" containerID="324964d9fa4a4b07c58725cbb2e6c1ded9ffd6fff9771209e6d7d078c1ab118e"
Mar 13 09:30:47 crc kubenswrapper[4841]: E0313 09:30:47.774338 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"324964d9fa4a4b07c58725cbb2e6c1ded9ffd6fff9771209e6d7d078c1ab118e\": container with ID starting with 324964d9fa4a4b07c58725cbb2e6c1ded9ffd6fff9771209e6d7d078c1ab118e not found: ID does not exist" containerID="324964d9fa4a4b07c58725cbb2e6c1ded9ffd6fff9771209e6d7d078c1ab118e"
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.774385 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"324964d9fa4a4b07c58725cbb2e6c1ded9ffd6fff9771209e6d7d078c1ab118e"} err="failed to get container status \"324964d9fa4a4b07c58725cbb2e6c1ded9ffd6fff9771209e6d7d078c1ab118e\": rpc error: code = NotFound desc = could not find container \"324964d9fa4a4b07c58725cbb2e6c1ded9ffd6fff9771209e6d7d078c1ab118e\": container with ID starting with 324964d9fa4a4b07c58725cbb2e6c1ded9ffd6fff9771209e6d7d078c1ab118e not found: ID does not exist"
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.774414 4841 scope.go:117] "RemoveContainer" containerID="f84151706634bc1d83dcfde710c5f1d8e05733d891fcd2d9b9059fc582b1eca0"
Mar 13 09:30:47 crc kubenswrapper[4841]: E0313 09:30:47.774797 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f84151706634bc1d83dcfde710c5f1d8e05733d891fcd2d9b9059fc582b1eca0\": container with ID starting with f84151706634bc1d83dcfde710c5f1d8e05733d891fcd2d9b9059fc582b1eca0 not found: ID does not exist" containerID="f84151706634bc1d83dcfde710c5f1d8e05733d891fcd2d9b9059fc582b1eca0"
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.774817 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f84151706634bc1d83dcfde710c5f1d8e05733d891fcd2d9b9059fc582b1eca0"} err="failed to get container status \"f84151706634bc1d83dcfde710c5f1d8e05733d891fcd2d9b9059fc582b1eca0\": rpc error: code = NotFound desc = could not find container \"f84151706634bc1d83dcfde710c5f1d8e05733d891fcd2d9b9059fc582b1eca0\": container with ID starting with f84151706634bc1d83dcfde710c5f1d8e05733d891fcd2d9b9059fc582b1eca0 not found: ID does not exist"
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.808894 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c88febfd-c495-4b2e-a4fb-fca8f447ef9c-config\") pod \"c88febfd-c495-4b2e-a4fb-fca8f447ef9c\" (UID: \"c88febfd-c495-4b2e-a4fb-fca8f447ef9c\") "
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.809940 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c88febfd-c495-4b2e-a4fb-fca8f447ef9c-dns-svc\") pod \"c88febfd-c495-4b2e-a4fb-fca8f447ef9c\" (UID: \"c88febfd-c495-4b2e-a4fb-fca8f447ef9c\") "
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.810000 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x9sx\" (UniqueName: \"kubernetes.io/projected/c88febfd-c495-4b2e-a4fb-fca8f447ef9c-kube-api-access-5x9sx\") pod \"c88febfd-c495-4b2e-a4fb-fca8f447ef9c\" (UID: \"c88febfd-c495-4b2e-a4fb-fca8f447ef9c\") "
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.814698 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c88febfd-c495-4b2e-a4fb-fca8f447ef9c-kube-api-access-5x9sx" (OuterVolumeSpecName: "kube-api-access-5x9sx") pod "c88febfd-c495-4b2e-a4fb-fca8f447ef9c" (UID: "c88febfd-c495-4b2e-a4fb-fca8f447ef9c"). InnerVolumeSpecName "kube-api-access-5x9sx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.845713 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c88febfd-c495-4b2e-a4fb-fca8f447ef9c-config" (OuterVolumeSpecName: "config") pod "c88febfd-c495-4b2e-a4fb-fca8f447ef9c" (UID: "c88febfd-c495-4b2e-a4fb-fca8f447ef9c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.854067 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c88febfd-c495-4b2e-a4fb-fca8f447ef9c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c88febfd-c495-4b2e-a4fb-fca8f447ef9c" (UID: "c88febfd-c495-4b2e-a4fb-fca8f447ef9c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.912496 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x9sx\" (UniqueName: \"kubernetes.io/projected/c88febfd-c495-4b2e-a4fb-fca8f447ef9c-kube-api-access-5x9sx\") on node \"crc\" DevicePath \"\""
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.912526 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c88febfd-c495-4b2e-a4fb-fca8f447ef9c-config\") on node \"crc\" DevicePath \"\""
Mar 13 09:30:47 crc kubenswrapper[4841]: I0313 09:30:47.912534 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c88febfd-c495-4b2e-a4fb-fca8f447ef9c-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 13 09:30:48 crc kubenswrapper[4841]: I0313 09:30:48.045246 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-97qpq"]
Mar 13 09:30:48 crc kubenswrapper[4841]: I0313 09:30:48.053154 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-97qpq"]
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.106029 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9d81-account-create-update-kddpx"
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.187828 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ptsql"
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.207871 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3db9-account-create-update-8psnc"
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.211389 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-x4tz7"
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.250105 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-959k2\" (UniqueName: \"kubernetes.io/projected/bc60d2f3-4d25-4e39-b7eb-70fb4e750774-kube-api-access-959k2\") pod \"bc60d2f3-4d25-4e39-b7eb-70fb4e750774\" (UID: \"bc60d2f3-4d25-4e39-b7eb-70fb4e750774\") "
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.250435 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfjzg\" (UniqueName: \"kubernetes.io/projected/7bc49445-1a38-4ac6-91c8-822d298116a3-kube-api-access-rfjzg\") pod \"7bc49445-1a38-4ac6-91c8-822d298116a3\" (UID: \"7bc49445-1a38-4ac6-91c8-822d298116a3\") "
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.250750 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc60d2f3-4d25-4e39-b7eb-70fb4e750774-operator-scripts\") pod \"bc60d2f3-4d25-4e39-b7eb-70fb4e750774\" (UID: \"bc60d2f3-4d25-4e39-b7eb-70fb4e750774\") "
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.250882 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bc49445-1a38-4ac6-91c8-822d298116a3-operator-scripts\") pod \"7bc49445-1a38-4ac6-91c8-822d298116a3\" (UID: \"7bc49445-1a38-4ac6-91c8-822d298116a3\") "
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.252730 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bc49445-1a38-4ac6-91c8-822d298116a3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7bc49445-1a38-4ac6-91c8-822d298116a3" (UID: "7bc49445-1a38-4ac6-91c8-822d298116a3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.255164 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc60d2f3-4d25-4e39-b7eb-70fb4e750774-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bc60d2f3-4d25-4e39-b7eb-70fb4e750774" (UID: "bc60d2f3-4d25-4e39-b7eb-70fb4e750774"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.260532 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc60d2f3-4d25-4e39-b7eb-70fb4e750774-kube-api-access-959k2" (OuterVolumeSpecName: "kube-api-access-959k2") pod "bc60d2f3-4d25-4e39-b7eb-70fb4e750774" (UID: "bc60d2f3-4d25-4e39-b7eb-70fb4e750774"). InnerVolumeSpecName "kube-api-access-959k2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.262344 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bc49445-1a38-4ac6-91c8-822d298116a3-kube-api-access-rfjzg" (OuterVolumeSpecName: "kube-api-access-rfjzg") pod "7bc49445-1a38-4ac6-91c8-822d298116a3" (UID: "7bc49445-1a38-4ac6-91c8-822d298116a3"). InnerVolumeSpecName "kube-api-access-rfjzg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.352965 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e1eddcd-d966-4a33-9b1e-639b3f5ebca7-operator-scripts\") pod \"3e1eddcd-d966-4a33-9b1e-639b3f5ebca7\" (UID: \"3e1eddcd-d966-4a33-9b1e-639b3f5ebca7\") "
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.353086 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f59d95da-19a9-42b3-b2f7-ca255806fabb-operator-scripts\") pod \"f59d95da-19a9-42b3-b2f7-ca255806fabb\" (UID: \"f59d95da-19a9-42b3-b2f7-ca255806fabb\") "
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.353113 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxdzv\" (UniqueName: \"kubernetes.io/projected/3e1eddcd-d966-4a33-9b1e-639b3f5ebca7-kube-api-access-pxdzv\") pod \"3e1eddcd-d966-4a33-9b1e-639b3f5ebca7\" (UID: \"3e1eddcd-d966-4a33-9b1e-639b3f5ebca7\") "
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.353208 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjq9m\" (UniqueName: \"kubernetes.io/projected/f59d95da-19a9-42b3-b2f7-ca255806fabb-kube-api-access-fjq9m\") pod \"f59d95da-19a9-42b3-b2f7-ca255806fabb\" (UID: \"f59d95da-19a9-42b3-b2f7-ca255806fabb\") "
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.353474 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e1eddcd-d966-4a33-9b1e-639b3f5ebca7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3e1eddcd-d966-4a33-9b1e-639b3f5ebca7" (UID: "3e1eddcd-d966-4a33-9b1e-639b3f5ebca7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.353896 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f59d95da-19a9-42b3-b2f7-ca255806fabb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f59d95da-19a9-42b3-b2f7-ca255806fabb" (UID: "f59d95da-19a9-42b3-b2f7-ca255806fabb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.354121 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f59d95da-19a9-42b3-b2f7-ca255806fabb-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.354173 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bc49445-1a38-4ac6-91c8-822d298116a3-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.354187 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-959k2\" (UniqueName: \"kubernetes.io/projected/bc60d2f3-4d25-4e39-b7eb-70fb4e750774-kube-api-access-959k2\") on node \"crc\" DevicePath \"\""
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.354202 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfjzg\" (UniqueName: \"kubernetes.io/projected/7bc49445-1a38-4ac6-91c8-822d298116a3-kube-api-access-rfjzg\") on node \"crc\" DevicePath \"\""
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.354213 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e1eddcd-d966-4a33-9b1e-639b3f5ebca7-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.354224 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc60d2f3-4d25-4e39-b7eb-70fb4e750774-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.356259 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e1eddcd-d966-4a33-9b1e-639b3f5ebca7-kube-api-access-pxdzv" (OuterVolumeSpecName: "kube-api-access-pxdzv") pod "3e1eddcd-d966-4a33-9b1e-639b3f5ebca7" (UID: "3e1eddcd-d966-4a33-9b1e-639b3f5ebca7"). InnerVolumeSpecName "kube-api-access-pxdzv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.362871 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f59d95da-19a9-42b3-b2f7-ca255806fabb-kube-api-access-fjq9m" (OuterVolumeSpecName: "kube-api-access-fjq9m") pod "f59d95da-19a9-42b3-b2f7-ca255806fabb" (UID: "f59d95da-19a9-42b3-b2f7-ca255806fabb"). InnerVolumeSpecName "kube-api-access-fjq9m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.455576 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxdzv\" (UniqueName: \"kubernetes.io/projected/3e1eddcd-d966-4a33-9b1e-639b3f5ebca7-kube-api-access-pxdzv\") on node \"crc\" DevicePath \"\""
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.455616 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjq9m\" (UniqueName: \"kubernetes.io/projected/f59d95da-19a9-42b3-b2f7-ca255806fabb-kube-api-access-fjq9m\") on node \"crc\" DevicePath \"\""
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.714520 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-x4tz7" event={"ID":"3e1eddcd-d966-4a33-9b1e-639b3f5ebca7","Type":"ContainerDied","Data":"a4f09307184536a88e6730525afc1d715c51a814768d730e648ff48ac97f0245"}
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.714564 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4f09307184536a88e6730525afc1d715c51a814768d730e648ff48ac97f0245"
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.714579 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-x4tz7"
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.716380 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9d81-account-create-update-kddpx"
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.716380 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9d81-account-create-update-kddpx" event={"ID":"bc60d2f3-4d25-4e39-b7eb-70fb4e750774","Type":"ContainerDied","Data":"04e740b6bbe274ea9f35e08a86dabb718b79549d44f609b8996b1280afa21cbc"}
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.716535 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04e740b6bbe274ea9f35e08a86dabb718b79549d44f609b8996b1280afa21cbc"
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.718050 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3db9-account-create-update-8psnc"
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.718051 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3db9-account-create-update-8psnc" event={"ID":"f59d95da-19a9-42b3-b2f7-ca255806fabb","Type":"ContainerDied","Data":"ab071d2d0c5bacb6016dd13a9c17286a92746ebedf6a633f8f2be0cf210ba64c"}
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.718199 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab071d2d0c5bacb6016dd13a9c17286a92746ebedf6a633f8f2be0cf210ba64c"
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.720026 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ptsql" event={"ID":"7bc49445-1a38-4ac6-91c8-822d298116a3","Type":"ContainerDied","Data":"4b306760c800bfeaf30a6a487557c58a4d2ea206ea673f10c152ec024cfcf13f"}
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.720062 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b306760c800bfeaf30a6a487557c58a4d2ea206ea673f10c152ec024cfcf13f"
Mar 13 09:30:49 crc kubenswrapper[4841]: I0313 09:30:49.720071 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ptsql"
Mar 13 09:30:50 crc kubenswrapper[4841]: I0313 09:30:50.022709 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c88febfd-c495-4b2e-a4fb-fca8f447ef9c" path="/var/lib/kubelet/pods/c88febfd-c495-4b2e-a4fb-fca8f447ef9c/volumes"
Mar 13 09:30:50 crc kubenswrapper[4841]: I0313 09:30:50.724172 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-p7rfs"]
Mar 13 09:30:50 crc kubenswrapper[4841]: I0313 09:30:50.729722 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-p7rfs"]
Mar 13 09:30:50 crc kubenswrapper[4841]: I0313 09:30:50.803357 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-dqccr"]
Mar 13 09:30:50 crc kubenswrapper[4841]: E0313 09:30:50.803726 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e1eddcd-d966-4a33-9b1e-639b3f5ebca7" containerName="mariadb-database-create"
Mar 13 09:30:50 crc kubenswrapper[4841]: I0313 09:30:50.803747 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e1eddcd-d966-4a33-9b1e-639b3f5ebca7" containerName="mariadb-database-create"
Mar 13 09:30:50 crc kubenswrapper[4841]: E0313 09:30:50.803765 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc60d2f3-4d25-4e39-b7eb-70fb4e750774" containerName="mariadb-account-create-update"
Mar 13 09:30:50 crc kubenswrapper[4841]: I0313 09:30:50.803774 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc60d2f3-4d25-4e39-b7eb-70fb4e750774" containerName="mariadb-account-create-update"
Mar 13 09:30:50 crc kubenswrapper[4841]: E0313 09:30:50.803795 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f59d95da-19a9-42b3-b2f7-ca255806fabb" containerName="mariadb-account-create-update"
Mar 13 09:30:50 crc kubenswrapper[4841]: I0313 09:30:50.803805 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f59d95da-19a9-42b3-b2f7-ca255806fabb" containerName="mariadb-account-create-update"
Mar 13 09:30:50 crc kubenswrapper[4841]: E0313 09:30:50.803819 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127599cf-60ed-4009-a303-d1071188fec4" containerName="mariadb-account-create-update"
Mar 13 09:30:50 crc kubenswrapper[4841]: I0313 09:30:50.803828 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="127599cf-60ed-4009-a303-d1071188fec4" containerName="mariadb-account-create-update"
Mar 13 09:30:50 crc kubenswrapper[4841]: E0313 09:30:50.803839 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c88febfd-c495-4b2e-a4fb-fca8f447ef9c" containerName="dnsmasq-dns"
Mar 13 09:30:50 crc kubenswrapper[4841]: I0313 09:30:50.803846 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c88febfd-c495-4b2e-a4fb-fca8f447ef9c" containerName="dnsmasq-dns"
Mar 13 09:30:50 crc kubenswrapper[4841]: E0313 09:30:50.803869 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c88febfd-c495-4b2e-a4fb-fca8f447ef9c" containerName="init"
Mar 13 09:30:50 crc kubenswrapper[4841]: I0313 09:30:50.803877 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c88febfd-c495-4b2e-a4fb-fca8f447ef9c" containerName="init"
Mar 13 09:30:50 crc kubenswrapper[4841]: E0313 09:30:50.803894 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc49445-1a38-4ac6-91c8-822d298116a3" containerName="mariadb-database-create"
Mar 13 09:30:50 crc kubenswrapper[4841]: I0313 09:30:50.803902 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc49445-1a38-4ac6-91c8-822d298116a3" containerName="mariadb-database-create"
Mar 13 09:30:50 crc kubenswrapper[4841]: I0313 09:30:50.804072 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e1eddcd-d966-4a33-9b1e-639b3f5ebca7" containerName="mariadb-database-create"
Mar 13 09:30:50 crc kubenswrapper[4841]: I0313 09:30:50.804096 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f59d95da-19a9-42b3-b2f7-ca255806fabb" containerName="mariadb-account-create-update"
Mar 13 09:30:50 crc kubenswrapper[4841]: I0313 09:30:50.804111 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc60d2f3-4d25-4e39-b7eb-70fb4e750774" containerName="mariadb-account-create-update"
Mar 13 09:30:50 crc kubenswrapper[4841]: I0313 09:30:50.804121 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bc49445-1a38-4ac6-91c8-822d298116a3" containerName="mariadb-database-create"
Mar 13 09:30:50 crc kubenswrapper[4841]: I0313 09:30:50.804133 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="127599cf-60ed-4009-a303-d1071188fec4" containerName="mariadb-account-create-update"
Mar 13 09:30:50 crc kubenswrapper[4841]: I0313 09:30:50.804144 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c88febfd-c495-4b2e-a4fb-fca8f447ef9c" containerName="dnsmasq-dns"
Mar 13 09:30:50 crc kubenswrapper[4841]: I0313 09:30:50.804691 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dqccr"
Mar 13 09:30:50 crc kubenswrapper[4841]: I0313 09:30:50.806810 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 13 09:30:50 crc kubenswrapper[4841]: I0313 09:30:50.817038 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dqccr"]
Mar 13 09:30:50 crc kubenswrapper[4841]: I0313 09:30:50.907970 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d2dm\" (UniqueName: \"kubernetes.io/projected/78fb8b1f-a395-4430-a2db-267939774965-kube-api-access-6d2dm\") pod \"root-account-create-update-dqccr\" (UID: \"78fb8b1f-a395-4430-a2db-267939774965\") " pod="openstack/root-account-create-update-dqccr"
Mar 13 09:30:50 crc kubenswrapper[4841]: I0313 09:30:50.908011 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78fb8b1f-a395-4430-a2db-267939774965-operator-scripts\") pod \"root-account-create-update-dqccr\" (UID: \"78fb8b1f-a395-4430-a2db-267939774965\") " pod="openstack/root-account-create-update-dqccr"
Mar 13 09:30:51 crc kubenswrapper[4841]: I0313 09:30:51.009584 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78fb8b1f-a395-4430-a2db-267939774965-operator-scripts\") pod \"root-account-create-update-dqccr\" (UID: \"78fb8b1f-a395-4430-a2db-267939774965\") " pod="openstack/root-account-create-update-dqccr"
Mar 13 09:30:51 crc kubenswrapper[4841]: I0313 09:30:51.009796 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d2dm\" (UniqueName: \"kubernetes.io/projected/78fb8b1f-a395-4430-a2db-267939774965-kube-api-access-6d2dm\") pod \"root-account-create-update-dqccr\" (UID: \"78fb8b1f-a395-4430-a2db-267939774965\") " pod="openstack/root-account-create-update-dqccr"
Mar 13 09:30:51 crc kubenswrapper[4841]: I0313 09:30:51.010948 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78fb8b1f-a395-4430-a2db-267939774965-operator-scripts\") pod \"root-account-create-update-dqccr\" (UID: \"78fb8b1f-a395-4430-a2db-267939774965\") " pod="openstack/root-account-create-update-dqccr"
Mar 13 09:30:51 crc kubenswrapper[4841]: I0313 09:30:51.027119 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d2dm\" (UniqueName: \"kubernetes.io/projected/78fb8b1f-a395-4430-a2db-267939774965-kube-api-access-6d2dm\") pod \"root-account-create-update-dqccr\" (UID: \"78fb8b1f-a395-4430-a2db-267939774965\") " pod="openstack/root-account-create-update-dqccr"
Mar 13 09:30:51 crc kubenswrapper[4841]: I0313 09:30:51.177414 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dqccr"
Mar 13 09:30:51 crc kubenswrapper[4841]: I0313 09:30:51.638340 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dqccr"]
Mar 13 09:30:51 crc kubenswrapper[4841]: I0313 09:30:51.737543 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dqccr" event={"ID":"78fb8b1f-a395-4430-a2db-267939774965","Type":"ContainerStarted","Data":"acfe6967bfa70ff4d6117f18d0b59fd347cf57b59a3582de17c5aa68a5b261b2"}
Mar 13 09:30:51 crc kubenswrapper[4841]: I0313 09:30:51.764197 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-78p58"]
Mar 13 09:30:51 crc kubenswrapper[4841]: I0313 09:30:51.765984 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-78p58"
Mar 13 09:30:51 crc kubenswrapper[4841]: I0313 09:30:51.774769 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-78p58"]
Mar 13 09:30:51 crc kubenswrapper[4841]: I0313 09:30:51.908759 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-5e0a-account-create-update-tk6x7"]
Mar 13 09:30:51 crc kubenswrapper[4841]: I0313 09:30:51.910224 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5e0a-account-create-update-tk6x7"
Mar 13 09:30:51 crc kubenswrapper[4841]: I0313 09:30:51.917617 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Mar 13 09:30:51 crc kubenswrapper[4841]: I0313 09:30:51.921722 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5e0a-account-create-update-tk6x7"]
Mar 13 09:30:51 crc kubenswrapper[4841]: I0313 09:30:51.924880 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm4nn\" (UniqueName: \"kubernetes.io/projected/9ae264c3-b8e8-432e-9ee8-8ff04e30aac5-kube-api-access-rm4nn\") pod \"glance-db-create-78p58\" (UID: \"9ae264c3-b8e8-432e-9ee8-8ff04e30aac5\") " pod="openstack/glance-db-create-78p58"
Mar 13 09:30:51 crc kubenswrapper[4841]: I0313 09:30:51.924952 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ae264c3-b8e8-432e-9ee8-8ff04e30aac5-operator-scripts\") pod \"glance-db-create-78p58\" (UID: \"9ae264c3-b8e8-432e-9ee8-8ff04e30aac5\") " pod="openstack/glance-db-create-78p58"
Mar 13 09:30:52 crc kubenswrapper[4841]: I0313 09:30:52.005778 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="127599cf-60ed-4009-a303-d1071188fec4" path="/var/lib/kubelet/pods/127599cf-60ed-4009-a303-d1071188fec4/volumes"
Mar 13 
09:30:52 crc kubenswrapper[4841]: I0313 09:30:52.025988 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcs55\" (UniqueName: \"kubernetes.io/projected/f18b3ca4-294d-46f3-8b5b-c5e297bf58fe-kube-api-access-qcs55\") pod \"glance-5e0a-account-create-update-tk6x7\" (UID: \"f18b3ca4-294d-46f3-8b5b-c5e297bf58fe\") " pod="openstack/glance-5e0a-account-create-update-tk6x7" Mar 13 09:30:52 crc kubenswrapper[4841]: I0313 09:30:52.026046 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm4nn\" (UniqueName: \"kubernetes.io/projected/9ae264c3-b8e8-432e-9ee8-8ff04e30aac5-kube-api-access-rm4nn\") pod \"glance-db-create-78p58\" (UID: \"9ae264c3-b8e8-432e-9ee8-8ff04e30aac5\") " pod="openstack/glance-db-create-78p58" Mar 13 09:30:52 crc kubenswrapper[4841]: I0313 09:30:52.026109 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f18b3ca4-294d-46f3-8b5b-c5e297bf58fe-operator-scripts\") pod \"glance-5e0a-account-create-update-tk6x7\" (UID: \"f18b3ca4-294d-46f3-8b5b-c5e297bf58fe\") " pod="openstack/glance-5e0a-account-create-update-tk6x7" Mar 13 09:30:52 crc kubenswrapper[4841]: I0313 09:30:52.026137 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ae264c3-b8e8-432e-9ee8-8ff04e30aac5-operator-scripts\") pod \"glance-db-create-78p58\" (UID: \"9ae264c3-b8e8-432e-9ee8-8ff04e30aac5\") " pod="openstack/glance-db-create-78p58" Mar 13 09:30:52 crc kubenswrapper[4841]: I0313 09:30:52.028313 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ae264c3-b8e8-432e-9ee8-8ff04e30aac5-operator-scripts\") pod \"glance-db-create-78p58\" (UID: \"9ae264c3-b8e8-432e-9ee8-8ff04e30aac5\") " 
pod="openstack/glance-db-create-78p58" Mar 13 09:30:52 crc kubenswrapper[4841]: I0313 09:30:52.053425 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm4nn\" (UniqueName: \"kubernetes.io/projected/9ae264c3-b8e8-432e-9ee8-8ff04e30aac5-kube-api-access-rm4nn\") pod \"glance-db-create-78p58\" (UID: \"9ae264c3-b8e8-432e-9ee8-8ff04e30aac5\") " pod="openstack/glance-db-create-78p58" Mar 13 09:30:52 crc kubenswrapper[4841]: I0313 09:30:52.104839 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-78p58" Mar 13 09:30:52 crc kubenswrapper[4841]: I0313 09:30:52.126889 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f18b3ca4-294d-46f3-8b5b-c5e297bf58fe-operator-scripts\") pod \"glance-5e0a-account-create-update-tk6x7\" (UID: \"f18b3ca4-294d-46f3-8b5b-c5e297bf58fe\") " pod="openstack/glance-5e0a-account-create-update-tk6x7" Mar 13 09:30:52 crc kubenswrapper[4841]: I0313 09:30:52.127001 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcs55\" (UniqueName: \"kubernetes.io/projected/f18b3ca4-294d-46f3-8b5b-c5e297bf58fe-kube-api-access-qcs55\") pod \"glance-5e0a-account-create-update-tk6x7\" (UID: \"f18b3ca4-294d-46f3-8b5b-c5e297bf58fe\") " pod="openstack/glance-5e0a-account-create-update-tk6x7" Mar 13 09:30:52 crc kubenswrapper[4841]: I0313 09:30:52.128498 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f18b3ca4-294d-46f3-8b5b-c5e297bf58fe-operator-scripts\") pod \"glance-5e0a-account-create-update-tk6x7\" (UID: \"f18b3ca4-294d-46f3-8b5b-c5e297bf58fe\") " pod="openstack/glance-5e0a-account-create-update-tk6x7" Mar 13 09:30:52 crc kubenswrapper[4841]: I0313 09:30:52.147467 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qcs55\" (UniqueName: \"kubernetes.io/projected/f18b3ca4-294d-46f3-8b5b-c5e297bf58fe-kube-api-access-qcs55\") pod \"glance-5e0a-account-create-update-tk6x7\" (UID: \"f18b3ca4-294d-46f3-8b5b-c5e297bf58fe\") " pod="openstack/glance-5e0a-account-create-update-tk6x7" Mar 13 09:30:52 crc kubenswrapper[4841]: I0313 09:30:52.234488 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5e0a-account-create-update-tk6x7" Mar 13 09:30:52 crc kubenswrapper[4841]: I0313 09:30:52.599308 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-78p58"] Mar 13 09:30:52 crc kubenswrapper[4841]: W0313 09:30:52.604012 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ae264c3_b8e8_432e_9ee8_8ff04e30aac5.slice/crio-fb6ed82fcf54009db3690187e56479b030a5352a86dbfafbb9d79f5fbdb8f635 WatchSource:0}: Error finding container fb6ed82fcf54009db3690187e56479b030a5352a86dbfafbb9d79f5fbdb8f635: Status 404 returned error can't find the container with id fb6ed82fcf54009db3690187e56479b030a5352a86dbfafbb9d79f5fbdb8f635 Mar 13 09:30:52 crc kubenswrapper[4841]: I0313 09:30:52.689122 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5e0a-account-create-update-tk6x7"] Mar 13 09:30:52 crc kubenswrapper[4841]: W0313 09:30:52.695680 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf18b3ca4_294d_46f3_8b5b_c5e297bf58fe.slice/crio-929d8ef31effd060921a74a27cabbc595b32fac0c42ff9ce17d5a9ccbac54a42 WatchSource:0}: Error finding container 929d8ef31effd060921a74a27cabbc595b32fac0c42ff9ce17d5a9ccbac54a42: Status 404 returned error can't find the container with id 929d8ef31effd060921a74a27cabbc595b32fac0c42ff9ce17d5a9ccbac54a42 Mar 13 09:30:52 crc kubenswrapper[4841]: I0313 09:30:52.745415 4841 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/glance-db-create-78p58" event={"ID":"9ae264c3-b8e8-432e-9ee8-8ff04e30aac5","Type":"ContainerStarted","Data":"fb6ed82fcf54009db3690187e56479b030a5352a86dbfafbb9d79f5fbdb8f635"} Mar 13 09:30:52 crc kubenswrapper[4841]: I0313 09:30:52.747796 4841 generic.go:334] "Generic (PLEG): container finished" podID="78fb8b1f-a395-4430-a2db-267939774965" containerID="848fba9a8bda88cf73d02a4d99686a491503389ddbfda3664c89191cddea4f79" exitCode=0 Mar 13 09:30:52 crc kubenswrapper[4841]: I0313 09:30:52.747857 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dqccr" event={"ID":"78fb8b1f-a395-4430-a2db-267939774965","Type":"ContainerDied","Data":"848fba9a8bda88cf73d02a4d99686a491503389ddbfda3664c89191cddea4f79"} Mar 13 09:30:52 crc kubenswrapper[4841]: I0313 09:30:52.749324 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5e0a-account-create-update-tk6x7" event={"ID":"f18b3ca4-294d-46f3-8b5b-c5e297bf58fe","Type":"ContainerStarted","Data":"929d8ef31effd060921a74a27cabbc595b32fac0c42ff9ce17d5a9ccbac54a42"} Mar 13 09:30:53 crc kubenswrapper[4841]: I0313 09:30:53.557136 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-etc-swift\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:30:53 crc kubenswrapper[4841]: E0313 09:30:53.557476 4841 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 09:30:53 crc kubenswrapper[4841]: E0313 09:30:53.557520 4841 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 09:30:53 crc kubenswrapper[4841]: E0313 09:30:53.557616 4841 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-etc-swift podName:64f62ec7-7a91-458c-86cb-7658544e4a51 nodeName:}" failed. No retries permitted until 2026-03-13 09:31:09.557588047 +0000 UTC m=+1152.287488278 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-etc-swift") pod "swift-storage-0" (UID: "64f62ec7-7a91-458c-86cb-7658544e4a51") : configmap "swift-ring-files" not found Mar 13 09:30:53 crc kubenswrapper[4841]: I0313 09:30:53.772159 4841 generic.go:334] "Generic (PLEG): container finished" podID="f18b3ca4-294d-46f3-8b5b-c5e297bf58fe" containerID="c62cdd2b0afd7ccff63f60a98d18f0cf82cee5e50e25e0d75f70ec3af33af351" exitCode=0 Mar 13 09:30:53 crc kubenswrapper[4841]: I0313 09:30:53.772690 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5e0a-account-create-update-tk6x7" event={"ID":"f18b3ca4-294d-46f3-8b5b-c5e297bf58fe","Type":"ContainerDied","Data":"c62cdd2b0afd7ccff63f60a98d18f0cf82cee5e50e25e0d75f70ec3af33af351"} Mar 13 09:30:53 crc kubenswrapper[4841]: I0313 09:30:53.778508 4841 generic.go:334] "Generic (PLEG): container finished" podID="f66d8c2c-71a2-4927-a708-4b1412d0243c" containerID="33df1d339a29a75cba5fe181ef4c7c0ce8f0f77964c2ebdf65e3c46e8853ea14" exitCode=0 Mar 13 09:30:53 crc kubenswrapper[4841]: I0313 09:30:53.778730 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-sqrfd" event={"ID":"f66d8c2c-71a2-4927-a708-4b1412d0243c","Type":"ContainerDied","Data":"33df1d339a29a75cba5fe181ef4c7c0ce8f0f77964c2ebdf65e3c46e8853ea14"} Mar 13 09:30:53 crc kubenswrapper[4841]: I0313 09:30:53.780831 4841 generic.go:334] "Generic (PLEG): container finished" podID="9ae264c3-b8e8-432e-9ee8-8ff04e30aac5" containerID="70740dcd9265a9aa862721c88f0efad402c1d92f5d10f51f4e6782e9c68da467" exitCode=0 Mar 13 09:30:53 crc kubenswrapper[4841]: I0313 09:30:53.781022 4841 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-78p58" event={"ID":"9ae264c3-b8e8-432e-9ee8-8ff04e30aac5","Type":"ContainerDied","Data":"70740dcd9265a9aa862721c88f0efad402c1d92f5d10f51f4e6782e9c68da467"} Mar 13 09:30:54 crc kubenswrapper[4841]: I0313 09:30:54.070918 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dqccr" Mar 13 09:30:54 crc kubenswrapper[4841]: I0313 09:30:54.171380 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78fb8b1f-a395-4430-a2db-267939774965-operator-scripts\") pod \"78fb8b1f-a395-4430-a2db-267939774965\" (UID: \"78fb8b1f-a395-4430-a2db-267939774965\") " Mar 13 09:30:54 crc kubenswrapper[4841]: I0313 09:30:54.171457 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d2dm\" (UniqueName: \"kubernetes.io/projected/78fb8b1f-a395-4430-a2db-267939774965-kube-api-access-6d2dm\") pod \"78fb8b1f-a395-4430-a2db-267939774965\" (UID: \"78fb8b1f-a395-4430-a2db-267939774965\") " Mar 13 09:30:54 crc kubenswrapper[4841]: I0313 09:30:54.172817 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78fb8b1f-a395-4430-a2db-267939774965-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "78fb8b1f-a395-4430-a2db-267939774965" (UID: "78fb8b1f-a395-4430-a2db-267939774965"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:30:54 crc kubenswrapper[4841]: I0313 09:30:54.199484 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78fb8b1f-a395-4430-a2db-267939774965-kube-api-access-6d2dm" (OuterVolumeSpecName: "kube-api-access-6d2dm") pod "78fb8b1f-a395-4430-a2db-267939774965" (UID: "78fb8b1f-a395-4430-a2db-267939774965"). InnerVolumeSpecName "kube-api-access-6d2dm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:30:54 crc kubenswrapper[4841]: I0313 09:30:54.273601 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78fb8b1f-a395-4430-a2db-267939774965-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:54 crc kubenswrapper[4841]: I0313 09:30:54.273865 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d2dm\" (UniqueName: \"kubernetes.io/projected/78fb8b1f-a395-4430-a2db-267939774965-kube-api-access-6d2dm\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:54 crc kubenswrapper[4841]: I0313 09:30:54.823995 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dqccr" event={"ID":"78fb8b1f-a395-4430-a2db-267939774965","Type":"ContainerDied","Data":"acfe6967bfa70ff4d6117f18d0b59fd347cf57b59a3582de17c5aa68a5b261b2"} Mar 13 09:30:54 crc kubenswrapper[4841]: I0313 09:30:54.824056 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acfe6967bfa70ff4d6117f18d0b59fd347cf57b59a3582de17c5aa68a5b261b2" Mar 13 09:30:54 crc kubenswrapper[4841]: I0313 09:30:54.824119 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dqccr" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.319514 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5e0a-account-create-update-tk6x7" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.325904 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.330445 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-78p58" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.389998 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f66d8c2c-71a2-4927-a708-4b1412d0243c-scripts\") pod \"f66d8c2c-71a2-4927-a708-4b1412d0243c\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.390035 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f66d8c2c-71a2-4927-a708-4b1412d0243c-dispersionconf\") pod \"f66d8c2c-71a2-4927-a708-4b1412d0243c\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.390116 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ae264c3-b8e8-432e-9ee8-8ff04e30aac5-operator-scripts\") pod \"9ae264c3-b8e8-432e-9ee8-8ff04e30aac5\" (UID: \"9ae264c3-b8e8-432e-9ee8-8ff04e30aac5\") " Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.390162 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm4nn\" (UniqueName: \"kubernetes.io/projected/9ae264c3-b8e8-432e-9ee8-8ff04e30aac5-kube-api-access-rm4nn\") pod \"9ae264c3-b8e8-432e-9ee8-8ff04e30aac5\" (UID: \"9ae264c3-b8e8-432e-9ee8-8ff04e30aac5\") " Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.390186 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f66d8c2c-71a2-4927-a708-4b1412d0243c-ring-data-devices\") pod \"f66d8c2c-71a2-4927-a708-4b1412d0243c\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.390206 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f66d8c2c-71a2-4927-a708-4b1412d0243c-etc-swift\") pod \"f66d8c2c-71a2-4927-a708-4b1412d0243c\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.390251 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f66d8c2c-71a2-4927-a708-4b1412d0243c-combined-ca-bundle\") pod \"f66d8c2c-71a2-4927-a708-4b1412d0243c\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.390295 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f18b3ca4-294d-46f3-8b5b-c5e297bf58fe-operator-scripts\") pod \"f18b3ca4-294d-46f3-8b5b-c5e297bf58fe\" (UID: \"f18b3ca4-294d-46f3-8b5b-c5e297bf58fe\") " Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.390355 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcs55\" (UniqueName: \"kubernetes.io/projected/f18b3ca4-294d-46f3-8b5b-c5e297bf58fe-kube-api-access-qcs55\") pod \"f18b3ca4-294d-46f3-8b5b-c5e297bf58fe\" (UID: \"f18b3ca4-294d-46f3-8b5b-c5e297bf58fe\") " Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.390399 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl6zh\" (UniqueName: \"kubernetes.io/projected/f66d8c2c-71a2-4927-a708-4b1412d0243c-kube-api-access-pl6zh\") pod \"f66d8c2c-71a2-4927-a708-4b1412d0243c\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.390416 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f66d8c2c-71a2-4927-a708-4b1412d0243c-swiftconf\") pod \"f66d8c2c-71a2-4927-a708-4b1412d0243c\" (UID: \"f66d8c2c-71a2-4927-a708-4b1412d0243c\") " Mar 13 
09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.391026 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f66d8c2c-71a2-4927-a708-4b1412d0243c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f66d8c2c-71a2-4927-a708-4b1412d0243c" (UID: "f66d8c2c-71a2-4927-a708-4b1412d0243c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.391024 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f18b3ca4-294d-46f3-8b5b-c5e297bf58fe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f18b3ca4-294d-46f3-8b5b-c5e297bf58fe" (UID: "f18b3ca4-294d-46f3-8b5b-c5e297bf58fe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.391464 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ae264c3-b8e8-432e-9ee8-8ff04e30aac5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ae264c3-b8e8-432e-9ee8-8ff04e30aac5" (UID: "9ae264c3-b8e8-432e-9ee8-8ff04e30aac5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.391810 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f66d8c2c-71a2-4927-a708-4b1412d0243c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f66d8c2c-71a2-4927-a708-4b1412d0243c" (UID: "f66d8c2c-71a2-4927-a708-4b1412d0243c"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.395783 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f18b3ca4-294d-46f3-8b5b-c5e297bf58fe-kube-api-access-qcs55" (OuterVolumeSpecName: "kube-api-access-qcs55") pod "f18b3ca4-294d-46f3-8b5b-c5e297bf58fe" (UID: "f18b3ca4-294d-46f3-8b5b-c5e297bf58fe"). InnerVolumeSpecName "kube-api-access-qcs55". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.396814 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ae264c3-b8e8-432e-9ee8-8ff04e30aac5-kube-api-access-rm4nn" (OuterVolumeSpecName: "kube-api-access-rm4nn") pod "9ae264c3-b8e8-432e-9ee8-8ff04e30aac5" (UID: "9ae264c3-b8e8-432e-9ee8-8ff04e30aac5"). InnerVolumeSpecName "kube-api-access-rm4nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.404580 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f66d8c2c-71a2-4927-a708-4b1412d0243c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f66d8c2c-71a2-4927-a708-4b1412d0243c" (UID: "f66d8c2c-71a2-4927-a708-4b1412d0243c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.405006 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f66d8c2c-71a2-4927-a708-4b1412d0243c-kube-api-access-pl6zh" (OuterVolumeSpecName: "kube-api-access-pl6zh") pod "f66d8c2c-71a2-4927-a708-4b1412d0243c" (UID: "f66d8c2c-71a2-4927-a708-4b1412d0243c"). InnerVolumeSpecName "kube-api-access-pl6zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.410165 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f66d8c2c-71a2-4927-a708-4b1412d0243c-scripts" (OuterVolumeSpecName: "scripts") pod "f66d8c2c-71a2-4927-a708-4b1412d0243c" (UID: "f66d8c2c-71a2-4927-a708-4b1412d0243c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.415625 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f66d8c2c-71a2-4927-a708-4b1412d0243c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f66d8c2c-71a2-4927-a708-4b1412d0243c" (UID: "f66d8c2c-71a2-4927-a708-4b1412d0243c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.433615 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f66d8c2c-71a2-4927-a708-4b1412d0243c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f66d8c2c-71a2-4927-a708-4b1412d0243c" (UID: "f66d8c2c-71a2-4927-a708-4b1412d0243c"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.492702 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ae264c3-b8e8-432e-9ee8-8ff04e30aac5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.492917 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm4nn\" (UniqueName: \"kubernetes.io/projected/9ae264c3-b8e8-432e-9ee8-8ff04e30aac5-kube-api-access-rm4nn\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.493008 4841 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f66d8c2c-71a2-4927-a708-4b1412d0243c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.493073 4841 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f66d8c2c-71a2-4927-a708-4b1412d0243c-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.493127 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f66d8c2c-71a2-4927-a708-4b1412d0243c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.493190 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f18b3ca4-294d-46f3-8b5b-c5e297bf58fe-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.493244 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcs55\" (UniqueName: \"kubernetes.io/projected/f18b3ca4-294d-46f3-8b5b-c5e297bf58fe-kube-api-access-qcs55\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:55 crc 
kubenswrapper[4841]: I0313 09:30:55.493362 4841 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f66d8c2c-71a2-4927-a708-4b1412d0243c-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.493433 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl6zh\" (UniqueName: \"kubernetes.io/projected/f66d8c2c-71a2-4927-a708-4b1412d0243c-kube-api-access-pl6zh\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.493501 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f66d8c2c-71a2-4927-a708-4b1412d0243c-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.493561 4841 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f66d8c2c-71a2-4927-a708-4b1412d0243c-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.833030 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-sqrfd" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.833039 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-sqrfd" event={"ID":"f66d8c2c-71a2-4927-a708-4b1412d0243c","Type":"ContainerDied","Data":"0426ee793b4d9c6ff6c8d7390e3559713c9bd3cadcf72fb0446951bb98b6e7aa"} Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.833462 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0426ee793b4d9c6ff6c8d7390e3559713c9bd3cadcf72fb0446951bb98b6e7aa" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.835291 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-78p58" event={"ID":"9ae264c3-b8e8-432e-9ee8-8ff04e30aac5","Type":"ContainerDied","Data":"fb6ed82fcf54009db3690187e56479b030a5352a86dbfafbb9d79f5fbdb8f635"} Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.835325 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb6ed82fcf54009db3690187e56479b030a5352a86dbfafbb9d79f5fbdb8f635" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.835623 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-78p58" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.840779 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5e0a-account-create-update-tk6x7" event={"ID":"f18b3ca4-294d-46f3-8b5b-c5e297bf58fe","Type":"ContainerDied","Data":"929d8ef31effd060921a74a27cabbc595b32fac0c42ff9ce17d5a9ccbac54a42"} Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.840842 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="929d8ef31effd060921a74a27cabbc595b32fac0c42ff9ce17d5a9ccbac54a42" Mar 13 09:30:55 crc kubenswrapper[4841]: I0313 09:30:55.840854 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5e0a-account-create-update-tk6x7" Mar 13 09:30:56 crc kubenswrapper[4841]: I0313 09:30:56.180495 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.081414 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-b95pq"] Mar 13 09:30:57 crc kubenswrapper[4841]: E0313 09:30:57.081778 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78fb8b1f-a395-4430-a2db-267939774965" containerName="mariadb-account-create-update" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.081793 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="78fb8b1f-a395-4430-a2db-267939774965" containerName="mariadb-account-create-update" Mar 13 09:30:57 crc kubenswrapper[4841]: E0313 09:30:57.081814 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae264c3-b8e8-432e-9ee8-8ff04e30aac5" containerName="mariadb-database-create" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.081821 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae264c3-b8e8-432e-9ee8-8ff04e30aac5" containerName="mariadb-database-create" Mar 13 09:30:57 crc kubenswrapper[4841]: E0313 09:30:57.081851 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f66d8c2c-71a2-4927-a708-4b1412d0243c" containerName="swift-ring-rebalance" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.081858 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f66d8c2c-71a2-4927-a708-4b1412d0243c" containerName="swift-ring-rebalance" Mar 13 09:30:57 crc kubenswrapper[4841]: E0313 09:30:57.081869 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f18b3ca4-294d-46f3-8b5b-c5e297bf58fe" containerName="mariadb-account-create-update" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.081876 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f18b3ca4-294d-46f3-8b5b-c5e297bf58fe" containerName="mariadb-account-create-update" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.082051 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ae264c3-b8e8-432e-9ee8-8ff04e30aac5" containerName="mariadb-database-create" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.082072 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="78fb8b1f-a395-4430-a2db-267939774965" containerName="mariadb-account-create-update" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.082081 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f66d8c2c-71a2-4927-a708-4b1412d0243c" containerName="swift-ring-rebalance" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.082096 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f18b3ca4-294d-46f3-8b5b-c5e297bf58fe" containerName="mariadb-account-create-update" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.082700 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-b95pq" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.085899 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-nwmvp" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.086027 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.099623 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-b95pq"] Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.224482 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ef41142-9432-4e66-9008-a3c1ff35e9a8-config-data\") pod \"glance-db-sync-b95pq\" (UID: \"0ef41142-9432-4e66-9008-a3c1ff35e9a8\") " pod="openstack/glance-db-sync-b95pq" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.224559 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ef41142-9432-4e66-9008-a3c1ff35e9a8-db-sync-config-data\") pod \"glance-db-sync-b95pq\" (UID: \"0ef41142-9432-4e66-9008-a3c1ff35e9a8\") " pod="openstack/glance-db-sync-b95pq" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.224650 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ef41142-9432-4e66-9008-a3c1ff35e9a8-combined-ca-bundle\") pod \"glance-db-sync-b95pq\" (UID: \"0ef41142-9432-4e66-9008-a3c1ff35e9a8\") " pod="openstack/glance-db-sync-b95pq" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.224749 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pq6s\" (UniqueName: 
\"kubernetes.io/projected/0ef41142-9432-4e66-9008-a3c1ff35e9a8-kube-api-access-4pq6s\") pod \"glance-db-sync-b95pq\" (UID: \"0ef41142-9432-4e66-9008-a3c1ff35e9a8\") " pod="openstack/glance-db-sync-b95pq" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.326216 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pq6s\" (UniqueName: \"kubernetes.io/projected/0ef41142-9432-4e66-9008-a3c1ff35e9a8-kube-api-access-4pq6s\") pod \"glance-db-sync-b95pq\" (UID: \"0ef41142-9432-4e66-9008-a3c1ff35e9a8\") " pod="openstack/glance-db-sync-b95pq" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.326765 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ef41142-9432-4e66-9008-a3c1ff35e9a8-config-data\") pod \"glance-db-sync-b95pq\" (UID: \"0ef41142-9432-4e66-9008-a3c1ff35e9a8\") " pod="openstack/glance-db-sync-b95pq" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.326976 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ef41142-9432-4e66-9008-a3c1ff35e9a8-db-sync-config-data\") pod \"glance-db-sync-b95pq\" (UID: \"0ef41142-9432-4e66-9008-a3c1ff35e9a8\") " pod="openstack/glance-db-sync-b95pq" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.327235 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ef41142-9432-4e66-9008-a3c1ff35e9a8-combined-ca-bundle\") pod \"glance-db-sync-b95pq\" (UID: \"0ef41142-9432-4e66-9008-a3c1ff35e9a8\") " pod="openstack/glance-db-sync-b95pq" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.331858 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ef41142-9432-4e66-9008-a3c1ff35e9a8-combined-ca-bundle\") pod \"glance-db-sync-b95pq\" 
(UID: \"0ef41142-9432-4e66-9008-a3c1ff35e9a8\") " pod="openstack/glance-db-sync-b95pq" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.339211 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ef41142-9432-4e66-9008-a3c1ff35e9a8-config-data\") pod \"glance-db-sync-b95pq\" (UID: \"0ef41142-9432-4e66-9008-a3c1ff35e9a8\") " pod="openstack/glance-db-sync-b95pq" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.353367 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ef41142-9432-4e66-9008-a3c1ff35e9a8-db-sync-config-data\") pod \"glance-db-sync-b95pq\" (UID: \"0ef41142-9432-4e66-9008-a3c1ff35e9a8\") " pod="openstack/glance-db-sync-b95pq" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.358694 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pq6s\" (UniqueName: \"kubernetes.io/projected/0ef41142-9432-4e66-9008-a3c1ff35e9a8-kube-api-access-4pq6s\") pod \"glance-db-sync-b95pq\" (UID: \"0ef41142-9432-4e66-9008-a3c1ff35e9a8\") " pod="openstack/glance-db-sync-b95pq" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.403020 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-b95pq" Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.875086 4841 generic.go:334] "Generic (PLEG): container finished" podID="6f270332-4a01-403b-8c06-0f8c0bff6527" containerID="18b82d7d667bf5f2cb4a38afb34a41c960cfe5ff6e84964c33a38c6b1d742611" exitCode=0 Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.875147 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6f270332-4a01-403b-8c06-0f8c0bff6527","Type":"ContainerDied","Data":"18b82d7d667bf5f2cb4a38afb34a41c960cfe5ff6e84964c33a38c6b1d742611"} Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.877374 4841 generic.go:334] "Generic (PLEG): container finished" podID="ea6882c8-841d-4ca7-90a9-3d16c4303a58" containerID="723c75ed37f7f534e4700be3916499abdbad4a5a30193aa4fdacace8cd9cf2e3" exitCode=0 Mar 13 09:30:57 crc kubenswrapper[4841]: I0313 09:30:57.877400 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ea6882c8-841d-4ca7-90a9-3d16c4303a58","Type":"ContainerDied","Data":"723c75ed37f7f534e4700be3916499abdbad4a5a30193aa4fdacace8cd9cf2e3"} Mar 13 09:30:58 crc kubenswrapper[4841]: I0313 09:30:58.064573 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-b95pq"] Mar 13 09:30:58 crc kubenswrapper[4841]: W0313 09:30:58.070651 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ef41142_9432_4e66_9008_a3c1ff35e9a8.slice/crio-4814b6aa18d375627fddbb60174ecf7af831e69fa0e8f573cef4e504880f98f2 WatchSource:0}: Error finding container 4814b6aa18d375627fddbb60174ecf7af831e69fa0e8f573cef4e504880f98f2: Status 404 returned error can't find the container with id 4814b6aa18d375627fddbb60174ecf7af831e69fa0e8f573cef4e504880f98f2 Mar 13 09:30:58 crc kubenswrapper[4841]: I0313 09:30:58.885916 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ea6882c8-841d-4ca7-90a9-3d16c4303a58","Type":"ContainerStarted","Data":"d484f76415e52edb8485e15e2d614b7c52bd0a121cded2bea9a89604a513cb21"} Mar 13 09:30:58 crc kubenswrapper[4841]: I0313 09:30:58.886142 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:30:58 crc kubenswrapper[4841]: I0313 09:30:58.888031 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b95pq" event={"ID":"0ef41142-9432-4e66-9008-a3c1ff35e9a8","Type":"ContainerStarted","Data":"4814b6aa18d375627fddbb60174ecf7af831e69fa0e8f573cef4e504880f98f2"} Mar 13 09:30:58 crc kubenswrapper[4841]: I0313 09:30:58.890708 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6f270332-4a01-403b-8c06-0f8c0bff6527","Type":"ContainerStarted","Data":"8eb6ff886c84daefc19ad53048a04087959845a3c06e98ed0685d2cf5ed764ab"} Mar 13 09:30:58 crc kubenswrapper[4841]: I0313 09:30:58.890890 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 13 09:30:58 crc kubenswrapper[4841]: I0313 09:30:58.907251 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=51.791504675 podStartE2EDuration="59.907231697s" podCreationTimestamp="2026-03-13 09:29:59 +0000 UTC" firstStartedPulling="2026-03-13 09:30:14.400690159 +0000 UTC m=+1097.130590350" lastFinishedPulling="2026-03-13 09:30:22.516417181 +0000 UTC m=+1105.246317372" observedRunningTime="2026-03-13 09:30:58.903607186 +0000 UTC m=+1141.633507397" watchObservedRunningTime="2026-03-13 09:30:58.907231697 +0000 UTC m=+1141.637131888" Mar 13 09:30:58 crc kubenswrapper[4841]: I0313 09:30:58.931647 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=51.382221289 
podStartE2EDuration="59.931624763s" podCreationTimestamp="2026-03-13 09:29:59 +0000 UTC" firstStartedPulling="2026-03-13 09:30:14.660223606 +0000 UTC m=+1097.390123797" lastFinishedPulling="2026-03-13 09:30:23.20962706 +0000 UTC m=+1105.939527271" observedRunningTime="2026-03-13 09:30:58.925526496 +0000 UTC m=+1141.655426687" watchObservedRunningTime="2026-03-13 09:30:58.931624763 +0000 UTC m=+1141.661524954" Mar 13 09:30:58 crc kubenswrapper[4841]: I0313 09:30:58.986841 4841 scope.go:117] "RemoveContainer" containerID="79aa8c15393be9c9864827208121f40731484becd9d763c2e0cd89268204c3d3" Mar 13 09:30:59 crc kubenswrapper[4841]: I0313 09:30:59.023863 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-bqlfl" podUID="b2bf634d-aa4f-4773-91ee-99616e217c82" containerName="ovn-controller" probeResult="failure" output=< Mar 13 09:30:59 crc kubenswrapper[4841]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 13 09:30:59 crc kubenswrapper[4841]: > Mar 13 09:30:59 crc kubenswrapper[4841]: I0313 09:30:59.041986 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-b2w62" Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.037060 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-bqlfl" podUID="b2bf634d-aa4f-4773-91ee-99616e217c82" containerName="ovn-controller" probeResult="failure" output=< Mar 13 09:31:04 crc kubenswrapper[4841]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 13 09:31:04 crc kubenswrapper[4841]: > Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.066515 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-b2w62" Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.264938 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-bqlfl-config-xzzpr"] Mar 13 
09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.265954 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bqlfl-config-xzzpr" Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.267987 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.283960 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bqlfl-config-xzzpr"] Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.360093 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-additional-scripts\") pod \"ovn-controller-bqlfl-config-xzzpr\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " pod="openstack/ovn-controller-bqlfl-config-xzzpr" Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.360163 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jmmr\" (UniqueName: \"kubernetes.io/projected/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-kube-api-access-7jmmr\") pod \"ovn-controller-bqlfl-config-xzzpr\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " pod="openstack/ovn-controller-bqlfl-config-xzzpr" Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.360223 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-scripts\") pod \"ovn-controller-bqlfl-config-xzzpr\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " pod="openstack/ovn-controller-bqlfl-config-xzzpr" Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.360292 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-var-run\") pod \"ovn-controller-bqlfl-config-xzzpr\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " pod="openstack/ovn-controller-bqlfl-config-xzzpr" Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.360310 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-var-run-ovn\") pod \"ovn-controller-bqlfl-config-xzzpr\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " pod="openstack/ovn-controller-bqlfl-config-xzzpr" Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.360325 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-var-log-ovn\") pod \"ovn-controller-bqlfl-config-xzzpr\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " pod="openstack/ovn-controller-bqlfl-config-xzzpr" Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.461582 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jmmr\" (UniqueName: \"kubernetes.io/projected/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-kube-api-access-7jmmr\") pod \"ovn-controller-bqlfl-config-xzzpr\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " pod="openstack/ovn-controller-bqlfl-config-xzzpr" Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.461656 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-scripts\") pod \"ovn-controller-bqlfl-config-xzzpr\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " pod="openstack/ovn-controller-bqlfl-config-xzzpr" Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.461713 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-var-run\") pod \"ovn-controller-bqlfl-config-xzzpr\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " pod="openstack/ovn-controller-bqlfl-config-xzzpr" Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.461731 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-var-run-ovn\") pod \"ovn-controller-bqlfl-config-xzzpr\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " pod="openstack/ovn-controller-bqlfl-config-xzzpr" Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.461745 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-var-log-ovn\") pod \"ovn-controller-bqlfl-config-xzzpr\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " pod="openstack/ovn-controller-bqlfl-config-xzzpr" Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.461794 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-additional-scripts\") pod \"ovn-controller-bqlfl-config-xzzpr\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " pod="openstack/ovn-controller-bqlfl-config-xzzpr" Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.462294 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-var-run\") pod \"ovn-controller-bqlfl-config-xzzpr\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " pod="openstack/ovn-controller-bqlfl-config-xzzpr" Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.462326 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-var-log-ovn\") pod \"ovn-controller-bqlfl-config-xzzpr\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " pod="openstack/ovn-controller-bqlfl-config-xzzpr" Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.462348 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-additional-scripts\") pod \"ovn-controller-bqlfl-config-xzzpr\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " pod="openstack/ovn-controller-bqlfl-config-xzzpr" Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.462376 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-var-run-ovn\") pod \"ovn-controller-bqlfl-config-xzzpr\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " pod="openstack/ovn-controller-bqlfl-config-xzzpr" Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.463822 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-scripts\") pod \"ovn-controller-bqlfl-config-xzzpr\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " pod="openstack/ovn-controller-bqlfl-config-xzzpr" Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.484229 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jmmr\" (UniqueName: \"kubernetes.io/projected/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-kube-api-access-7jmmr\") pod \"ovn-controller-bqlfl-config-xzzpr\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " pod="openstack/ovn-controller-bqlfl-config-xzzpr" Mar 13 09:31:04 crc kubenswrapper[4841]: I0313 09:31:04.583864 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bqlfl-config-xzzpr" Mar 13 09:31:09 crc kubenswrapper[4841]: I0313 09:31:09.026445 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-bqlfl" podUID="b2bf634d-aa4f-4773-91ee-99616e217c82" containerName="ovn-controller" probeResult="failure" output=< Mar 13 09:31:09 crc kubenswrapper[4841]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 13 09:31:09 crc kubenswrapper[4841]: > Mar 13 09:31:09 crc kubenswrapper[4841]: I0313 09:31:09.646506 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-etc-swift\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:31:09 crc kubenswrapper[4841]: I0313 09:31:09.654946 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64f62ec7-7a91-458c-86cb-7658544e4a51-etc-swift\") pod \"swift-storage-0\" (UID: \"64f62ec7-7a91-458c-86cb-7658544e4a51\") " pod="openstack/swift-storage-0" Mar 13 09:31:09 crc kubenswrapper[4841]: I0313 09:31:09.826399 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 13 09:31:10 crc kubenswrapper[4841]: I0313 09:31:10.218558 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bqlfl-config-xzzpr"] Mar 13 09:31:10 crc kubenswrapper[4841]: W0313 09:31:10.232028 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda61a9b08_e073_439c_9c7f_112ddf7d6fb1.slice/crio-f6616beb912c19321474c4673866f7f9ccadaad42168aa316be51e9703b84035 WatchSource:0}: Error finding container f6616beb912c19321474c4673866f7f9ccadaad42168aa316be51e9703b84035: Status 404 returned error can't find the container with id f6616beb912c19321474c4673866f7f9ccadaad42168aa316be51e9703b84035 Mar 13 09:31:10 crc kubenswrapper[4841]: I0313 09:31:10.567480 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:31:10 crc kubenswrapper[4841]: I0313 09:31:10.597893 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 13 09:31:10 crc kubenswrapper[4841]: I0313 09:31:10.824465 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 13 09:31:11 crc kubenswrapper[4841]: I0313 09:31:11.011598 4841 generic.go:334] "Generic (PLEG): container finished" podID="a61a9b08-e073-439c-9c7f-112ddf7d6fb1" containerID="284cd8c49caa3ae8b3d51e24c6a7d55ff566d1bfba2a5b7b1bcfd1c27bcac6fa" exitCode=0 Mar 13 09:31:11 crc kubenswrapper[4841]: I0313 09:31:11.011954 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bqlfl-config-xzzpr" event={"ID":"a61a9b08-e073-439c-9c7f-112ddf7d6fb1","Type":"ContainerDied","Data":"284cd8c49caa3ae8b3d51e24c6a7d55ff566d1bfba2a5b7b1bcfd1c27bcac6fa"} Mar 13 09:31:11 crc kubenswrapper[4841]: I0313 09:31:11.012053 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-bqlfl-config-xzzpr" event={"ID":"a61a9b08-e073-439c-9c7f-112ddf7d6fb1","Type":"ContainerStarted","Data":"f6616beb912c19321474c4673866f7f9ccadaad42168aa316be51e9703b84035"} Mar 13 09:31:11 crc kubenswrapper[4841]: I0313 09:31:11.013693 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b95pq" event={"ID":"0ef41142-9432-4e66-9008-a3c1ff35e9a8","Type":"ContainerStarted","Data":"bc99b53eeb38394454c318b0a27e8e101d60e7d5e85125e266c1b816cf338bca"} Mar 13 09:31:11 crc kubenswrapper[4841]: I0313 09:31:11.015715 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64f62ec7-7a91-458c-86cb-7658544e4a51","Type":"ContainerStarted","Data":"109990b1917f98fd0290120b58436d1236e2cab11b68ce76ea55f022ffc9e01a"} Mar 13 09:31:11 crc kubenswrapper[4841]: I0313 09:31:11.061699 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-b95pq" podStartSLOduration=2.019422736 podStartE2EDuration="14.061679884s" podCreationTimestamp="2026-03-13 09:30:57 +0000 UTC" firstStartedPulling="2026-03-13 09:30:58.072137411 +0000 UTC m=+1140.802037602" lastFinishedPulling="2026-03-13 09:31:10.114394559 +0000 UTC m=+1152.844294750" observedRunningTime="2026-03-13 09:31:11.056992331 +0000 UTC m=+1153.786892532" watchObservedRunningTime="2026-03-13 09:31:11.061679884 +0000 UTC m=+1153.791580075" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.038504 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64f62ec7-7a91-458c-86cb-7658544e4a51","Type":"ContainerStarted","Data":"6829dbf0dba430788d24acb49db93947b2d89ebabf1753de70ccd1ee49af4c99"} Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.343703 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-cd2cd"] Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.345026 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-cd2cd" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.357199 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-cd2cd"] Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.417637 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8-operator-scripts\") pod \"heat-db-create-cd2cd\" (UID: \"3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8\") " pod="openstack/heat-db-create-cd2cd" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.417712 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgqnt\" (UniqueName: \"kubernetes.io/projected/3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8-kube-api-access-qgqnt\") pod \"heat-db-create-cd2cd\" (UID: \"3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8\") " pod="openstack/heat-db-create-cd2cd" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.441555 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bqlfl-config-xzzpr" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.519174 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-additional-scripts\") pod \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.519233 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-var-run\") pod \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.519277 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-scripts\") pod \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.519381 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jmmr\" (UniqueName: \"kubernetes.io/projected/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-kube-api-access-7jmmr\") pod \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.519436 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-var-run-ovn\") pod \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.519474 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-var-log-ovn\") pod \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\" (UID: \"a61a9b08-e073-439c-9c7f-112ddf7d6fb1\") " Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.519627 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8-operator-scripts\") pod \"heat-db-create-cd2cd\" (UID: \"3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8\") " pod="openstack/heat-db-create-cd2cd" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.519672 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgqnt\" (UniqueName: \"kubernetes.io/projected/3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8-kube-api-access-qgqnt\") pod \"heat-db-create-cd2cd\" (UID: \"3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8\") " pod="openstack/heat-db-create-cd2cd" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.520567 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a61a9b08-e073-439c-9c7f-112ddf7d6fb1" (UID: "a61a9b08-e073-439c-9c7f-112ddf7d6fb1"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.520778 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a61a9b08-e073-439c-9c7f-112ddf7d6fb1" (UID: "a61a9b08-e073-439c-9c7f-112ddf7d6fb1"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.520795 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a61a9b08-e073-439c-9c7f-112ddf7d6fb1" (UID: "a61a9b08-e073-439c-9c7f-112ddf7d6fb1"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.521338 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8-operator-scripts\") pod \"heat-db-create-cd2cd\" (UID: \"3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8\") " pod="openstack/heat-db-create-cd2cd" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.521354 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-var-run" (OuterVolumeSpecName: "var-run") pod "a61a9b08-e073-439c-9c7f-112ddf7d6fb1" (UID: "a61a9b08-e073-439c-9c7f-112ddf7d6fb1"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.522215 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-a114-account-create-update-cx7pv"] Mar 13 09:31:12 crc kubenswrapper[4841]: E0313 09:31:12.522613 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a61a9b08-e073-439c-9c7f-112ddf7d6fb1" containerName="ovn-config" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.522633 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a61a9b08-e073-439c-9c7f-112ddf7d6fb1" containerName="ovn-config" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.522603 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-scripts" (OuterVolumeSpecName: "scripts") pod "a61a9b08-e073-439c-9c7f-112ddf7d6fb1" (UID: "a61a9b08-e073-439c-9c7f-112ddf7d6fb1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.522813 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-kube-api-access-7jmmr" (OuterVolumeSpecName: "kube-api-access-7jmmr") pod "a61a9b08-e073-439c-9c7f-112ddf7d6fb1" (UID: "a61a9b08-e073-439c-9c7f-112ddf7d6fb1"). InnerVolumeSpecName "kube-api-access-7jmmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.522849 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a61a9b08-e073-439c-9c7f-112ddf7d6fb1" containerName="ovn-config" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.523521 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-a114-account-create-update-cx7pv" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.581041 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.585971 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgqnt\" (UniqueName: \"kubernetes.io/projected/3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8-kube-api-access-qgqnt\") pod \"heat-db-create-cd2cd\" (UID: \"3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8\") " pod="openstack/heat-db-create-cd2cd" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.609816 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-a114-account-create-update-cx7pv"] Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.621321 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b564622-1b7c-4e13-9126-e68d7b0ad6fa-operator-scripts\") pod \"heat-a114-account-create-update-cx7pv\" (UID: \"7b564622-1b7c-4e13-9126-e68d7b0ad6fa\") " pod="openstack/heat-a114-account-create-update-cx7pv" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.621606 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft9hj\" (UniqueName: \"kubernetes.io/projected/7b564622-1b7c-4e13-9126-e68d7b0ad6fa-kube-api-access-ft9hj\") pod \"heat-a114-account-create-update-cx7pv\" (UID: \"7b564622-1b7c-4e13-9126-e68d7b0ad6fa\") " pod="openstack/heat-a114-account-create-update-cx7pv" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.621776 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jmmr\" (UniqueName: \"kubernetes.io/projected/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-kube-api-access-7jmmr\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 
09:31:12.621841 4841 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.621900 4841 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.621956 4841 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.622014 4841 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-var-run\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.622074 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a61a9b08-e073-439c-9c7f-112ddf7d6fb1-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.713858 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-cd2cd" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.716958 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-j4ks6"] Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.717968 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-j4ks6" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.723442 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b564622-1b7c-4e13-9126-e68d7b0ad6fa-operator-scripts\") pod \"heat-a114-account-create-update-cx7pv\" (UID: \"7b564622-1b7c-4e13-9126-e68d7b0ad6fa\") " pod="openstack/heat-a114-account-create-update-cx7pv" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.723500 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft9hj\" (UniqueName: \"kubernetes.io/projected/7b564622-1b7c-4e13-9126-e68d7b0ad6fa-kube-api-access-ft9hj\") pod \"heat-a114-account-create-update-cx7pv\" (UID: \"7b564622-1b7c-4e13-9126-e68d7b0ad6fa\") " pod="openstack/heat-a114-account-create-update-cx7pv" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.724085 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b564622-1b7c-4e13-9126-e68d7b0ad6fa-operator-scripts\") pod \"heat-a114-account-create-update-cx7pv\" (UID: \"7b564622-1b7c-4e13-9126-e68d7b0ad6fa\") " pod="openstack/heat-a114-account-create-update-cx7pv" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.735586 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-j4ks6"] Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.748141 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft9hj\" (UniqueName: \"kubernetes.io/projected/7b564622-1b7c-4e13-9126-e68d7b0ad6fa-kube-api-access-ft9hj\") pod \"heat-a114-account-create-update-cx7pv\" (UID: \"7b564622-1b7c-4e13-9126-e68d7b0ad6fa\") " pod="openstack/heat-a114-account-create-update-cx7pv" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.788365 4841 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/keystone-db-sync-4kxn8"] Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.789329 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4kxn8" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.794993 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.795155 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.795307 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-88dmb" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.795399 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.803276 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4kxn8"] Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.858174 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvlfn\" (UniqueName: \"kubernetes.io/projected/4000c1ec-fd5e-4449-be32-cc39edbf5d10-kube-api-access-mvlfn\") pod \"keystone-db-sync-4kxn8\" (UID: \"4000c1ec-fd5e-4449-be32-cc39edbf5d10\") " pod="openstack/keystone-db-sync-4kxn8" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.858309 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4000c1ec-fd5e-4449-be32-cc39edbf5d10-config-data\") pod \"keystone-db-sync-4kxn8\" (UID: \"4000c1ec-fd5e-4449-be32-cc39edbf5d10\") " pod="openstack/keystone-db-sync-4kxn8" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.858463 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6d484\" (UniqueName: \"kubernetes.io/projected/de1be705-392c-4454-81e1-2267d10d1535-kube-api-access-6d484\") pod \"cinder-db-create-j4ks6\" (UID: \"de1be705-392c-4454-81e1-2267d10d1535\") " pod="openstack/cinder-db-create-j4ks6" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.858505 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4000c1ec-fd5e-4449-be32-cc39edbf5d10-combined-ca-bundle\") pod \"keystone-db-sync-4kxn8\" (UID: \"4000c1ec-fd5e-4449-be32-cc39edbf5d10\") " pod="openstack/keystone-db-sync-4kxn8" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.858556 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de1be705-392c-4454-81e1-2267d10d1535-operator-scripts\") pod \"cinder-db-create-j4ks6\" (UID: \"de1be705-392c-4454-81e1-2267d10d1535\") " pod="openstack/cinder-db-create-j4ks6" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.902895 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-1811-account-create-update-cx59z"] Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.904159 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1811-account-create-update-cx59z" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.910852 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.911380 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-a114-account-create-update-cx7pv" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.920720 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-1811-account-create-update-cx59z"] Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.943475 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-wkjjs"] Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.944466 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wkjjs" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.959911 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d484\" (UniqueName: \"kubernetes.io/projected/de1be705-392c-4454-81e1-2267d10d1535-kube-api-access-6d484\") pod \"cinder-db-create-j4ks6\" (UID: \"de1be705-392c-4454-81e1-2267d10d1535\") " pod="openstack/cinder-db-create-j4ks6" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.959952 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4000c1ec-fd5e-4449-be32-cc39edbf5d10-combined-ca-bundle\") pod \"keystone-db-sync-4kxn8\" (UID: \"4000c1ec-fd5e-4449-be32-cc39edbf5d10\") " pod="openstack/keystone-db-sync-4kxn8" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.959980 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgz85\" (UniqueName: \"kubernetes.io/projected/ca7a566b-2ed4-4044-a243-2074a5dcad72-kube-api-access-zgz85\") pod \"barbican-1811-account-create-update-cx59z\" (UID: \"ca7a566b-2ed4-4044-a243-2074a5dcad72\") " pod="openstack/barbican-1811-account-create-update-cx59z" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.960003 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/de1be705-392c-4454-81e1-2267d10d1535-operator-scripts\") pod \"cinder-db-create-j4ks6\" (UID: \"de1be705-392c-4454-81e1-2267d10d1535\") " pod="openstack/cinder-db-create-j4ks6" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.960028 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvlfn\" (UniqueName: \"kubernetes.io/projected/4000c1ec-fd5e-4449-be32-cc39edbf5d10-kube-api-access-mvlfn\") pod \"keystone-db-sync-4kxn8\" (UID: \"4000c1ec-fd5e-4449-be32-cc39edbf5d10\") " pod="openstack/keystone-db-sync-4kxn8" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.960087 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4000c1ec-fd5e-4449-be32-cc39edbf5d10-config-data\") pod \"keystone-db-sync-4kxn8\" (UID: \"4000c1ec-fd5e-4449-be32-cc39edbf5d10\") " pod="openstack/keystone-db-sync-4kxn8" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.960105 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca7a566b-2ed4-4044-a243-2074a5dcad72-operator-scripts\") pod \"barbican-1811-account-create-update-cx59z\" (UID: \"ca7a566b-2ed4-4044-a243-2074a5dcad72\") " pod="openstack/barbican-1811-account-create-update-cx59z" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.960983 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de1be705-392c-4454-81e1-2267d10d1535-operator-scripts\") pod \"cinder-db-create-j4ks6\" (UID: \"de1be705-392c-4454-81e1-2267d10d1535\") " pod="openstack/cinder-db-create-j4ks6" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.965526 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4000c1ec-fd5e-4449-be32-cc39edbf5d10-combined-ca-bundle\") pod \"keystone-db-sync-4kxn8\" (UID: \"4000c1ec-fd5e-4449-be32-cc39edbf5d10\") " pod="openstack/keystone-db-sync-4kxn8" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.967676 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4000c1ec-fd5e-4449-be32-cc39edbf5d10-config-data\") pod \"keystone-db-sync-4kxn8\" (UID: \"4000c1ec-fd5e-4449-be32-cc39edbf5d10\") " pod="openstack/keystone-db-sync-4kxn8" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.975314 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wkjjs"] Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.982358 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvlfn\" (UniqueName: \"kubernetes.io/projected/4000c1ec-fd5e-4449-be32-cc39edbf5d10-kube-api-access-mvlfn\") pod \"keystone-db-sync-4kxn8\" (UID: \"4000c1ec-fd5e-4449-be32-cc39edbf5d10\") " pod="openstack/keystone-db-sync-4kxn8" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.987277 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d484\" (UniqueName: \"kubernetes.io/projected/de1be705-392c-4454-81e1-2267d10d1535-kube-api-access-6d484\") pod \"cinder-db-create-j4ks6\" (UID: \"de1be705-392c-4454-81e1-2267d10d1535\") " pod="openstack/cinder-db-create-j4ks6" Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.994325 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-0830-account-create-update-8wtfg"] Mar 13 09:31:12 crc kubenswrapper[4841]: I0313 09:31:12.995631 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0830-account-create-update-8wtfg" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.014695 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.035857 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-j4ks6" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.063114 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9834c195-9fa4-4052-b502-85d9992415c5-operator-scripts\") pod \"neutron-0830-account-create-update-8wtfg\" (UID: \"9834c195-9fa4-4052-b502-85d9992415c5\") " pod="openstack/neutron-0830-account-create-update-8wtfg" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.063469 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkw6f\" (UniqueName: \"kubernetes.io/projected/2448d2f1-6d19-4ef7-8df3-afab14941187-kube-api-access-pkw6f\") pod \"barbican-db-create-wkjjs\" (UID: \"2448d2f1-6d19-4ef7-8df3-afab14941187\") " pod="openstack/barbican-db-create-wkjjs" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.063520 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca7a566b-2ed4-4044-a243-2074a5dcad72-operator-scripts\") pod \"barbican-1811-account-create-update-cx59z\" (UID: \"ca7a566b-2ed4-4044-a243-2074a5dcad72\") " pod="openstack/barbican-1811-account-create-update-cx59z" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.063579 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2448d2f1-6d19-4ef7-8df3-afab14941187-operator-scripts\") pod 
\"barbican-db-create-wkjjs\" (UID: \"2448d2f1-6d19-4ef7-8df3-afab14941187\") " pod="openstack/barbican-db-create-wkjjs" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.063618 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqsmw\" (UniqueName: \"kubernetes.io/projected/9834c195-9fa4-4052-b502-85d9992415c5-kube-api-access-qqsmw\") pod \"neutron-0830-account-create-update-8wtfg\" (UID: \"9834c195-9fa4-4052-b502-85d9992415c5\") " pod="openstack/neutron-0830-account-create-update-8wtfg" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.063659 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgz85\" (UniqueName: \"kubernetes.io/projected/ca7a566b-2ed4-4044-a243-2074a5dcad72-kube-api-access-zgz85\") pod \"barbican-1811-account-create-update-cx59z\" (UID: \"ca7a566b-2ed4-4044-a243-2074a5dcad72\") " pod="openstack/barbican-1811-account-create-update-cx59z" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.064628 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca7a566b-2ed4-4044-a243-2074a5dcad72-operator-scripts\") pod \"barbican-1811-account-create-update-cx59z\" (UID: \"ca7a566b-2ed4-4044-a243-2074a5dcad72\") " pod="openstack/barbican-1811-account-create-update-cx59z" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.079780 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0830-account-create-update-8wtfg"] Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.101749 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64f62ec7-7a91-458c-86cb-7658544e4a51","Type":"ContainerStarted","Data":"50d33ba71b13a6cbeacc7228ad19105ccac9770b9ea50f0d0ea44476a784ced5"} Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.101803 4841 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/swift-storage-0" event={"ID":"64f62ec7-7a91-458c-86cb-7658544e4a51","Type":"ContainerStarted","Data":"9ae6d300fb815826a93ce67a1565545418153f082214da7b1ac067c0e55f7c4b"} Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.101811 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64f62ec7-7a91-458c-86cb-7658544e4a51","Type":"ContainerStarted","Data":"a12c12dc898b0160299aeb5b202cf424feabf480a1bbe065482c9e9e137510f4"} Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.103905 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgz85\" (UniqueName: \"kubernetes.io/projected/ca7a566b-2ed4-4044-a243-2074a5dcad72-kube-api-access-zgz85\") pod \"barbican-1811-account-create-update-cx59z\" (UID: \"ca7a566b-2ed4-4044-a243-2074a5dcad72\") " pod="openstack/barbican-1811-account-create-update-cx59z" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.104989 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bqlfl-config-xzzpr" event={"ID":"a61a9b08-e073-439c-9c7f-112ddf7d6fb1","Type":"ContainerDied","Data":"f6616beb912c19321474c4673866f7f9ccadaad42168aa316be51e9703b84035"} Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.105021 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6616beb912c19321474c4673866f7f9ccadaad42168aa316be51e9703b84035" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.105084 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bqlfl-config-xzzpr" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.171720 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9834c195-9fa4-4052-b502-85d9992415c5-operator-scripts\") pod \"neutron-0830-account-create-update-8wtfg\" (UID: \"9834c195-9fa4-4052-b502-85d9992415c5\") " pod="openstack/neutron-0830-account-create-update-8wtfg" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.171777 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkw6f\" (UniqueName: \"kubernetes.io/projected/2448d2f1-6d19-4ef7-8df3-afab14941187-kube-api-access-pkw6f\") pod \"barbican-db-create-wkjjs\" (UID: \"2448d2f1-6d19-4ef7-8df3-afab14941187\") " pod="openstack/barbican-db-create-wkjjs" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.171834 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2448d2f1-6d19-4ef7-8df3-afab14941187-operator-scripts\") pod \"barbican-db-create-wkjjs\" (UID: \"2448d2f1-6d19-4ef7-8df3-afab14941187\") " pod="openstack/barbican-db-create-wkjjs" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.171863 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqsmw\" (UniqueName: \"kubernetes.io/projected/9834c195-9fa4-4052-b502-85d9992415c5-kube-api-access-qqsmw\") pod \"neutron-0830-account-create-update-8wtfg\" (UID: \"9834c195-9fa4-4052-b502-85d9992415c5\") " pod="openstack/neutron-0830-account-create-update-8wtfg" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.172712 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9834c195-9fa4-4052-b502-85d9992415c5-operator-scripts\") pod 
\"neutron-0830-account-create-update-8wtfg\" (UID: \"9834c195-9fa4-4052-b502-85d9992415c5\") " pod="openstack/neutron-0830-account-create-update-8wtfg" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.173341 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2448d2f1-6d19-4ef7-8df3-afab14941187-operator-scripts\") pod \"barbican-db-create-wkjjs\" (UID: \"2448d2f1-6d19-4ef7-8df3-afab14941187\") " pod="openstack/barbican-db-create-wkjjs" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.186806 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-3885-account-create-update-5rctz"] Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.187778 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3885-account-create-update-5rctz" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.189810 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.201285 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-4kxn8" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.207184 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqsmw\" (UniqueName: \"kubernetes.io/projected/9834c195-9fa4-4052-b502-85d9992415c5-kube-api-access-qqsmw\") pod \"neutron-0830-account-create-update-8wtfg\" (UID: \"9834c195-9fa4-4052-b502-85d9992415c5\") " pod="openstack/neutron-0830-account-create-update-8wtfg" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.212193 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkw6f\" (UniqueName: \"kubernetes.io/projected/2448d2f1-6d19-4ef7-8df3-afab14941187-kube-api-access-pkw6f\") pod \"barbican-db-create-wkjjs\" (UID: \"2448d2f1-6d19-4ef7-8df3-afab14941187\") " pod="openstack/barbican-db-create-wkjjs" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.213882 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-xx9s5"] Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.215676 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xx9s5" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.244312 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xx9s5"] Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.252184 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3885-account-create-update-5rctz"] Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.257995 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-1811-account-create-update-cx59z" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.275838 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwcxt\" (UniqueName: \"kubernetes.io/projected/1dc06182-654b-4744-ac91-42013c901989-kube-api-access-dwcxt\") pod \"neutron-db-create-xx9s5\" (UID: \"1dc06182-654b-4744-ac91-42013c901989\") " pod="openstack/neutron-db-create-xx9s5" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.276167 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdx9t\" (UniqueName: \"kubernetes.io/projected/54329c28-2ae6-4b02-8b91-b182ef4e0e23-kube-api-access-gdx9t\") pod \"cinder-3885-account-create-update-5rctz\" (UID: \"54329c28-2ae6-4b02-8b91-b182ef4e0e23\") " pod="openstack/cinder-3885-account-create-update-5rctz" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.276279 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54329c28-2ae6-4b02-8b91-b182ef4e0e23-operator-scripts\") pod \"cinder-3885-account-create-update-5rctz\" (UID: \"54329c28-2ae6-4b02-8b91-b182ef4e0e23\") " pod="openstack/cinder-3885-account-create-update-5rctz" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.276358 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dc06182-654b-4744-ac91-42013c901989-operator-scripts\") pod \"neutron-db-create-xx9s5\" (UID: \"1dc06182-654b-4744-ac91-42013c901989\") " pod="openstack/neutron-db-create-xx9s5" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.381408 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/54329c28-2ae6-4b02-8b91-b182ef4e0e23-operator-scripts\") pod \"cinder-3885-account-create-update-5rctz\" (UID: \"54329c28-2ae6-4b02-8b91-b182ef4e0e23\") " pod="openstack/cinder-3885-account-create-update-5rctz" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.381699 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dc06182-654b-4744-ac91-42013c901989-operator-scripts\") pod \"neutron-db-create-xx9s5\" (UID: \"1dc06182-654b-4744-ac91-42013c901989\") " pod="openstack/neutron-db-create-xx9s5" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.381745 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwcxt\" (UniqueName: \"kubernetes.io/projected/1dc06182-654b-4744-ac91-42013c901989-kube-api-access-dwcxt\") pod \"neutron-db-create-xx9s5\" (UID: \"1dc06182-654b-4744-ac91-42013c901989\") " pod="openstack/neutron-db-create-xx9s5" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.381820 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdx9t\" (UniqueName: \"kubernetes.io/projected/54329c28-2ae6-4b02-8b91-b182ef4e0e23-kube-api-access-gdx9t\") pod \"cinder-3885-account-create-update-5rctz\" (UID: \"54329c28-2ae6-4b02-8b91-b182ef4e0e23\") " pod="openstack/cinder-3885-account-create-update-5rctz" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.382165 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54329c28-2ae6-4b02-8b91-b182ef4e0e23-operator-scripts\") pod \"cinder-3885-account-create-update-5rctz\" (UID: \"54329c28-2ae6-4b02-8b91-b182ef4e0e23\") " pod="openstack/cinder-3885-account-create-update-5rctz" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.382923 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dc06182-654b-4744-ac91-42013c901989-operator-scripts\") pod \"neutron-db-create-xx9s5\" (UID: \"1dc06182-654b-4744-ac91-42013c901989\") " pod="openstack/neutron-db-create-xx9s5" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.384614 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-cd2cd"] Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.397174 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wkjjs" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.402754 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdx9t\" (UniqueName: \"kubernetes.io/projected/54329c28-2ae6-4b02-8b91-b182ef4e0e23-kube-api-access-gdx9t\") pod \"cinder-3885-account-create-update-5rctz\" (UID: \"54329c28-2ae6-4b02-8b91-b182ef4e0e23\") " pod="openstack/cinder-3885-account-create-update-5rctz" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.403789 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwcxt\" (UniqueName: \"kubernetes.io/projected/1dc06182-654b-4744-ac91-42013c901989-kube-api-access-dwcxt\") pod \"neutron-db-create-xx9s5\" (UID: \"1dc06182-654b-4744-ac91-42013c901989\") " pod="openstack/neutron-db-create-xx9s5" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.412110 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0830-account-create-update-8wtfg" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.528286 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-a114-account-create-update-cx7pv"] Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.537849 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3885-account-create-update-5rctz" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.554919 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xx9s5" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.558706 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-bqlfl-config-xzzpr"] Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.579785 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-bqlfl-config-xzzpr"] Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.659619 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-j4ks6"] Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.670243 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-bqlfl-config-5rlj2"] Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.672602 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bqlfl-config-5rlj2" Mar 13 09:31:13 crc kubenswrapper[4841]: W0313 09:31:13.677977 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde1be705_392c_4454_81e1_2267d10d1535.slice/crio-a2dfefc9fc7620cd8a132c3f6cde52b3b378753ecb34872ed7cea34c14dc883e WatchSource:0}: Error finding container a2dfefc9fc7620cd8a132c3f6cde52b3b378753ecb34872ed7cea34c14dc883e: Status 404 returned error can't find the container with id a2dfefc9fc7620cd8a132c3f6cde52b3b378753ecb34872ed7cea34c14dc883e Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.678616 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.683908 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bqlfl-config-5rlj2"] Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.809204 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-additional-scripts\") pod \"ovn-controller-bqlfl-config-5rlj2\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " pod="openstack/ovn-controller-bqlfl-config-5rlj2" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.809666 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-592cb\" (UniqueName: \"kubernetes.io/projected/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-kube-api-access-592cb\") pod \"ovn-controller-bqlfl-config-5rlj2\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " pod="openstack/ovn-controller-bqlfl-config-5rlj2" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.809751 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-var-log-ovn\") pod \"ovn-controller-bqlfl-config-5rlj2\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " pod="openstack/ovn-controller-bqlfl-config-5rlj2" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.809800 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-var-run-ovn\") pod \"ovn-controller-bqlfl-config-5rlj2\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " pod="openstack/ovn-controller-bqlfl-config-5rlj2" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.809829 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-var-run\") pod \"ovn-controller-bqlfl-config-5rlj2\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " pod="openstack/ovn-controller-bqlfl-config-5rlj2" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.810317 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-scripts\") pod \"ovn-controller-bqlfl-config-5rlj2\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " pod="openstack/ovn-controller-bqlfl-config-5rlj2" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.826634 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4kxn8"] Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.851112 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-1811-account-create-update-cx59z"] Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.913509 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-592cb\" (UniqueName: 
\"kubernetes.io/projected/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-kube-api-access-592cb\") pod \"ovn-controller-bqlfl-config-5rlj2\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " pod="openstack/ovn-controller-bqlfl-config-5rlj2" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.913561 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-var-log-ovn\") pod \"ovn-controller-bqlfl-config-5rlj2\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " pod="openstack/ovn-controller-bqlfl-config-5rlj2" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.913590 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-var-run-ovn\") pod \"ovn-controller-bqlfl-config-5rlj2\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " pod="openstack/ovn-controller-bqlfl-config-5rlj2" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.913864 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-var-run-ovn\") pod \"ovn-controller-bqlfl-config-5rlj2\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " pod="openstack/ovn-controller-bqlfl-config-5rlj2" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.913864 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-var-log-ovn\") pod \"ovn-controller-bqlfl-config-5rlj2\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " pod="openstack/ovn-controller-bqlfl-config-5rlj2" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.913990 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-var-run\") pod \"ovn-controller-bqlfl-config-5rlj2\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " pod="openstack/ovn-controller-bqlfl-config-5rlj2" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.914066 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-scripts\") pod \"ovn-controller-bqlfl-config-5rlj2\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " pod="openstack/ovn-controller-bqlfl-config-5rlj2" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.914070 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-var-run\") pod \"ovn-controller-bqlfl-config-5rlj2\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " pod="openstack/ovn-controller-bqlfl-config-5rlj2" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.914117 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-additional-scripts\") pod \"ovn-controller-bqlfl-config-5rlj2\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " pod="openstack/ovn-controller-bqlfl-config-5rlj2" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.914891 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-additional-scripts\") pod \"ovn-controller-bqlfl-config-5rlj2\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " pod="openstack/ovn-controller-bqlfl-config-5rlj2" Mar 13 09:31:13 crc kubenswrapper[4841]: W0313 09:31:13.916303 4841 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca7a566b_2ed4_4044_a243_2074a5dcad72.slice/crio-7cd12817f30013c0599b589ba2736ce032f44e3fee487aff81c1d6d126cb2dc9 WatchSource:0}: Error finding container 7cd12817f30013c0599b589ba2736ce032f44e3fee487aff81c1d6d126cb2dc9: Status 404 returned error can't find the container with id 7cd12817f30013c0599b589ba2736ce032f44e3fee487aff81c1d6d126cb2dc9 Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.917249 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-scripts\") pod \"ovn-controller-bqlfl-config-5rlj2\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " pod="openstack/ovn-controller-bqlfl-config-5rlj2" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.933262 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-592cb\" (UniqueName: \"kubernetes.io/projected/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-kube-api-access-592cb\") pod \"ovn-controller-bqlfl-config-5rlj2\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " pod="openstack/ovn-controller-bqlfl-config-5rlj2" Mar 13 09:31:13 crc kubenswrapper[4841]: I0313 09:31:13.999047 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bqlfl-config-5rlj2" Mar 13 09:31:14 crc kubenswrapper[4841]: I0313 09:31:14.017140 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a61a9b08-e073-439c-9c7f-112ddf7d6fb1" path="/var/lib/kubelet/pods/a61a9b08-e073-439c-9c7f-112ddf7d6fb1/volumes" Mar 13 09:31:14 crc kubenswrapper[4841]: I0313 09:31:14.024219 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wkjjs"] Mar 13 09:31:14 crc kubenswrapper[4841]: I0313 09:31:14.042375 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xx9s5"] Mar 13 09:31:14 crc kubenswrapper[4841]: I0313 09:31:14.072724 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-bqlfl" Mar 13 09:31:14 crc kubenswrapper[4841]: I0313 09:31:14.124041 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-cd2cd" event={"ID":"3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8","Type":"ContainerStarted","Data":"d56b04b6f31f1967b9c274e0e5556e0c3845582d3fdda5a4c50fa9157ff947a3"} Mar 13 09:31:14 crc kubenswrapper[4841]: I0313 09:31:14.124096 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-cd2cd" event={"ID":"3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8","Type":"ContainerStarted","Data":"96560309e0e028a78b63423c82fc6bd091d546b9e5798d9002054d29f0cc2b58"} Mar 13 09:31:14 crc kubenswrapper[4841]: I0313 09:31:14.125222 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wkjjs" event={"ID":"2448d2f1-6d19-4ef7-8df3-afab14941187","Type":"ContainerStarted","Data":"ac63a773454d3c887a393ce5baa9ae804518c086a85d021e5f86713cd8619b88"} Mar 13 09:31:14 crc kubenswrapper[4841]: I0313 09:31:14.126207 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xx9s5" 
event={"ID":"1dc06182-654b-4744-ac91-42013c901989","Type":"ContainerStarted","Data":"e186462e76a1d4a1d707c07a0e573d4716d59a7881679908984845c6489c73a3"} Mar 13 09:31:14 crc kubenswrapper[4841]: I0313 09:31:14.128946 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4kxn8" event={"ID":"4000c1ec-fd5e-4449-be32-cc39edbf5d10","Type":"ContainerStarted","Data":"5c25437b713c81c8143248a8a46cda18fbf295c32d14609ad990f79d36e02eec"} Mar 13 09:31:14 crc kubenswrapper[4841]: I0313 09:31:14.130237 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a114-account-create-update-cx7pv" event={"ID":"7b564622-1b7c-4e13-9126-e68d7b0ad6fa","Type":"ContainerStarted","Data":"a3a424e41931e8019d71db683b8d0ef001fd2ba53258d630e509e80c9176d390"} Mar 13 09:31:14 crc kubenswrapper[4841]: I0313 09:31:14.135484 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-j4ks6" event={"ID":"de1be705-392c-4454-81e1-2267d10d1535","Type":"ContainerStarted","Data":"a2dfefc9fc7620cd8a132c3f6cde52b3b378753ecb34872ed7cea34c14dc883e"} Mar 13 09:31:14 crc kubenswrapper[4841]: I0313 09:31:14.136651 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1811-account-create-update-cx59z" event={"ID":"ca7a566b-2ed4-4044-a243-2074a5dcad72","Type":"ContainerStarted","Data":"7cd12817f30013c0599b589ba2736ce032f44e3fee487aff81c1d6d126cb2dc9"} Mar 13 09:31:14 crc kubenswrapper[4841]: W0313 09:31:14.196357 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9834c195_9fa4_4052_b502_85d9992415c5.slice/crio-f860b8284190207fbcb7f6dc7d91f0090298192020078a29a6cab9dbe607a24b WatchSource:0}: Error finding container f860b8284190207fbcb7f6dc7d91f0090298192020078a29a6cab9dbe607a24b: Status 404 returned error can't find the container with id f860b8284190207fbcb7f6dc7d91f0090298192020078a29a6cab9dbe607a24b Mar 13 09:31:14 crc 
kubenswrapper[4841]: I0313 09:31:14.196817 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0830-account-create-update-8wtfg"] Mar 13 09:31:14 crc kubenswrapper[4841]: I0313 09:31:14.209293 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3885-account-create-update-5rctz"] Mar 13 09:31:14 crc kubenswrapper[4841]: W0313 09:31:14.212350 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54329c28_2ae6_4b02_8b91_b182ef4e0e23.slice/crio-ed90e815741567b6b483041c26148a6983041d3be2f0ad7da54ade7b4639f388 WatchSource:0}: Error finding container ed90e815741567b6b483041c26148a6983041d3be2f0ad7da54ade7b4639f388: Status 404 returned error can't find the container with id ed90e815741567b6b483041c26148a6983041d3be2f0ad7da54ade7b4639f388 Mar 13 09:31:15 crc kubenswrapper[4841]: I0313 09:31:15.145403 4841 generic.go:334] "Generic (PLEG): container finished" podID="2448d2f1-6d19-4ef7-8df3-afab14941187" containerID="323d1d9cb0e760236c266a8610943ae89ab93b0f61ae6c3b085aaf26f1207cd3" exitCode=0 Mar 13 09:31:15 crc kubenswrapper[4841]: I0313 09:31:15.145497 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wkjjs" event={"ID":"2448d2f1-6d19-4ef7-8df3-afab14941187","Type":"ContainerDied","Data":"323d1d9cb0e760236c266a8610943ae89ab93b0f61ae6c3b085aaf26f1207cd3"} Mar 13 09:31:15 crc kubenswrapper[4841]: I0313 09:31:15.147204 4841 generic.go:334] "Generic (PLEG): container finished" podID="1dc06182-654b-4744-ac91-42013c901989" containerID="bd98d39e0f452e9b5fcd5ef818665846f27ea19723cfb7edb01e822b6f0a86ca" exitCode=0 Mar 13 09:31:15 crc kubenswrapper[4841]: I0313 09:31:15.147359 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xx9s5" event={"ID":"1dc06182-654b-4744-ac91-42013c901989","Type":"ContainerDied","Data":"bd98d39e0f452e9b5fcd5ef818665846f27ea19723cfb7edb01e822b6f0a86ca"} 
Mar 13 09:31:15 crc kubenswrapper[4841]: I0313 09:31:15.149890 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3885-account-create-update-5rctz" event={"ID":"54329c28-2ae6-4b02-8b91-b182ef4e0e23","Type":"ContainerStarted","Data":"e74b90c92a4bde3385e75d3e43913d62347d14774fa86a3c04c8a05e35485ee6"} Mar 13 09:31:15 crc kubenswrapper[4841]: I0313 09:31:15.149959 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3885-account-create-update-5rctz" event={"ID":"54329c28-2ae6-4b02-8b91-b182ef4e0e23","Type":"ContainerStarted","Data":"ed90e815741567b6b483041c26148a6983041d3be2f0ad7da54ade7b4639f388"} Mar 13 09:31:15 crc kubenswrapper[4841]: I0313 09:31:15.151726 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a114-account-create-update-cx7pv" event={"ID":"7b564622-1b7c-4e13-9126-e68d7b0ad6fa","Type":"ContainerStarted","Data":"4f78777355731824e1c5d7db6d09729a9d1bdf04886c05c557bb443dd3cfe84a"} Mar 13 09:31:15 crc kubenswrapper[4841]: I0313 09:31:15.167619 4841 generic.go:334] "Generic (PLEG): container finished" podID="de1be705-392c-4454-81e1-2267d10d1535" containerID="9e8651cd6d1d6733375ececc81f1085226fc3c0f06520d49bf4c191ea2a8d585" exitCode=0 Mar 13 09:31:15 crc kubenswrapper[4841]: I0313 09:31:15.167704 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-j4ks6" event={"ID":"de1be705-392c-4454-81e1-2267d10d1535","Type":"ContainerDied","Data":"9e8651cd6d1d6733375ececc81f1085226fc3c0f06520d49bf4c191ea2a8d585"} Mar 13 09:31:15 crc kubenswrapper[4841]: I0313 09:31:15.170095 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0830-account-create-update-8wtfg" event={"ID":"9834c195-9fa4-4052-b502-85d9992415c5","Type":"ContainerStarted","Data":"cf51ced3ebcca47dcbef8caa14a2f73160682bb0649309cf74db00642b784378"} Mar 13 09:31:15 crc kubenswrapper[4841]: I0313 09:31:15.170415 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-0830-account-create-update-8wtfg" event={"ID":"9834c195-9fa4-4052-b502-85d9992415c5","Type":"ContainerStarted","Data":"f860b8284190207fbcb7f6dc7d91f0090298192020078a29a6cab9dbe607a24b"} Mar 13 09:31:15 crc kubenswrapper[4841]: I0313 09:31:15.171611 4841 generic.go:334] "Generic (PLEG): container finished" podID="ca7a566b-2ed4-4044-a243-2074a5dcad72" containerID="92344e5d03f12c8c0c21d372021c69ea83ee6352a6ff02242725e31c3a6e08f3" exitCode=0 Mar 13 09:31:15 crc kubenswrapper[4841]: I0313 09:31:15.171657 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1811-account-create-update-cx59z" event={"ID":"ca7a566b-2ed4-4044-a243-2074a5dcad72","Type":"ContainerDied","Data":"92344e5d03f12c8c0c21d372021c69ea83ee6352a6ff02242725e31c3a6e08f3"} Mar 13 09:31:15 crc kubenswrapper[4841]: I0313 09:31:15.173604 4841 generic.go:334] "Generic (PLEG): container finished" podID="3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8" containerID="d56b04b6f31f1967b9c274e0e5556e0c3845582d3fdda5a4c50fa9157ff947a3" exitCode=0 Mar 13 09:31:15 crc kubenswrapper[4841]: I0313 09:31:15.173645 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-cd2cd" event={"ID":"3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8","Type":"ContainerDied","Data":"d56b04b6f31f1967b9c274e0e5556e0c3845582d3fdda5a4c50fa9157ff947a3"} Mar 13 09:31:15 crc kubenswrapper[4841]: I0313 09:31:15.185532 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-a114-account-create-update-cx7pv" podStartSLOduration=3.185512986 podStartE2EDuration="3.185512986s" podCreationTimestamp="2026-03-13 09:31:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:31:15.178906593 +0000 UTC m=+1157.908806784" watchObservedRunningTime="2026-03-13 09:31:15.185512986 +0000 UTC m=+1157.915413177" Mar 13 09:31:15 crc kubenswrapper[4841]: I0313 09:31:15.194552 4841 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-3885-account-create-update-5rctz" podStartSLOduration=2.194529112 podStartE2EDuration="2.194529112s" podCreationTimestamp="2026-03-13 09:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:31:15.193606333 +0000 UTC m=+1157.923506524" watchObservedRunningTime="2026-03-13 09:31:15.194529112 +0000 UTC m=+1157.924429303" Mar 13 09:31:15 crc kubenswrapper[4841]: I0313 09:31:15.264762 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-0830-account-create-update-8wtfg" podStartSLOduration=3.264726797 podStartE2EDuration="3.264726797s" podCreationTimestamp="2026-03-13 09:31:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:31:15.247599814 +0000 UTC m=+1157.977500005" watchObservedRunningTime="2026-03-13 09:31:15.264726797 +0000 UTC m=+1157.994626988" Mar 13 09:31:15 crc kubenswrapper[4841]: I0313 09:31:15.280512 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bqlfl-config-5rlj2"] Mar 13 09:31:15 crc kubenswrapper[4841]: W0313 09:31:15.289440 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdb47fc9_696c_40c7_a824_c3f9dbc3b135.slice/crio-8f5163b636469af76f016e332afe974b915323f711329f395ab4aa8dff2545de WatchSource:0}: Error finding container 8f5163b636469af76f016e332afe974b915323f711329f395ab4aa8dff2545de: Status 404 returned error can't find the container with id 8f5163b636469af76f016e332afe974b915323f711329f395ab4aa8dff2545de Mar 13 09:31:16 crc kubenswrapper[4841]: I0313 09:31:16.189518 4841 generic.go:334] "Generic (PLEG): container finished" podID="7b564622-1b7c-4e13-9126-e68d7b0ad6fa" 
containerID="4f78777355731824e1c5d7db6d09729a9d1bdf04886c05c557bb443dd3cfe84a" exitCode=0 Mar 13 09:31:16 crc kubenswrapper[4841]: I0313 09:31:16.189553 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a114-account-create-update-cx7pv" event={"ID":"7b564622-1b7c-4e13-9126-e68d7b0ad6fa","Type":"ContainerDied","Data":"4f78777355731824e1c5d7db6d09729a9d1bdf04886c05c557bb443dd3cfe84a"} Mar 13 09:31:16 crc kubenswrapper[4841]: I0313 09:31:16.196021 4841 generic.go:334] "Generic (PLEG): container finished" podID="9834c195-9fa4-4052-b502-85d9992415c5" containerID="cf51ced3ebcca47dcbef8caa14a2f73160682bb0649309cf74db00642b784378" exitCode=0 Mar 13 09:31:16 crc kubenswrapper[4841]: I0313 09:31:16.196077 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0830-account-create-update-8wtfg" event={"ID":"9834c195-9fa4-4052-b502-85d9992415c5","Type":"ContainerDied","Data":"cf51ced3ebcca47dcbef8caa14a2f73160682bb0649309cf74db00642b784378"} Mar 13 09:31:16 crc kubenswrapper[4841]: I0313 09:31:16.201204 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64f62ec7-7a91-458c-86cb-7658544e4a51","Type":"ContainerStarted","Data":"48e347f46a9230bbe9afee13c4ce0864cd358e1ac611b3f0022fc3905b24524e"} Mar 13 09:31:16 crc kubenswrapper[4841]: I0313 09:31:16.201245 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64f62ec7-7a91-458c-86cb-7658544e4a51","Type":"ContainerStarted","Data":"53c4ed93afb11f3a7a87074f6a5e13be2d4cb5541105e886e1132fd08d9d92f7"} Mar 13 09:31:16 crc kubenswrapper[4841]: I0313 09:31:16.201256 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64f62ec7-7a91-458c-86cb-7658544e4a51","Type":"ContainerStarted","Data":"88c58e99c45cb4f8c838108d3e44f0d03685c773aa332d53b616d6cb685b7322"} Mar 13 09:31:16 crc kubenswrapper[4841]: I0313 09:31:16.203222 4841 generic.go:334] "Generic (PLEG): 
container finished" podID="fdb47fc9-696c-40c7-a824-c3f9dbc3b135" containerID="db01b472907116df9f6b9a6ef92c23f89714edd78737bae1dcc5ecaba5c9325e" exitCode=0 Mar 13 09:31:16 crc kubenswrapper[4841]: I0313 09:31:16.203580 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bqlfl-config-5rlj2" event={"ID":"fdb47fc9-696c-40c7-a824-c3f9dbc3b135","Type":"ContainerDied","Data":"db01b472907116df9f6b9a6ef92c23f89714edd78737bae1dcc5ecaba5c9325e"} Mar 13 09:31:16 crc kubenswrapper[4841]: I0313 09:31:16.203608 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bqlfl-config-5rlj2" event={"ID":"fdb47fc9-696c-40c7-a824-c3f9dbc3b135","Type":"ContainerStarted","Data":"8f5163b636469af76f016e332afe974b915323f711329f395ab4aa8dff2545de"} Mar 13 09:31:16 crc kubenswrapper[4841]: I0313 09:31:16.212893 4841 generic.go:334] "Generic (PLEG): container finished" podID="54329c28-2ae6-4b02-8b91-b182ef4e0e23" containerID="e74b90c92a4bde3385e75d3e43913d62347d14774fa86a3c04c8a05e35485ee6" exitCode=0 Mar 13 09:31:16 crc kubenswrapper[4841]: I0313 09:31:16.213170 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3885-account-create-update-5rctz" event={"ID":"54329c28-2ae6-4b02-8b91-b182ef4e0e23","Type":"ContainerDied","Data":"e74b90c92a4bde3385e75d3e43913d62347d14774fa86a3c04c8a05e35485ee6"} Mar 13 09:31:17 crc kubenswrapper[4841]: I0313 09:31:17.249301 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64f62ec7-7a91-458c-86cb-7658544e4a51","Type":"ContainerStarted","Data":"2896bf4b9ed84638f3b25f104f95865e67ea341ca8dac30b92d4b9111cb7e754"} Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.026149 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-cd2cd" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.032821 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-wkjjs" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.067342 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1811-account-create-update-cx59z" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.076817 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-j4ks6" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.084597 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xx9s5" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.096350 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3885-account-create-update-5rctz" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.110736 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-a114-account-create-update-cx7pv" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.122311 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0830-account-create-update-8wtfg" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.139171 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bqlfl-config-5rlj2" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.163523 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2448d2f1-6d19-4ef7-8df3-afab14941187-operator-scripts\") pod \"2448d2f1-6d19-4ef7-8df3-afab14941187\" (UID: \"2448d2f1-6d19-4ef7-8df3-afab14941187\") " Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.163745 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgqnt\" (UniqueName: \"kubernetes.io/projected/3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8-kube-api-access-qgqnt\") pod \"3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8\" (UID: \"3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8\") " Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.163789 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8-operator-scripts\") pod \"3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8\" (UID: \"3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8\") " Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.163846 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de1be705-392c-4454-81e1-2267d10d1535-operator-scripts\") pod \"de1be705-392c-4454-81e1-2267d10d1535\" (UID: \"de1be705-392c-4454-81e1-2267d10d1535\") " Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.163872 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca7a566b-2ed4-4044-a243-2074a5dcad72-operator-scripts\") pod \"ca7a566b-2ed4-4044-a243-2074a5dcad72\" (UID: \"ca7a566b-2ed4-4044-a243-2074a5dcad72\") " Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.163908 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-gdx9t\" (UniqueName: \"kubernetes.io/projected/54329c28-2ae6-4b02-8b91-b182ef4e0e23-kube-api-access-gdx9t\") pod \"54329c28-2ae6-4b02-8b91-b182ef4e0e23\" (UID: \"54329c28-2ae6-4b02-8b91-b182ef4e0e23\") " Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.163939 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dc06182-654b-4744-ac91-42013c901989-operator-scripts\") pod \"1dc06182-654b-4744-ac91-42013c901989\" (UID: \"1dc06182-654b-4744-ac91-42013c901989\") " Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.163993 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d484\" (UniqueName: \"kubernetes.io/projected/de1be705-392c-4454-81e1-2267d10d1535-kube-api-access-6d484\") pod \"de1be705-392c-4454-81e1-2267d10d1535\" (UID: \"de1be705-392c-4454-81e1-2267d10d1535\") " Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.164032 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54329c28-2ae6-4b02-8b91-b182ef4e0e23-operator-scripts\") pod \"54329c28-2ae6-4b02-8b91-b182ef4e0e23\" (UID: \"54329c28-2ae6-4b02-8b91-b182ef4e0e23\") " Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.164116 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgz85\" (UniqueName: \"kubernetes.io/projected/ca7a566b-2ed4-4044-a243-2074a5dcad72-kube-api-access-zgz85\") pod \"ca7a566b-2ed4-4044-a243-2074a5dcad72\" (UID: \"ca7a566b-2ed4-4044-a243-2074a5dcad72\") " Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.164147 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkw6f\" (UniqueName: \"kubernetes.io/projected/2448d2f1-6d19-4ef7-8df3-afab14941187-kube-api-access-pkw6f\") 
pod \"2448d2f1-6d19-4ef7-8df3-afab14941187\" (UID: \"2448d2f1-6d19-4ef7-8df3-afab14941187\") " Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.164181 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwcxt\" (UniqueName: \"kubernetes.io/projected/1dc06182-654b-4744-ac91-42013c901989-kube-api-access-dwcxt\") pod \"1dc06182-654b-4744-ac91-42013c901989\" (UID: \"1dc06182-654b-4744-ac91-42013c901989\") " Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.164887 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de1be705-392c-4454-81e1-2267d10d1535-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "de1be705-392c-4454-81e1-2267d10d1535" (UID: "de1be705-392c-4454-81e1-2267d10d1535"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.165174 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2448d2f1-6d19-4ef7-8df3-afab14941187-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2448d2f1-6d19-4ef7-8df3-afab14941187" (UID: "2448d2f1-6d19-4ef7-8df3-afab14941187"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.165644 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2448d2f1-6d19-4ef7-8df3-afab14941187-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.165672 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de1be705-392c-4454-81e1-2267d10d1535-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.169488 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8-kube-api-access-qgqnt" (OuterVolumeSpecName: "kube-api-access-qgqnt") pod "3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8" (UID: "3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8"). InnerVolumeSpecName "kube-api-access-qgqnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.169677 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8" (UID: "3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.169981 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54329c28-2ae6-4b02-8b91-b182ef4e0e23-kube-api-access-gdx9t" (OuterVolumeSpecName: "kube-api-access-gdx9t") pod "54329c28-2ae6-4b02-8b91-b182ef4e0e23" (UID: "54329c28-2ae6-4b02-8b91-b182ef4e0e23"). InnerVolumeSpecName "kube-api-access-gdx9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.170045 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca7a566b-2ed4-4044-a243-2074a5dcad72-kube-api-access-zgz85" (OuterVolumeSpecName: "kube-api-access-zgz85") pod "ca7a566b-2ed4-4044-a243-2074a5dcad72" (UID: "ca7a566b-2ed4-4044-a243-2074a5dcad72"). InnerVolumeSpecName "kube-api-access-zgz85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.170121 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de1be705-392c-4454-81e1-2267d10d1535-kube-api-access-6d484" (OuterVolumeSpecName: "kube-api-access-6d484") pod "de1be705-392c-4454-81e1-2267d10d1535" (UID: "de1be705-392c-4454-81e1-2267d10d1535"). InnerVolumeSpecName "kube-api-access-6d484". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.171278 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54329c28-2ae6-4b02-8b91-b182ef4e0e23-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "54329c28-2ae6-4b02-8b91-b182ef4e0e23" (UID: "54329c28-2ae6-4b02-8b91-b182ef4e0e23"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.171810 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dc06182-654b-4744-ac91-42013c901989-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1dc06182-654b-4744-ac91-42013c901989" (UID: "1dc06182-654b-4744-ac91-42013c901989"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.172014 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca7a566b-2ed4-4044-a243-2074a5dcad72-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ca7a566b-2ed4-4044-a243-2074a5dcad72" (UID: "ca7a566b-2ed4-4044-a243-2074a5dcad72"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.174565 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2448d2f1-6d19-4ef7-8df3-afab14941187-kube-api-access-pkw6f" (OuterVolumeSpecName: "kube-api-access-pkw6f") pod "2448d2f1-6d19-4ef7-8df3-afab14941187" (UID: "2448d2f1-6d19-4ef7-8df3-afab14941187"). InnerVolumeSpecName "kube-api-access-pkw6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.193118 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dc06182-654b-4744-ac91-42013c901989-kube-api-access-dwcxt" (OuterVolumeSpecName: "kube-api-access-dwcxt") pod "1dc06182-654b-4744-ac91-42013c901989" (UID: "1dc06182-654b-4744-ac91-42013c901989"). InnerVolumeSpecName "kube-api-access-dwcxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.266230 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-592cb\" (UniqueName: \"kubernetes.io/projected/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-kube-api-access-592cb\") pod \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.266312 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-var-run-ovn\") pod \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.266374 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqsmw\" (UniqueName: \"kubernetes.io/projected/9834c195-9fa4-4052-b502-85d9992415c5-kube-api-access-qqsmw\") pod \"9834c195-9fa4-4052-b502-85d9992415c5\" (UID: \"9834c195-9fa4-4052-b502-85d9992415c5\") " Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.266402 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-var-log-ovn\") pod \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.266465 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-scripts\") pod \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.266534 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft9hj\" 
(UniqueName: \"kubernetes.io/projected/7b564622-1b7c-4e13-9126-e68d7b0ad6fa-kube-api-access-ft9hj\") pod \"7b564622-1b7c-4e13-9126-e68d7b0ad6fa\" (UID: \"7b564622-1b7c-4e13-9126-e68d7b0ad6fa\") " Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.266554 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-var-run\") pod \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.266602 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b564622-1b7c-4e13-9126-e68d7b0ad6fa-operator-scripts\") pod \"7b564622-1b7c-4e13-9126-e68d7b0ad6fa\" (UID: \"7b564622-1b7c-4e13-9126-e68d7b0ad6fa\") " Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.266655 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "fdb47fc9-696c-40c7-a824-c3f9dbc3b135" (UID: "fdb47fc9-696c-40c7-a824-c3f9dbc3b135"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.266660 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-additional-scripts\") pod \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\" (UID: \"fdb47fc9-696c-40c7-a824-c3f9dbc3b135\") " Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.266710 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9834c195-9fa4-4052-b502-85d9992415c5-operator-scripts\") pod \"9834c195-9fa4-4052-b502-85d9992415c5\" (UID: \"9834c195-9fa4-4052-b502-85d9992415c5\") " Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.267096 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdx9t\" (UniqueName: \"kubernetes.io/projected/54329c28-2ae6-4b02-8b91-b182ef4e0e23-kube-api-access-gdx9t\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.267127 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dc06182-654b-4744-ac91-42013c901989-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.267137 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d484\" (UniqueName: \"kubernetes.io/projected/de1be705-392c-4454-81e1-2267d10d1535-kube-api-access-6d484\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.267147 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54329c28-2ae6-4b02-8b91-b182ef4e0e23-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.267156 4841 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-zgz85\" (UniqueName: \"kubernetes.io/projected/ca7a566b-2ed4-4044-a243-2074a5dcad72-kube-api-access-zgz85\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.267164 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkw6f\" (UniqueName: \"kubernetes.io/projected/2448d2f1-6d19-4ef7-8df3-afab14941187-kube-api-access-pkw6f\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.267173 4841 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.267182 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwcxt\" (UniqueName: \"kubernetes.io/projected/1dc06182-654b-4744-ac91-42013c901989-kube-api-access-dwcxt\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.267190 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgqnt\" (UniqueName: \"kubernetes.io/projected/3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8-kube-api-access-qgqnt\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.267199 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.267208 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca7a566b-2ed4-4044-a243-2074a5dcad72-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.267217 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "fdb47fc9-696c-40c7-a824-c3f9dbc3b135" (UID: "fdb47fc9-696c-40c7-a824-c3f9dbc3b135"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.267489 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "fdb47fc9-696c-40c7-a824-c3f9dbc3b135" (UID: "fdb47fc9-696c-40c7-a824-c3f9dbc3b135"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.267525 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9834c195-9fa4-4052-b502-85d9992415c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9834c195-9fa4-4052-b502-85d9992415c5" (UID: "9834c195-9fa4-4052-b502-85d9992415c5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.268013 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-var-run" (OuterVolumeSpecName: "var-run") pod "fdb47fc9-696c-40c7-a824-c3f9dbc3b135" (UID: "fdb47fc9-696c-40c7-a824-c3f9dbc3b135"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.268459 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b564622-1b7c-4e13-9126-e68d7b0ad6fa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7b564622-1b7c-4e13-9126-e68d7b0ad6fa" (UID: "7b564622-1b7c-4e13-9126-e68d7b0ad6fa"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.268849 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-scripts" (OuterVolumeSpecName: "scripts") pod "fdb47fc9-696c-40c7-a824-c3f9dbc3b135" (UID: "fdb47fc9-696c-40c7-a824-c3f9dbc3b135"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.270190 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-kube-api-access-592cb" (OuterVolumeSpecName: "kube-api-access-592cb") pod "fdb47fc9-696c-40c7-a824-c3f9dbc3b135" (UID: "fdb47fc9-696c-40c7-a824-c3f9dbc3b135"). InnerVolumeSpecName "kube-api-access-592cb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.272484 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9834c195-9fa4-4052-b502-85d9992415c5-kube-api-access-qqsmw" (OuterVolumeSpecName: "kube-api-access-qqsmw") pod "9834c195-9fa4-4052-b502-85d9992415c5" (UID: "9834c195-9fa4-4052-b502-85d9992415c5"). InnerVolumeSpecName "kube-api-access-qqsmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.272496 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b564622-1b7c-4e13-9126-e68d7b0ad6fa-kube-api-access-ft9hj" (OuterVolumeSpecName: "kube-api-access-ft9hj") pod "7b564622-1b7c-4e13-9126-e68d7b0ad6fa" (UID: "7b564622-1b7c-4e13-9126-e68d7b0ad6fa"). InnerVolumeSpecName "kube-api-access-ft9hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.276411 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4kxn8" event={"ID":"4000c1ec-fd5e-4449-be32-cc39edbf5d10","Type":"ContainerStarted","Data":"6cc7bf9472252c5035d94042247477285d03bc0eef5eb67e353e7deaba1ac7af"} Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.277663 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a114-account-create-update-cx7pv" event={"ID":"7b564622-1b7c-4e13-9126-e68d7b0ad6fa","Type":"ContainerDied","Data":"a3a424e41931e8019d71db683b8d0ef001fd2ba53258d630e509e80c9176d390"} Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.277690 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3a424e41931e8019d71db683b8d0ef001fd2ba53258d630e509e80c9176d390" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.277692 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-a114-account-create-update-cx7pv" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.279217 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-j4ks6" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.279298 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-j4ks6" event={"ID":"de1be705-392c-4454-81e1-2267d10d1535","Type":"ContainerDied","Data":"a2dfefc9fc7620cd8a132c3f6cde52b3b378753ecb34872ed7cea34c14dc883e"} Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.279331 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2dfefc9fc7620cd8a132c3f6cde52b3b378753ecb34872ed7cea34c14dc883e" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.285322 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xx9s5" event={"ID":"1dc06182-654b-4744-ac91-42013c901989","Type":"ContainerDied","Data":"e186462e76a1d4a1d707c07a0e573d4716d59a7881679908984845c6489c73a3"} Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.285366 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e186462e76a1d4a1d707c07a0e573d4716d59a7881679908984845c6489c73a3" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.285470 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xx9s5" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.288401 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bqlfl-config-5rlj2" event={"ID":"fdb47fc9-696c-40c7-a824-c3f9dbc3b135","Type":"ContainerDied","Data":"8f5163b636469af76f016e332afe974b915323f711329f395ab4aa8dff2545de"} Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.288438 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f5163b636469af76f016e332afe974b915323f711329f395ab4aa8dff2545de" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.288439 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bqlfl-config-5rlj2" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.293761 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3885-account-create-update-5rctz" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.293760 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3885-account-create-update-5rctz" event={"ID":"54329c28-2ae6-4b02-8b91-b182ef4e0e23","Type":"ContainerDied","Data":"ed90e815741567b6b483041c26148a6983041d3be2f0ad7da54ade7b4639f388"} Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.293909 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed90e815741567b6b483041c26148a6983041d3be2f0ad7da54ade7b4639f388" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.295741 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0830-account-create-update-8wtfg" event={"ID":"9834c195-9fa4-4052-b502-85d9992415c5","Type":"ContainerDied","Data":"f860b8284190207fbcb7f6dc7d91f0090298192020078a29a6cab9dbe607a24b"} Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.295773 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f860b8284190207fbcb7f6dc7d91f0090298192020078a29a6cab9dbe607a24b" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.295802 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0830-account-create-update-8wtfg" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.297574 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1811-account-create-update-cx59z" event={"ID":"ca7a566b-2ed4-4044-a243-2074a5dcad72","Type":"ContainerDied","Data":"7cd12817f30013c0599b589ba2736ce032f44e3fee487aff81c1d6d126cb2dc9"} Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.297604 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cd12817f30013c0599b589ba2736ce032f44e3fee487aff81c1d6d126cb2dc9" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.297659 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1811-account-create-update-cx59z" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.299874 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-4kxn8" podStartSLOduration=2.285886277 podStartE2EDuration="8.299853353s" podCreationTimestamp="2026-03-13 09:31:12 +0000 UTC" firstStartedPulling="2026-03-13 09:31:13.869897822 +0000 UTC m=+1156.599798013" lastFinishedPulling="2026-03-13 09:31:19.883864898 +0000 UTC m=+1162.613765089" observedRunningTime="2026-03-13 09:31:20.290380444 +0000 UTC m=+1163.020280635" watchObservedRunningTime="2026-03-13 09:31:20.299853353 +0000 UTC m=+1163.029753544" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.300360 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-cd2cd" event={"ID":"3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8","Type":"ContainerDied","Data":"96560309e0e028a78b63423c82fc6bd091d546b9e5798d9002054d29f0cc2b58"} Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.300407 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96560309e0e028a78b63423c82fc6bd091d546b9e5798d9002054d29f0cc2b58" Mar 13 09:31:20 crc 
kubenswrapper[4841]: I0313 09:31:20.300432 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-cd2cd" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.303032 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wkjjs" event={"ID":"2448d2f1-6d19-4ef7-8df3-afab14941187","Type":"ContainerDied","Data":"ac63a773454d3c887a393ce5baa9ae804518c086a85d021e5f86713cd8619b88"} Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.303051 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac63a773454d3c887a393ce5baa9ae804518c086a85d021e5f86713cd8619b88" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.303193 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wkjjs" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.368595 4841 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.368621 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9834c195-9fa4-4052-b502-85d9992415c5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.368632 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-592cb\" (UniqueName: \"kubernetes.io/projected/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-kube-api-access-592cb\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.368643 4841 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:20 crc 
kubenswrapper[4841]: I0313 09:31:20.368653 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqsmw\" (UniqueName: \"kubernetes.io/projected/9834c195-9fa4-4052-b502-85d9992415c5-kube-api-access-qqsmw\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.368661 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.368672 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft9hj\" (UniqueName: \"kubernetes.io/projected/7b564622-1b7c-4e13-9126-e68d7b0ad6fa-kube-api-access-ft9hj\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.368682 4841 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fdb47fc9-696c-40c7-a824-c3f9dbc3b135-var-run\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:20 crc kubenswrapper[4841]: I0313 09:31:20.368690 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b564622-1b7c-4e13-9126-e68d7b0ad6fa-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:21 crc kubenswrapper[4841]: I0313 09:31:21.242428 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-bqlfl-config-5rlj2"] Mar 13 09:31:21 crc kubenswrapper[4841]: I0313 09:31:21.250919 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-bqlfl-config-5rlj2"] Mar 13 09:31:21 crc kubenswrapper[4841]: I0313 09:31:21.314291 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64f62ec7-7a91-458c-86cb-7658544e4a51","Type":"ContainerStarted","Data":"b14abce4a82072e13225d96ed5c2037028f9a01ab300d9a2fa9e2a3fc0f8748f"} Mar 13 09:31:21 crc 
kubenswrapper[4841]: I0313 09:31:21.314356 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64f62ec7-7a91-458c-86cb-7658544e4a51","Type":"ContainerStarted","Data":"97fcc99c94b1a45f73228cc8b9da76006f32c40c699da36a5b5bd5549874fdbf"} Mar 13 09:31:21 crc kubenswrapper[4841]: I0313 09:31:21.314370 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64f62ec7-7a91-458c-86cb-7658544e4a51","Type":"ContainerStarted","Data":"2ea08e4298ab196f7d82eab6eea0f68bc02d9652201e1cc6ee2562f5c621ef73"} Mar 13 09:31:21 crc kubenswrapper[4841]: I0313 09:31:21.314380 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64f62ec7-7a91-458c-86cb-7658544e4a51","Type":"ContainerStarted","Data":"181a95e00d1d2c36510d46f0fcf8b22804971c78bf03858ceb13aefedffcd790"} Mar 13 09:31:22 crc kubenswrapper[4841]: I0313 09:31:22.019439 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdb47fc9-696c-40c7-a824-c3f9dbc3b135" path="/var/lib/kubelet/pods/fdb47fc9-696c-40c7-a824-c3f9dbc3b135/volumes" Mar 13 09:31:22 crc kubenswrapper[4841]: I0313 09:31:22.332481 4841 generic.go:334] "Generic (PLEG): container finished" podID="0ef41142-9432-4e66-9008-a3c1ff35e9a8" containerID="bc99b53eeb38394454c318b0a27e8e101d60e7d5e85125e266c1b816cf338bca" exitCode=0 Mar 13 09:31:22 crc kubenswrapper[4841]: I0313 09:31:22.332616 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b95pq" event={"ID":"0ef41142-9432-4e66-9008-a3c1ff35e9a8","Type":"ContainerDied","Data":"bc99b53eeb38394454c318b0a27e8e101d60e7d5e85125e266c1b816cf338bca"} Mar 13 09:31:22 crc kubenswrapper[4841]: I0313 09:31:22.343018 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64f62ec7-7a91-458c-86cb-7658544e4a51","Type":"ContainerStarted","Data":"92e69fcbaf1e1c6163fd0dcaa93be8364c54d0c0ed829eb13f290cacecabf058"} 
Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.356592 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64f62ec7-7a91-458c-86cb-7658544e4a51","Type":"ContainerStarted","Data":"b58be038f4a9c169cf5d2b9dadae41b609011b5f0336e82afeda3e51f65a7acd"} Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.357227 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64f62ec7-7a91-458c-86cb-7658544e4a51","Type":"ContainerStarted","Data":"356450216bcb8b523d6a40a2b154b4cd223a77d73b81ab9d7fb5db7c8dc7f144"} Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.663676 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.74058463 podStartE2EDuration="47.663658723s" podCreationTimestamp="2026-03-13 09:30:36 +0000 UTC" firstStartedPulling="2026-03-13 09:31:10.599609121 +0000 UTC m=+1153.329509322" lastFinishedPulling="2026-03-13 09:31:20.522683214 +0000 UTC m=+1163.252583415" observedRunningTime="2026-03-13 09:31:23.408809893 +0000 UTC m=+1166.138710084" watchObservedRunningTime="2026-03-13 09:31:23.663658723 +0000 UTC m=+1166.393558914" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.669356 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-l6bx5"] Mar 13 09:31:23 crc kubenswrapper[4841]: E0313 09:31:23.669651 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54329c28-2ae6-4b02-8b91-b182ef4e0e23" containerName="mariadb-account-create-update" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.669663 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="54329c28-2ae6-4b02-8b91-b182ef4e0e23" containerName="mariadb-account-create-update" Mar 13 09:31:23 crc kubenswrapper[4841]: E0313 09:31:23.669673 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc06182-654b-4744-ac91-42013c901989" 
containerName="mariadb-database-create" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.669680 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc06182-654b-4744-ac91-42013c901989" containerName="mariadb-database-create" Mar 13 09:31:23 crc kubenswrapper[4841]: E0313 09:31:23.669692 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2448d2f1-6d19-4ef7-8df3-afab14941187" containerName="mariadb-database-create" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.669699 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="2448d2f1-6d19-4ef7-8df3-afab14941187" containerName="mariadb-database-create" Mar 13 09:31:23 crc kubenswrapper[4841]: E0313 09:31:23.669712 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca7a566b-2ed4-4044-a243-2074a5dcad72" containerName="mariadb-account-create-update" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.669719 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca7a566b-2ed4-4044-a243-2074a5dcad72" containerName="mariadb-account-create-update" Mar 13 09:31:23 crc kubenswrapper[4841]: E0313 09:31:23.669732 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8" containerName="mariadb-database-create" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.669738 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8" containerName="mariadb-database-create" Mar 13 09:31:23 crc kubenswrapper[4841]: E0313 09:31:23.669750 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9834c195-9fa4-4052-b502-85d9992415c5" containerName="mariadb-account-create-update" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.669756 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9834c195-9fa4-4052-b502-85d9992415c5" containerName="mariadb-account-create-update" Mar 13 09:31:23 crc kubenswrapper[4841]: E0313 09:31:23.669766 4841 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdb47fc9-696c-40c7-a824-c3f9dbc3b135" containerName="ovn-config" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.669772 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdb47fc9-696c-40c7-a824-c3f9dbc3b135" containerName="ovn-config" Mar 13 09:31:23 crc kubenswrapper[4841]: E0313 09:31:23.669780 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1be705-392c-4454-81e1-2267d10d1535" containerName="mariadb-database-create" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.669785 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1be705-392c-4454-81e1-2267d10d1535" containerName="mariadb-database-create" Mar 13 09:31:23 crc kubenswrapper[4841]: E0313 09:31:23.669795 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b564622-1b7c-4e13-9126-e68d7b0ad6fa" containerName="mariadb-account-create-update" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.669800 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b564622-1b7c-4e13-9126-e68d7b0ad6fa" containerName="mariadb-account-create-update" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.669935 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca7a566b-2ed4-4044-a243-2074a5dcad72" containerName="mariadb-account-create-update" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.669946 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8" containerName="mariadb-database-create" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.669957 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b564622-1b7c-4e13-9126-e68d7b0ad6fa" containerName="mariadb-account-create-update" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.669967 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc06182-654b-4744-ac91-42013c901989" 
containerName="mariadb-database-create" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.669975 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="54329c28-2ae6-4b02-8b91-b182ef4e0e23" containerName="mariadb-account-create-update" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.669983 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdb47fc9-696c-40c7-a824-c3f9dbc3b135" containerName="ovn-config" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.669991 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="2448d2f1-6d19-4ef7-8df3-afab14941187" containerName="mariadb-database-create" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.670000 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="9834c195-9fa4-4052-b502-85d9992415c5" containerName="mariadb-account-create-update" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.670012 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="de1be705-392c-4454-81e1-2267d10d1535" containerName="mariadb-database-create" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.672506 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.674788 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.698675 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-l6bx5"] Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.731501 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-l6bx5\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.731575 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-dns-svc\") pod \"dnsmasq-dns-764c5664d7-l6bx5\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.731615 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-l6bx5\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.731739 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-l6bx5\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " 
pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.731901 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-config\") pod \"dnsmasq-dns-764c5664d7-l6bx5\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.731971 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhwxq\" (UniqueName: \"kubernetes.io/projected/075e48dc-cde1-482b-9c40-db92967af922-kube-api-access-hhwxq\") pod \"dnsmasq-dns-764c5664d7-l6bx5\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.817383 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-b95pq" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.833842 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-l6bx5\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.833919 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-dns-svc\") pod \"dnsmasq-dns-764c5664d7-l6bx5\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.833953 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-l6bx5\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.833984 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-l6bx5\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.834032 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-config\") pod \"dnsmasq-dns-764c5664d7-l6bx5\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.834064 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhwxq\" (UniqueName: \"kubernetes.io/projected/075e48dc-cde1-482b-9c40-db92967af922-kube-api-access-hhwxq\") pod \"dnsmasq-dns-764c5664d7-l6bx5\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.834906 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-l6bx5\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.835076 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-dns-svc\") pod \"dnsmasq-dns-764c5664d7-l6bx5\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.835240 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-l6bx5\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.835241 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-config\") pod \"dnsmasq-dns-764c5664d7-l6bx5\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.835863 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-l6bx5\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.858006 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhwxq\" (UniqueName: \"kubernetes.io/projected/075e48dc-cde1-482b-9c40-db92967af922-kube-api-access-hhwxq\") pod \"dnsmasq-dns-764c5664d7-l6bx5\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.935177 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pq6s\" (UniqueName: 
\"kubernetes.io/projected/0ef41142-9432-4e66-9008-a3c1ff35e9a8-kube-api-access-4pq6s\") pod \"0ef41142-9432-4e66-9008-a3c1ff35e9a8\" (UID: \"0ef41142-9432-4e66-9008-a3c1ff35e9a8\") " Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.935327 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ef41142-9432-4e66-9008-a3c1ff35e9a8-config-data\") pod \"0ef41142-9432-4e66-9008-a3c1ff35e9a8\" (UID: \"0ef41142-9432-4e66-9008-a3c1ff35e9a8\") " Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.935448 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ef41142-9432-4e66-9008-a3c1ff35e9a8-combined-ca-bundle\") pod \"0ef41142-9432-4e66-9008-a3c1ff35e9a8\" (UID: \"0ef41142-9432-4e66-9008-a3c1ff35e9a8\") " Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.935509 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ef41142-9432-4e66-9008-a3c1ff35e9a8-db-sync-config-data\") pod \"0ef41142-9432-4e66-9008-a3c1ff35e9a8\" (UID: \"0ef41142-9432-4e66-9008-a3c1ff35e9a8\") " Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.948701 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ef41142-9432-4e66-9008-a3c1ff35e9a8-kube-api-access-4pq6s" (OuterVolumeSpecName: "kube-api-access-4pq6s") pod "0ef41142-9432-4e66-9008-a3c1ff35e9a8" (UID: "0ef41142-9432-4e66-9008-a3c1ff35e9a8"). InnerVolumeSpecName "kube-api-access-4pq6s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.948894 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ef41142-9432-4e66-9008-a3c1ff35e9a8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0ef41142-9432-4e66-9008-a3c1ff35e9a8" (UID: "0ef41142-9432-4e66-9008-a3c1ff35e9a8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.968408 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ef41142-9432-4e66-9008-a3c1ff35e9a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ef41142-9432-4e66-9008-a3c1ff35e9a8" (UID: "0ef41142-9432-4e66-9008-a3c1ff35e9a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:31:23 crc kubenswrapper[4841]: I0313 09:31:23.994242 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ef41142-9432-4e66-9008-a3c1ff35e9a8-config-data" (OuterVolumeSpecName: "config-data") pod "0ef41142-9432-4e66-9008-a3c1ff35e9a8" (UID: "0ef41142-9432-4e66-9008-a3c1ff35e9a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.000136 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.038581 4841 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ef41142-9432-4e66-9008-a3c1ff35e9a8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.038627 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pq6s\" (UniqueName: \"kubernetes.io/projected/0ef41142-9432-4e66-9008-a3c1ff35e9a8-kube-api-access-4pq6s\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.038638 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ef41142-9432-4e66-9008-a3c1ff35e9a8-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.038647 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ef41142-9432-4e66-9008-a3c1ff35e9a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.365820 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-b95pq" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.365820 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b95pq" event={"ID":"0ef41142-9432-4e66-9008-a3c1ff35e9a8","Type":"ContainerDied","Data":"4814b6aa18d375627fddbb60174ecf7af831e69fa0e8f573cef4e504880f98f2"} Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.365904 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4814b6aa18d375627fddbb60174ecf7af831e69fa0e8f573cef4e504880f98f2" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.436877 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-l6bx5"] Mar 13 09:31:24 crc kubenswrapper[4841]: W0313 09:31:24.442841 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod075e48dc_cde1_482b_9c40_db92967af922.slice/crio-93546ac325f2620ac1607f8f470ade57d29e9c4fa3d04d3abe5e276d1f5965de WatchSource:0}: Error finding container 93546ac325f2620ac1607f8f470ade57d29e9c4fa3d04d3abe5e276d1f5965de: Status 404 returned error can't find the container with id 93546ac325f2620ac1607f8f470ade57d29e9c4fa3d04d3abe5e276d1f5965de Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.769173 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-l6bx5"] Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.801220 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-zxvqm"] Mar 13 09:31:24 crc kubenswrapper[4841]: E0313 09:31:24.801580 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ef41142-9432-4e66-9008-a3c1ff35e9a8" containerName="glance-db-sync" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.801597 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ef41142-9432-4e66-9008-a3c1ff35e9a8" 
containerName="glance-db-sync" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.801740 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ef41142-9432-4e66-9008-a3c1ff35e9a8" containerName="glance-db-sync" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.802509 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.834565 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-zxvqm"] Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.850212 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn6nh\" (UniqueName: \"kubernetes.io/projected/df9290ff-b191-4f7f-b013-d4bdd2be42db-kube-api-access-zn6nh\") pod \"dnsmasq-dns-74f6bcbc87-zxvqm\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.850304 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-zxvqm\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.850327 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-zxvqm\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.850358 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-zxvqm\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.850423 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-zxvqm\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.850446 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-config\") pod \"dnsmasq-dns-74f6bcbc87-zxvqm\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.952318 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn6nh\" (UniqueName: \"kubernetes.io/projected/df9290ff-b191-4f7f-b013-d4bdd2be42db-kube-api-access-zn6nh\") pod \"dnsmasq-dns-74f6bcbc87-zxvqm\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.952403 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-zxvqm\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.952424 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-zxvqm\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.952456 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-zxvqm\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.952489 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-zxvqm\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.952511 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-config\") pod \"dnsmasq-dns-74f6bcbc87-zxvqm\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.953567 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-zxvqm\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.953675 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-config\") pod 
\"dnsmasq-dns-74f6bcbc87-zxvqm\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.953678 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-zxvqm\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.953791 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-zxvqm\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.954259 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-zxvqm\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:24 crc kubenswrapper[4841]: I0313 09:31:24.970836 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn6nh\" (UniqueName: \"kubernetes.io/projected/df9290ff-b191-4f7f-b013-d4bdd2be42db-kube-api-access-zn6nh\") pod \"dnsmasq-dns-74f6bcbc87-zxvqm\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:25 crc kubenswrapper[4841]: I0313 09:31:25.153466 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:25 crc kubenswrapper[4841]: I0313 09:31:25.379513 4841 generic.go:334] "Generic (PLEG): container finished" podID="075e48dc-cde1-482b-9c40-db92967af922" containerID="1b5cdd1b927c405527b579d28b46020cadab89005285d5ae48d1305e24ac1c19" exitCode=0 Mar 13 09:31:25 crc kubenswrapper[4841]: I0313 09:31:25.379642 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" event={"ID":"075e48dc-cde1-482b-9c40-db92967af922","Type":"ContainerDied","Data":"1b5cdd1b927c405527b579d28b46020cadab89005285d5ae48d1305e24ac1c19"} Mar 13 09:31:25 crc kubenswrapper[4841]: I0313 09:31:25.380142 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" event={"ID":"075e48dc-cde1-482b-9c40-db92967af922","Type":"ContainerStarted","Data":"93546ac325f2620ac1607f8f470ade57d29e9c4fa3d04d3abe5e276d1f5965de"} Mar 13 09:31:25 crc kubenswrapper[4841]: I0313 09:31:25.382819 4841 generic.go:334] "Generic (PLEG): container finished" podID="4000c1ec-fd5e-4449-be32-cc39edbf5d10" containerID="6cc7bf9472252c5035d94042247477285d03bc0eef5eb67e353e7deaba1ac7af" exitCode=0 Mar 13 09:31:25 crc kubenswrapper[4841]: I0313 09:31:25.382859 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4kxn8" event={"ID":"4000c1ec-fd5e-4449-be32-cc39edbf5d10","Type":"ContainerDied","Data":"6cc7bf9472252c5035d94042247477285d03bc0eef5eb67e353e7deaba1ac7af"} Mar 13 09:31:25 crc kubenswrapper[4841]: E0313 09:31:25.557524 4841 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 13 09:31:25 crc kubenswrapper[4841]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/075e48dc-cde1-482b-9c40-db92967af922/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 13 09:31:25 crc kubenswrapper[4841]: > 
podSandboxID="93546ac325f2620ac1607f8f470ade57d29e9c4fa3d04d3abe5e276d1f5965de" Mar 13 09:31:25 crc kubenswrapper[4841]: E0313 09:31:25.558017 4841 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 09:31:25 crc kubenswrapper[4841]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n66chbbh56dh7fhfh68chf9hfdhbdh587h5b9h568h68fh77h5b5h559h577h687h574h5d5h584h8chd9hb4h66h566h545h699h564h568h66fhc9q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,}
,VolumeMount{Name:kube-api-access-hhwxq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-764c5664d7-l6bx5_openstack(075e48dc-cde1-482b-9c40-db92967af922): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/075e48dc-cde1-482b-9c40-db92967af922/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 13 09:31:25 crc kubenswrapper[4841]: > logger="UnhandledError" Mar 13 09:31:25 crc kubenswrapper[4841]: E0313 09:31:25.559177 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount 
`/var/lib/kubelet/pods/075e48dc-cde1-482b-9c40-db92967af922/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" podUID="075e48dc-cde1-482b-9c40-db92967af922" Mar 13 09:31:25 crc kubenswrapper[4841]: I0313 09:31:25.606702 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-zxvqm"] Mar 13 09:31:25 crc kubenswrapper[4841]: W0313 09:31:25.609442 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf9290ff_b191_4f7f_b013_d4bdd2be42db.slice/crio-bfe701c9c4ab4ae3e095c71be0859440cd7894744baecd5dfc868d314fc89486 WatchSource:0}: Error finding container bfe701c9c4ab4ae3e095c71be0859440cd7894744baecd5dfc868d314fc89486: Status 404 returned error can't find the container with id bfe701c9c4ab4ae3e095c71be0859440cd7894744baecd5dfc868d314fc89486 Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.404322 4841 generic.go:334] "Generic (PLEG): container finished" podID="df9290ff-b191-4f7f-b013-d4bdd2be42db" containerID="25ccf9465768e8d7df84692f3278c200307657facc2671b368e9cea6c3faf721" exitCode=0 Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.404393 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" event={"ID":"df9290ff-b191-4f7f-b013-d4bdd2be42db","Type":"ContainerDied","Data":"25ccf9465768e8d7df84692f3278c200307657facc2671b368e9cea6c3faf721"} Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.404684 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" event={"ID":"df9290ff-b191-4f7f-b013-d4bdd2be42db","Type":"ContainerStarted","Data":"bfe701c9c4ab4ae3e095c71be0859440cd7894744baecd5dfc868d314fc89486"} Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.670130 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.794482 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4kxn8" Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.796005 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-ovsdbserver-sb\") pod \"075e48dc-cde1-482b-9c40-db92967af922\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.796040 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhwxq\" (UniqueName: \"kubernetes.io/projected/075e48dc-cde1-482b-9c40-db92967af922-kube-api-access-hhwxq\") pod \"075e48dc-cde1-482b-9c40-db92967af922\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.796112 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-dns-swift-storage-0\") pod \"075e48dc-cde1-482b-9c40-db92967af922\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.796132 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-dns-svc\") pod \"075e48dc-cde1-482b-9c40-db92967af922\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.796148 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-config\") pod \"075e48dc-cde1-482b-9c40-db92967af922\" (UID: 
\"075e48dc-cde1-482b-9c40-db92967af922\") " Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.796199 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-ovsdbserver-nb\") pod \"075e48dc-cde1-482b-9c40-db92967af922\" (UID: \"075e48dc-cde1-482b-9c40-db92967af922\") " Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.802617 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/075e48dc-cde1-482b-9c40-db92967af922-kube-api-access-hhwxq" (OuterVolumeSpecName: "kube-api-access-hhwxq") pod "075e48dc-cde1-482b-9c40-db92967af922" (UID: "075e48dc-cde1-482b-9c40-db92967af922"). InnerVolumeSpecName "kube-api-access-hhwxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.848798 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "075e48dc-cde1-482b-9c40-db92967af922" (UID: "075e48dc-cde1-482b-9c40-db92967af922"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.860935 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "075e48dc-cde1-482b-9c40-db92967af922" (UID: "075e48dc-cde1-482b-9c40-db92967af922"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.862153 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-config" (OuterVolumeSpecName: "config") pod "075e48dc-cde1-482b-9c40-db92967af922" (UID: "075e48dc-cde1-482b-9c40-db92967af922"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.865279 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "075e48dc-cde1-482b-9c40-db92967af922" (UID: "075e48dc-cde1-482b-9c40-db92967af922"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.880050 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "075e48dc-cde1-482b-9c40-db92967af922" (UID: "075e48dc-cde1-482b-9c40-db92967af922"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.897588 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4000c1ec-fd5e-4449-be32-cc39edbf5d10-combined-ca-bundle\") pod \"4000c1ec-fd5e-4449-be32-cc39edbf5d10\" (UID: \"4000c1ec-fd5e-4449-be32-cc39edbf5d10\") " Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.897664 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvlfn\" (UniqueName: \"kubernetes.io/projected/4000c1ec-fd5e-4449-be32-cc39edbf5d10-kube-api-access-mvlfn\") pod \"4000c1ec-fd5e-4449-be32-cc39edbf5d10\" (UID: \"4000c1ec-fd5e-4449-be32-cc39edbf5d10\") " Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.897718 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4000c1ec-fd5e-4449-be32-cc39edbf5d10-config-data\") pod \"4000c1ec-fd5e-4449-be32-cc39edbf5d10\" (UID: \"4000c1ec-fd5e-4449-be32-cc39edbf5d10\") " Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.898158 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.898174 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhwxq\" (UniqueName: \"kubernetes.io/projected/075e48dc-cde1-482b-9c40-db92967af922-kube-api-access-hhwxq\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.898187 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:26 crc kubenswrapper[4841]: 
I0313 09:31:26.898209 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.898219 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.898227 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/075e48dc-cde1-482b-9c40-db92967af922-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.900555 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4000c1ec-fd5e-4449-be32-cc39edbf5d10-kube-api-access-mvlfn" (OuterVolumeSpecName: "kube-api-access-mvlfn") pod "4000c1ec-fd5e-4449-be32-cc39edbf5d10" (UID: "4000c1ec-fd5e-4449-be32-cc39edbf5d10"). InnerVolumeSpecName "kube-api-access-mvlfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.925544 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4000c1ec-fd5e-4449-be32-cc39edbf5d10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4000c1ec-fd5e-4449-be32-cc39edbf5d10" (UID: "4000c1ec-fd5e-4449-be32-cc39edbf5d10"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.936448 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4000c1ec-fd5e-4449-be32-cc39edbf5d10-config-data" (OuterVolumeSpecName: "config-data") pod "4000c1ec-fd5e-4449-be32-cc39edbf5d10" (UID: "4000c1ec-fd5e-4449-be32-cc39edbf5d10"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.999735 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4000c1ec-fd5e-4449-be32-cc39edbf5d10-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.999768 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvlfn\" (UniqueName: \"kubernetes.io/projected/4000c1ec-fd5e-4449-be32-cc39edbf5d10-kube-api-access-mvlfn\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:26 crc kubenswrapper[4841]: I0313 09:31:26.999780 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4000c1ec-fd5e-4449-be32-cc39edbf5d10-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.413855 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" event={"ID":"df9290ff-b191-4f7f-b013-d4bdd2be42db","Type":"ContainerStarted","Data":"f3627f52358b56a53f859ed7cf48322d3e9335c57585dd4021ba3613ebe35b52"} Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.415622 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.415743 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-l6bx5" event={"ID":"075e48dc-cde1-482b-9c40-db92967af922","Type":"ContainerDied","Data":"93546ac325f2620ac1607f8f470ade57d29e9c4fa3d04d3abe5e276d1f5965de"} Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.415908 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.415942 4841 scope.go:117] "RemoveContainer" containerID="1b5cdd1b927c405527b579d28b46020cadab89005285d5ae48d1305e24ac1c19" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.419290 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4kxn8" event={"ID":"4000c1ec-fd5e-4449-be32-cc39edbf5d10","Type":"ContainerDied","Data":"5c25437b713c81c8143248a8a46cda18fbf295c32d14609ad990f79d36e02eec"} Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.419481 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c25437b713c81c8143248a8a46cda18fbf295c32d14609ad990f79d36e02eec" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.419934 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-4kxn8" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.435893 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" podStartSLOduration=3.435878585 podStartE2EDuration="3.435878585s" podCreationTimestamp="2026-03-13 09:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:31:27.434735181 +0000 UTC m=+1170.164635372" watchObservedRunningTime="2026-03-13 09:31:27.435878585 +0000 UTC m=+1170.165778776" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.549326 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-l6bx5"] Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.564825 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-l6bx5"] Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.701669 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-zxvqm"] Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.708336 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-bqgds"] Mar 13 09:31:27 crc kubenswrapper[4841]: E0313 09:31:27.708678 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="075e48dc-cde1-482b-9c40-db92967af922" containerName="init" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.708694 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="075e48dc-cde1-482b-9c40-db92967af922" containerName="init" Mar 13 09:31:27 crc kubenswrapper[4841]: E0313 09:31:27.708720 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4000c1ec-fd5e-4449-be32-cc39edbf5d10" containerName="keystone-db-sync" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.708728 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4000c1ec-fd5e-4449-be32-cc39edbf5d10" containerName="keystone-db-sync" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.708877 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="075e48dc-cde1-482b-9c40-db92967af922" containerName="init" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.708894 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="4000c1ec-fd5e-4449-be32-cc39edbf5d10" containerName="keystone-db-sync" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.709392 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bqgds" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.711714 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.711718 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-88dmb" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.711924 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.712136 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.737828 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bqgds"] Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.745375 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.799788 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-s8v8c"] Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.801652 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.809950 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-s8v8c"] Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.812229 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-config-data\") pod \"keystone-bootstrap-bqgds\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") " pod="openstack/keystone-bootstrap-bqgds" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.815256 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq8d9\" (UniqueName: \"kubernetes.io/projected/2198213d-5d6d-4438-a171-82745eb1f35f-kube-api-access-cq8d9\") pod \"keystone-bootstrap-bqgds\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") " pod="openstack/keystone-bootstrap-bqgds" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.815913 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-scripts\") pod \"keystone-bootstrap-bqgds\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") " pod="openstack/keystone-bootstrap-bqgds" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.816093 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-fernet-keys\") pod \"keystone-bootstrap-bqgds\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") " pod="openstack/keystone-bootstrap-bqgds" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.816175 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-credential-keys\") pod \"keystone-bootstrap-bqgds\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") " pod="openstack/keystone-bootstrap-bqgds" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.816239 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-combined-ca-bundle\") pod \"keystone-bootstrap-bqgds\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") " pod="openstack/keystone-bootstrap-bqgds" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.849294 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-q6zfh"] Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.850505 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-q6zfh" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.853609 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.853804 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-zzq55" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.866786 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-q6zfh"] Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.917596 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-ppvgx"] Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.919038 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-ppvgx" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.919167 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-s8v8c\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.919216 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-s8v8c\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.919246 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-dns-svc\") pod \"dnsmasq-dns-847c4cc679-s8v8c\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.919361 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-config-data\") pod \"keystone-bootstrap-bqgds\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") " pod="openstack/keystone-bootstrap-bqgds" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.919403 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq8d9\" (UniqueName: \"kubernetes.io/projected/2198213d-5d6d-4438-a171-82745eb1f35f-kube-api-access-cq8d9\") pod \"keystone-bootstrap-bqgds\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") " 
pod="openstack/keystone-bootstrap-bqgds" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.919469 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-scripts\") pod \"keystone-bootstrap-bqgds\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") " pod="openstack/keystone-bootstrap-bqgds" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.919502 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkv4n\" (UniqueName: \"kubernetes.io/projected/0d0fe070-7383-427a-895a-f4116dd15ec3-kube-api-access-tkv4n\") pod \"dnsmasq-dns-847c4cc679-s8v8c\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.919538 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2038e7ba-1de4-49b4-95dd-b2f3cde7be45-config-data\") pod \"heat-db-sync-q6zfh\" (UID: \"2038e7ba-1de4-49b4-95dd-b2f3cde7be45\") " pod="openstack/heat-db-sync-q6zfh" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.919566 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gdks\" (UniqueName: \"kubernetes.io/projected/2038e7ba-1de4-49b4-95dd-b2f3cde7be45-kube-api-access-6gdks\") pod \"heat-db-sync-q6zfh\" (UID: \"2038e7ba-1de4-49b4-95dd-b2f3cde7be45\") " pod="openstack/heat-db-sync-q6zfh" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.919590 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-fernet-keys\") pod \"keystone-bootstrap-bqgds\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") " pod="openstack/keystone-bootstrap-bqgds" Mar 13 09:31:27 crc 
kubenswrapper[4841]: I0313 09:31:27.919610 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-s8v8c\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.919639 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-credential-keys\") pod \"keystone-bootstrap-bqgds\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") " pod="openstack/keystone-bootstrap-bqgds" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.919659 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-combined-ca-bundle\") pod \"keystone-bootstrap-bqgds\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") " pod="openstack/keystone-bootstrap-bqgds" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.919699 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2038e7ba-1de4-49b4-95dd-b2f3cde7be45-combined-ca-bundle\") pod \"heat-db-sync-q6zfh\" (UID: \"2038e7ba-1de4-49b4-95dd-b2f3cde7be45\") " pod="openstack/heat-db-sync-q6zfh" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.919734 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-config\") pod \"dnsmasq-dns-847c4cc679-s8v8c\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 
09:31:27.921676 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-ppvgx"] Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.925173 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-75sb5" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.925497 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.925751 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.927231 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-scripts\") pod \"keystone-bootstrap-bqgds\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") " pod="openstack/keystone-bootstrap-bqgds" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.928399 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-combined-ca-bundle\") pod \"keystone-bootstrap-bqgds\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") " pod="openstack/keystone-bootstrap-bqgds" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.933282 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-config-data\") pod \"keystone-bootstrap-bqgds\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") " pod="openstack/keystone-bootstrap-bqgds" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.933850 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-fernet-keys\") pod \"keystone-bootstrap-bqgds\" (UID: 
\"2198213d-5d6d-4438-a171-82745eb1f35f\") " pod="openstack/keystone-bootstrap-bqgds" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.934647 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-credential-keys\") pod \"keystone-bootstrap-bqgds\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") " pod="openstack/keystone-bootstrap-bqgds" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.957845 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq8d9\" (UniqueName: \"kubernetes.io/projected/2198213d-5d6d-4438-a171-82745eb1f35f-kube-api-access-cq8d9\") pod \"keystone-bootstrap-bqgds\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") " pod="openstack/keystone-bootstrap-bqgds" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.976772 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.979123 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.986012 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 09:31:27 crc kubenswrapper[4841]: I0313 09:31:27.986307 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.021678 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-s8v8c\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.021734 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-s8v8c\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.021771 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-dns-svc\") pod \"dnsmasq-dns-847c4cc679-s8v8c\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.021924 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e627e4b-0a39-42af-be45-147ff230fd13-combined-ca-bundle\") pod \"neutron-db-sync-ppvgx\" (UID: \"2e627e4b-0a39-42af-be45-147ff230fd13\") " pod="openstack/neutron-db-sync-ppvgx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.021955 
4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkv4n\" (UniqueName: \"kubernetes.io/projected/0d0fe070-7383-427a-895a-f4116dd15ec3-kube-api-access-tkv4n\") pod \"dnsmasq-dns-847c4cc679-s8v8c\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.022010 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fx9c\" (UniqueName: \"kubernetes.io/projected/2e627e4b-0a39-42af-be45-147ff230fd13-kube-api-access-9fx9c\") pod \"neutron-db-sync-ppvgx\" (UID: \"2e627e4b-0a39-42af-be45-147ff230fd13\") " pod="openstack/neutron-db-sync-ppvgx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.022041 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2038e7ba-1de4-49b4-95dd-b2f3cde7be45-config-data\") pod \"heat-db-sync-q6zfh\" (UID: \"2038e7ba-1de4-49b4-95dd-b2f3cde7be45\") " pod="openstack/heat-db-sync-q6zfh" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.022086 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gdks\" (UniqueName: \"kubernetes.io/projected/2038e7ba-1de4-49b4-95dd-b2f3cde7be45-kube-api-access-6gdks\") pod \"heat-db-sync-q6zfh\" (UID: \"2038e7ba-1de4-49b4-95dd-b2f3cde7be45\") " pod="openstack/heat-db-sync-q6zfh" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.022121 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-s8v8c\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.022170 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2e627e4b-0a39-42af-be45-147ff230fd13-config\") pod \"neutron-db-sync-ppvgx\" (UID: \"2e627e4b-0a39-42af-be45-147ff230fd13\") " pod="openstack/neutron-db-sync-ppvgx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.022225 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2038e7ba-1de4-49b4-95dd-b2f3cde7be45-combined-ca-bundle\") pod \"heat-db-sync-q6zfh\" (UID: \"2038e7ba-1de4-49b4-95dd-b2f3cde7be45\") " pod="openstack/heat-db-sync-q6zfh" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.022285 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-config\") pod \"dnsmasq-dns-847c4cc679-s8v8c\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.022981 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-s8v8c\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.023119 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-config\") pod \"dnsmasq-dns-847c4cc679-s8v8c\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.023535 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-s8v8c\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.023792 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-dns-svc\") pod \"dnsmasq-dns-847c4cc679-s8v8c\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.024630 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-s8v8c\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.029599 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2038e7ba-1de4-49b4-95dd-b2f3cde7be45-combined-ca-bundle\") pod \"heat-db-sync-q6zfh\" (UID: \"2038e7ba-1de4-49b4-95dd-b2f3cde7be45\") " pod="openstack/heat-db-sync-q6zfh" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.039931 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2038e7ba-1de4-49b4-95dd-b2f3cde7be45-config-data\") pod \"heat-db-sync-q6zfh\" (UID: \"2038e7ba-1de4-49b4-95dd-b2f3cde7be45\") " pod="openstack/heat-db-sync-q6zfh" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.040748 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="075e48dc-cde1-482b-9c40-db92967af922" path="/var/lib/kubelet/pods/075e48dc-cde1-482b-9c40-db92967af922/volumes" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 
09:31:28.041331 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.044018 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-9nnr7"] Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.045146 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9nnr7" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.055590 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.055836 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-52rnh" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.055941 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.067015 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkv4n\" (UniqueName: \"kubernetes.io/projected/0d0fe070-7383-427a-895a-f4116dd15ec3-kube-api-access-tkv4n\") pod \"dnsmasq-dns-847c4cc679-s8v8c\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.074648 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bqgds" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.075544 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gdks\" (UniqueName: \"kubernetes.io/projected/2038e7ba-1de4-49b4-95dd-b2f3cde7be45-kube-api-access-6gdks\") pod \"heat-db-sync-q6zfh\" (UID: \"2038e7ba-1de4-49b4-95dd-b2f3cde7be45\") " pod="openstack/heat-db-sync-q6zfh" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.093312 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9nnr7"] Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.107237 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-gg7dx"] Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.108378 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gg7dx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.113798 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-tg6zz" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.114137 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.123940 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fx9c\" (UniqueName: \"kubernetes.io/projected/2e627e4b-0a39-42af-be45-147ff230fd13-kube-api-access-9fx9c\") pod \"neutron-db-sync-ppvgx\" (UID: \"2e627e4b-0a39-42af-be45-147ff230fd13\") " pod="openstack/neutron-db-sync-ppvgx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.124306 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d602e6f-77e5-4496-b426-2c003dad63e4-etc-machine-id\") pod \"cinder-db-sync-9nnr7\" (UID: 
\"5d602e6f-77e5-4496-b426-2c003dad63e4\") " pod="openstack/cinder-db-sync-9nnr7" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.124331 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-scripts\") pod \"ceilometer-0\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " pod="openstack/ceilometer-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.124398 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvqt9\" (UniqueName: \"kubernetes.io/projected/72841340-b4e1-4283-8eb0-10641fb61f62-kube-api-access-vvqt9\") pod \"ceilometer-0\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " pod="openstack/ceilometer-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.124431 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72841340-b4e1-4283-8eb0-10641fb61f62-log-httpd\") pod \"ceilometer-0\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " pod="openstack/ceilometer-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.124452 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2e627e4b-0a39-42af-be45-147ff230fd13-config\") pod \"neutron-db-sync-ppvgx\" (UID: \"2e627e4b-0a39-42af-be45-147ff230fd13\") " pod="openstack/neutron-db-sync-ppvgx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.124475 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-config-data\") pod \"cinder-db-sync-9nnr7\" (UID: \"5d602e6f-77e5-4496-b426-2c003dad63e4\") " pod="openstack/cinder-db-sync-9nnr7" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.124508 
4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-config-data\") pod \"ceilometer-0\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " pod="openstack/ceilometer-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.124531 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhm7f\" (UniqueName: \"kubernetes.io/projected/5d602e6f-77e5-4496-b426-2c003dad63e4-kube-api-access-qhm7f\") pod \"cinder-db-sync-9nnr7\" (UID: \"5d602e6f-77e5-4496-b426-2c003dad63e4\") " pod="openstack/cinder-db-sync-9nnr7" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.124548 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-scripts\") pod \"cinder-db-sync-9nnr7\" (UID: \"5d602e6f-77e5-4496-b426-2c003dad63e4\") " pod="openstack/cinder-db-sync-9nnr7" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.124605 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-combined-ca-bundle\") pod \"cinder-db-sync-9nnr7\" (UID: \"5d602e6f-77e5-4496-b426-2c003dad63e4\") " pod="openstack/cinder-db-sync-9nnr7" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.124645 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-db-sync-config-data\") pod \"cinder-db-sync-9nnr7\" (UID: \"5d602e6f-77e5-4496-b426-2c003dad63e4\") " pod="openstack/cinder-db-sync-9nnr7" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.124675 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " pod="openstack/ceilometer-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.124742 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72841340-b4e1-4283-8eb0-10641fb61f62-run-httpd\") pod \"ceilometer-0\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " pod="openstack/ceilometer-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.124768 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e627e4b-0a39-42af-be45-147ff230fd13-combined-ca-bundle\") pod \"neutron-db-sync-ppvgx\" (UID: \"2e627e4b-0a39-42af-be45-147ff230fd13\") " pod="openstack/neutron-db-sync-ppvgx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.124804 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " pod="openstack/ceilometer-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.136219 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2e627e4b-0a39-42af-be45-147ff230fd13-config\") pod \"neutron-db-sync-ppvgx\" (UID: \"2e627e4b-0a39-42af-be45-147ff230fd13\") " pod="openstack/neutron-db-sync-ppvgx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.143121 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2e627e4b-0a39-42af-be45-147ff230fd13-combined-ca-bundle\") pod \"neutron-db-sync-ppvgx\" (UID: \"2e627e4b-0a39-42af-be45-147ff230fd13\") " pod="openstack/neutron-db-sync-ppvgx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.152141 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fx9c\" (UniqueName: \"kubernetes.io/projected/2e627e4b-0a39-42af-be45-147ff230fd13-kube-api-access-9fx9c\") pod \"neutron-db-sync-ppvgx\" (UID: \"2e627e4b-0a39-42af-be45-147ff230fd13\") " pod="openstack/neutron-db-sync-ppvgx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.152210 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gg7dx"] Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.153617 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.153731 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-phwrx"] Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.155150 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-phwrx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.165805 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.166007 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.166124 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-nbnrq" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.185628 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-q6zfh" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.219650 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-phwrx"] Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.226091 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb20acbd-2346-46cd-baba-089a6afed51b-scripts\") pod \"placement-db-sync-phwrx\" (UID: \"cb20acbd-2346-46cd-baba-089a6afed51b\") " pod="openstack/placement-db-sync-phwrx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.226155 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72841340-b4e1-4283-8eb0-10641fb61f62-run-httpd\") pod \"ceilometer-0\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " pod="openstack/ceilometer-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.226184 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " pod="openstack/ceilometer-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.226224 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-scripts\") pod \"ceilometer-0\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " pod="openstack/ceilometer-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.226239 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d602e6f-77e5-4496-b426-2c003dad63e4-etc-machine-id\") pod \"cinder-db-sync-9nnr7\" (UID: \"5d602e6f-77e5-4496-b426-2c003dad63e4\") " 
pod="openstack/cinder-db-sync-9nnr7" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.226315 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzqbd\" (UniqueName: \"kubernetes.io/projected/cb20acbd-2346-46cd-baba-089a6afed51b-kube-api-access-tzqbd\") pod \"placement-db-sync-phwrx\" (UID: \"cb20acbd-2346-46cd-baba-089a6afed51b\") " pod="openstack/placement-db-sync-phwrx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.226335 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvqt9\" (UniqueName: \"kubernetes.io/projected/72841340-b4e1-4283-8eb0-10641fb61f62-kube-api-access-vvqt9\") pod \"ceilometer-0\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " pod="openstack/ceilometer-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.226353 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72841340-b4e1-4283-8eb0-10641fb61f62-log-httpd\") pod \"ceilometer-0\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " pod="openstack/ceilometer-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.226395 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb20acbd-2346-46cd-baba-089a6afed51b-logs\") pod \"placement-db-sync-phwrx\" (UID: \"cb20acbd-2346-46cd-baba-089a6afed51b\") " pod="openstack/placement-db-sync-phwrx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.226434 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-config-data\") pod \"cinder-db-sync-9nnr7\" (UID: \"5d602e6f-77e5-4496-b426-2c003dad63e4\") " pod="openstack/cinder-db-sync-9nnr7" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.226497 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-config-data\") pod \"ceilometer-0\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " pod="openstack/ceilometer-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.226533 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhm7f\" (UniqueName: \"kubernetes.io/projected/5d602e6f-77e5-4496-b426-2c003dad63e4-kube-api-access-qhm7f\") pod \"cinder-db-sync-9nnr7\" (UID: \"5d602e6f-77e5-4496-b426-2c003dad63e4\") " pod="openstack/cinder-db-sync-9nnr7" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.226552 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plcp4\" (UniqueName: \"kubernetes.io/projected/9132ca8c-f2de-4025-8462-4899276a8678-kube-api-access-plcp4\") pod \"barbican-db-sync-gg7dx\" (UID: \"9132ca8c-f2de-4025-8462-4899276a8678\") " pod="openstack/barbican-db-sync-gg7dx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.226571 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-scripts\") pod \"cinder-db-sync-9nnr7\" (UID: \"5d602e6f-77e5-4496-b426-2c003dad63e4\") " pod="openstack/cinder-db-sync-9nnr7" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.226588 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb20acbd-2346-46cd-baba-089a6afed51b-config-data\") pod \"placement-db-sync-phwrx\" (UID: \"cb20acbd-2346-46cd-baba-089a6afed51b\") " pod="openstack/placement-db-sync-phwrx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.226626 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9132ca8c-f2de-4025-8462-4899276a8678-db-sync-config-data\") pod \"barbican-db-sync-gg7dx\" (UID: \"9132ca8c-f2de-4025-8462-4899276a8678\") " pod="openstack/barbican-db-sync-gg7dx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.226647 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9132ca8c-f2de-4025-8462-4899276a8678-combined-ca-bundle\") pod \"barbican-db-sync-gg7dx\" (UID: \"9132ca8c-f2de-4025-8462-4899276a8678\") " pod="openstack/barbican-db-sync-gg7dx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.226666 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb20acbd-2346-46cd-baba-089a6afed51b-combined-ca-bundle\") pod \"placement-db-sync-phwrx\" (UID: \"cb20acbd-2346-46cd-baba-089a6afed51b\") " pod="openstack/placement-db-sync-phwrx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.226709 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-combined-ca-bundle\") pod \"cinder-db-sync-9nnr7\" (UID: \"5d602e6f-77e5-4496-b426-2c003dad63e4\") " pod="openstack/cinder-db-sync-9nnr7" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.226740 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-db-sync-config-data\") pod \"cinder-db-sync-9nnr7\" (UID: \"5d602e6f-77e5-4496-b426-2c003dad63e4\") " pod="openstack/cinder-db-sync-9nnr7" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.226782 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " pod="openstack/ceilometer-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.232170 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " pod="openstack/ceilometer-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.236211 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " pod="openstack/ceilometer-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.237802 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72841340-b4e1-4283-8eb0-10641fb61f62-run-httpd\") pod \"ceilometer-0\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " pod="openstack/ceilometer-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.240952 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-scripts\") pod \"ceilometer-0\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " pod="openstack/ceilometer-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.242997 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-scripts\") pod \"cinder-db-sync-9nnr7\" (UID: \"5d602e6f-77e5-4496-b426-2c003dad63e4\") " pod="openstack/cinder-db-sync-9nnr7" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.244289 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d602e6f-77e5-4496-b426-2c003dad63e4-etc-machine-id\") pod \"cinder-db-sync-9nnr7\" (UID: \"5d602e6f-77e5-4496-b426-2c003dad63e4\") " pod="openstack/cinder-db-sync-9nnr7" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.246773 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72841340-b4e1-4283-8eb0-10641fb61f62-log-httpd\") pod \"ceilometer-0\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " pod="openstack/ceilometer-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.247582 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-config-data\") pod \"cinder-db-sync-9nnr7\" (UID: \"5d602e6f-77e5-4496-b426-2c003dad63e4\") " pod="openstack/cinder-db-sync-9nnr7" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.248229 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-combined-ca-bundle\") pod \"cinder-db-sync-9nnr7\" (UID: \"5d602e6f-77e5-4496-b426-2c003dad63e4\") " pod="openstack/cinder-db-sync-9nnr7" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.250305 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-config-data\") pod \"ceilometer-0\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " pod="openstack/ceilometer-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.259060 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-db-sync-config-data\") pod \"cinder-db-sync-9nnr7\" (UID: 
\"5d602e6f-77e5-4496-b426-2c003dad63e4\") " pod="openstack/cinder-db-sync-9nnr7" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.262588 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-s8v8c"] Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.264926 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhm7f\" (UniqueName: \"kubernetes.io/projected/5d602e6f-77e5-4496-b426-2c003dad63e4-kube-api-access-qhm7f\") pod \"cinder-db-sync-9nnr7\" (UID: \"5d602e6f-77e5-4496-b426-2c003dad63e4\") " pod="openstack/cinder-db-sync-9nnr7" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.285539 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvqt9\" (UniqueName: \"kubernetes.io/projected/72841340-b4e1-4283-8eb0-10641fb61f62-kube-api-access-vvqt9\") pod \"ceilometer-0\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " pod="openstack/ceilometer-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.296600 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pzf48"] Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.297998 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.313371 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pzf48"] Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.328080 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzqbd\" (UniqueName: \"kubernetes.io/projected/cb20acbd-2346-46cd-baba-089a6afed51b-kube-api-access-tzqbd\") pod \"placement-db-sync-phwrx\" (UID: \"cb20acbd-2346-46cd-baba-089a6afed51b\") " pod="openstack/placement-db-sync-phwrx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.328119 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb20acbd-2346-46cd-baba-089a6afed51b-logs\") pod \"placement-db-sync-phwrx\" (UID: \"cb20acbd-2346-46cd-baba-089a6afed51b\") " pod="openstack/placement-db-sync-phwrx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.328172 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plcp4\" (UniqueName: \"kubernetes.io/projected/9132ca8c-f2de-4025-8462-4899276a8678-kube-api-access-plcp4\") pod \"barbican-db-sync-gg7dx\" (UID: \"9132ca8c-f2de-4025-8462-4899276a8678\") " pod="openstack/barbican-db-sync-gg7dx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.328200 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb20acbd-2346-46cd-baba-089a6afed51b-config-data\") pod \"placement-db-sync-phwrx\" (UID: \"cb20acbd-2346-46cd-baba-089a6afed51b\") " pod="openstack/placement-db-sync-phwrx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.328217 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/9132ca8c-f2de-4025-8462-4899276a8678-db-sync-config-data\") pod \"barbican-db-sync-gg7dx\" (UID: \"9132ca8c-f2de-4025-8462-4899276a8678\") " pod="openstack/barbican-db-sync-gg7dx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.328237 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9132ca8c-f2de-4025-8462-4899276a8678-combined-ca-bundle\") pod \"barbican-db-sync-gg7dx\" (UID: \"9132ca8c-f2de-4025-8462-4899276a8678\") " pod="openstack/barbican-db-sync-gg7dx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.328256 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb20acbd-2346-46cd-baba-089a6afed51b-combined-ca-bundle\") pod \"placement-db-sync-phwrx\" (UID: \"cb20acbd-2346-46cd-baba-089a6afed51b\") " pod="openstack/placement-db-sync-phwrx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.328353 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb20acbd-2346-46cd-baba-089a6afed51b-scripts\") pod \"placement-db-sync-phwrx\" (UID: \"cb20acbd-2346-46cd-baba-089a6afed51b\") " pod="openstack/placement-db-sync-phwrx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.329134 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb20acbd-2346-46cd-baba-089a6afed51b-logs\") pod \"placement-db-sync-phwrx\" (UID: \"cb20acbd-2346-46cd-baba-089a6afed51b\") " pod="openstack/placement-db-sync-phwrx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.332568 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb20acbd-2346-46cd-baba-089a6afed51b-scripts\") pod \"placement-db-sync-phwrx\" (UID: \"cb20acbd-2346-46cd-baba-089a6afed51b\") " 
pod="openstack/placement-db-sync-phwrx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.333085 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb20acbd-2346-46cd-baba-089a6afed51b-combined-ca-bundle\") pod \"placement-db-sync-phwrx\" (UID: \"cb20acbd-2346-46cd-baba-089a6afed51b\") " pod="openstack/placement-db-sync-phwrx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.334142 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb20acbd-2346-46cd-baba-089a6afed51b-config-data\") pod \"placement-db-sync-phwrx\" (UID: \"cb20acbd-2346-46cd-baba-089a6afed51b\") " pod="openstack/placement-db-sync-phwrx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.338236 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9132ca8c-f2de-4025-8462-4899276a8678-db-sync-config-data\") pod \"barbican-db-sync-gg7dx\" (UID: \"9132ca8c-f2de-4025-8462-4899276a8678\") " pod="openstack/barbican-db-sync-gg7dx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.340184 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-ppvgx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.342507 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9132ca8c-f2de-4025-8462-4899276a8678-combined-ca-bundle\") pod \"barbican-db-sync-gg7dx\" (UID: \"9132ca8c-f2de-4025-8462-4899276a8678\") " pod="openstack/barbican-db-sync-gg7dx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.350214 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plcp4\" (UniqueName: \"kubernetes.io/projected/9132ca8c-f2de-4025-8462-4899276a8678-kube-api-access-plcp4\") pod \"barbican-db-sync-gg7dx\" (UID: \"9132ca8c-f2de-4025-8462-4899276a8678\") " pod="openstack/barbican-db-sync-gg7dx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.372296 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzqbd\" (UniqueName: \"kubernetes.io/projected/cb20acbd-2346-46cd-baba-089a6afed51b-kube-api-access-tzqbd\") pod \"placement-db-sync-phwrx\" (UID: \"cb20acbd-2346-46cd-baba-089a6afed51b\") " pod="openstack/placement-db-sync-phwrx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.429818 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-pzf48\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.429868 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-pzf48\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.430586 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-pzf48\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.430625 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv56c\" (UniqueName: \"kubernetes.io/projected/489aeb87-1810-4eab-adbb-a0047e598344-kube-api-access-fv56c\") pod \"dnsmasq-dns-785d8bcb8c-pzf48\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.430695 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-config\") pod \"dnsmasq-dns-785d8bcb8c-pzf48\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.430781 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-pzf48\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.513531 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.539981 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-pzf48\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.540034 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-pzf48\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.540068 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv56c\" (UniqueName: \"kubernetes.io/projected/489aeb87-1810-4eab-adbb-a0047e598344-kube-api-access-fv56c\") pod \"dnsmasq-dns-785d8bcb8c-pzf48\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.540452 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-config\") pod \"dnsmasq-dns-785d8bcb8c-pzf48\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.540579 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-pzf48\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" Mar 13 09:31:28 
crc kubenswrapper[4841]: I0313 09:31:28.540733 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-pzf48\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.541572 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-config\") pod \"dnsmasq-dns-785d8bcb8c-pzf48\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.548284 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-pzf48\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.548508 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-pzf48\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.548688 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-pzf48\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.548764 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-pzf48\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.555862 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv56c\" (UniqueName: \"kubernetes.io/projected/489aeb87-1810-4eab-adbb-a0047e598344-kube-api-access-fv56c\") pod \"dnsmasq-dns-785d8bcb8c-pzf48\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.564689 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9nnr7" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.570323 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gg7dx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.590356 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-phwrx" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.614947 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.652803 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bqgds"] Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.856164 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-s8v8c"] Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.867973 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-q6zfh"] Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.872960 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.874965 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.877011 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.877035 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-nwmvp" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.880897 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.888119 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.947968 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:28 crc 
kubenswrapper[4841]: I0313 09:31:28.948175 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.948286 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.948372 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.948446 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.948515 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-logs\") pod \"glance-default-internal-api-0\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:28 crc 
kubenswrapper[4841]: I0313 09:31:28.948618 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbqcr\" (UniqueName: \"kubernetes.io/projected/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-kube-api-access-dbqcr\") pod \"glance-default-internal-api-0\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:28 crc kubenswrapper[4841]: I0313 09:31:28.970214 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-ppvgx"] Mar 13 09:31:28 crc kubenswrapper[4841]: W0313 09:31:28.986137 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e627e4b_0a39_42af_be45_147ff230fd13.slice/crio-dce3abeb728846b745aabf912a55f32dd8b79a4c4388a679b7bb2eaa43b3e4fe WatchSource:0}: Error finding container dce3abeb728846b745aabf912a55f32dd8b79a4c4388a679b7bb2eaa43b3e4fe: Status 404 returned error can't find the container with id dce3abeb728846b745aabf912a55f32dd8b79a4c4388a679b7bb2eaa43b3e4fe Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.006122 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.009653 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.013173 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.028115 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.050375 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-logs\") pod \"glance-default-internal-api-0\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.050670 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.050752 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbqcr\" (UniqueName: \"kubernetes.io/projected/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-kube-api-access-dbqcr\") pod \"glance-default-internal-api-0\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.050779 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc 
kubenswrapper[4841]: I0313 09:31:29.050845 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-logs\") pod \"glance-default-external-api-0\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.050866 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-config-data\") pod \"glance-default-external-api-0\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.050914 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.050937 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.050952 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-scripts\") pod \"glance-default-external-api-0\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 
09:31:29.050975 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxp8s\" (UniqueName: \"kubernetes.io/projected/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-kube-api-access-cxp8s\") pod \"glance-default-external-api-0\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.050992 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.051041 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.051056 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.051083 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.052248 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-logs\") pod \"glance-default-internal-api-0\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.052760 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.053802 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.060037 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.061561 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.063108 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.067150 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbqcr\" (UniqueName: \"kubernetes.io/projected/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-kube-api-access-dbqcr\") pod \"glance-default-internal-api-0\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.092980 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.111480 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.152235 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-config-data\") pod \"glance-default-external-api-0\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.152320 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.152342 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-scripts\") pod \"glance-default-external-api-0\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.152368 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxp8s\" (UniqueName: \"kubernetes.io/projected/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-kube-api-access-cxp8s\") pod \"glance-default-external-api-0\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.152418 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.152456 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.152500 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-logs\") pod \"glance-default-external-api-0\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.152878 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-logs\") pod \"glance-default-external-api-0\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.153134 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.153327 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.159622 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.160565 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-scripts\") pod \"glance-default-external-api-0\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.163120 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.167571 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxp8s\" (UniqueName: \"kubernetes.io/projected/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-kube-api-access-cxp8s\") pod \"glance-default-external-api-0\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.180880 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.207876 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.214296 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9nnr7"] Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.256786 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.312232 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gg7dx"] Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.419011 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pzf48"] Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.434251 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-phwrx"] Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.455741 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72841340-b4e1-4283-8eb0-10641fb61f62","Type":"ContainerStarted","Data":"314e66748aa4a3184d585c74f92f37c74e4269b18adc3facf3453d706f8f1d9f"} Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.465749 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9nnr7" event={"ID":"5d602e6f-77e5-4496-b426-2c003dad63e4","Type":"ContainerStarted","Data":"067cc3a2a3c41d4b61f67df2132488adc85c1d0a138838543c38e05474cf96c0"} Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.468208 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bqgds" event={"ID":"2198213d-5d6d-4438-a171-82745eb1f35f","Type":"ContainerStarted","Data":"fcc0f5be86f2b0b69e3cb23458c7bfde9de25b14095a0935003dd99ab7c1dc29"} Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.468233 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bqgds" event={"ID":"2198213d-5d6d-4438-a171-82745eb1f35f","Type":"ContainerStarted","Data":"a11e67e228d8811fbcd4a914fcb1bf80e261025c6ddae055248a8b68e1b25166"} Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.471793 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ppvgx" 
event={"ID":"2e627e4b-0a39-42af-be45-147ff230fd13","Type":"ContainerStarted","Data":"6b6a3e351fb7f6d9a0b8d502a1bdcd8091852d33875656e47426bdce532fa49c"} Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.471829 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ppvgx" event={"ID":"2e627e4b-0a39-42af-be45-147ff230fd13","Type":"ContainerStarted","Data":"dce3abeb728846b745aabf912a55f32dd8b79a4c4388a679b7bb2eaa43b3e4fe"} Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.479736 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gg7dx" event={"ID":"9132ca8c-f2de-4025-8462-4899276a8678","Type":"ContainerStarted","Data":"849b29aa05af1d8195aad8aa53c63df15de5a1814e486acab5f9d23939c040d0"} Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.490152 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-bqgds" podStartSLOduration=2.490135496 podStartE2EDuration="2.490135496s" podCreationTimestamp="2026-03-13 09:31:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:31:29.483442962 +0000 UTC m=+1172.213343153" watchObservedRunningTime="2026-03-13 09:31:29.490135496 +0000 UTC m=+1172.220035687" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.491956 4841 generic.go:334] "Generic (PLEG): container finished" podID="0d0fe070-7383-427a-895a-f4116dd15ec3" containerID="a5ddcccf9c86651dfbd2a4919398459d925f127de0a054d01d02b534b311666c" exitCode=0 Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.492206 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" event={"ID":"0d0fe070-7383-427a-895a-f4116dd15ec3","Type":"ContainerDied","Data":"a5ddcccf9c86651dfbd2a4919398459d925f127de0a054d01d02b534b311666c"} Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.492259 4841 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" event={"ID":"0d0fe070-7383-427a-895a-f4116dd15ec3","Type":"ContainerStarted","Data":"8895565bcc68436c23d449152aeb18d7a9b1bd09a776a37dc788f3e2effe31bf"} Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.498878 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" podUID="df9290ff-b191-4f7f-b013-d4bdd2be42db" containerName="dnsmasq-dns" containerID="cri-o://f3627f52358b56a53f859ed7cf48322d3e9335c57585dd4021ba3613ebe35b52" gracePeriod=10 Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.499239 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-q6zfh" event={"ID":"2038e7ba-1de4-49b4-95dd-b2f3cde7be45","Type":"ContainerStarted","Data":"91bc4d009ae0adb93dc9095a1922dc6b2e0981ac31351d5e1d236cf9279cb77e"} Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.500367 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-ppvgx" podStartSLOduration=2.500355059 podStartE2EDuration="2.500355059s" podCreationTimestamp="2026-03-13 09:31:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:31:29.499391129 +0000 UTC m=+1172.229291320" watchObservedRunningTime="2026-03-13 09:31:29.500355059 +0000 UTC m=+1172.230255260" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.678062 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.785091 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.868100 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.971395 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-dns-svc\") pod \"0d0fe070-7383-427a-895a-f4116dd15ec3\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.971492 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkv4n\" (UniqueName: \"kubernetes.io/projected/0d0fe070-7383-427a-895a-f4116dd15ec3-kube-api-access-tkv4n\") pod \"0d0fe070-7383-427a-895a-f4116dd15ec3\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.971683 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-ovsdbserver-sb\") pod \"0d0fe070-7383-427a-895a-f4116dd15ec3\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.971887 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-ovsdbserver-nb\") pod \"0d0fe070-7383-427a-895a-f4116dd15ec3\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.971932 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-dns-swift-storage-0\") pod \"0d0fe070-7383-427a-895a-f4116dd15ec3\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.972009 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-config\") pod \"0d0fe070-7383-427a-895a-f4116dd15ec3\" (UID: \"0d0fe070-7383-427a-895a-f4116dd15ec3\") " Mar 13 09:31:29 crc kubenswrapper[4841]: I0313 09:31:29.991622 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d0fe070-7383-427a-895a-f4116dd15ec3-kube-api-access-tkv4n" (OuterVolumeSpecName: "kube-api-access-tkv4n") pod "0d0fe070-7383-427a-895a-f4116dd15ec3" (UID: "0d0fe070-7383-427a-895a-f4116dd15ec3"). InnerVolumeSpecName "kube-api-access-tkv4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.007509 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0d0fe070-7383-427a-895a-f4116dd15ec3" (UID: "0d0fe070-7383-427a-895a-f4116dd15ec3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.017978 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0d0fe070-7383-427a-895a-f4116dd15ec3" (UID: "0d0fe070-7383-427a-895a-f4116dd15ec3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.033930 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-config" (OuterVolumeSpecName: "config") pod "0d0fe070-7383-427a-895a-f4116dd15ec3" (UID: "0d0fe070-7383-427a-895a-f4116dd15ec3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.044060 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0d0fe070-7383-427a-895a-f4116dd15ec3" (UID: "0d0fe070-7383-427a-895a-f4116dd15ec3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.062972 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0d0fe070-7383-427a-895a-f4116dd15ec3" (UID: "0d0fe070-7383-427a-895a-f4116dd15ec3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.074390 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.074429 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.074440 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkv4n\" (UniqueName: \"kubernetes.io/projected/0d0fe070-7383-427a-895a-f4116dd15ec3-kube-api-access-tkv4n\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.074453 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:30 crc 
kubenswrapper[4841]: I0313 09:31:30.074462 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.074470 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d0fe070-7383-427a-895a-f4116dd15ec3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.162870 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.277909 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-ovsdbserver-sb\") pod \"df9290ff-b191-4f7f-b013-d4bdd2be42db\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.278046 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-ovsdbserver-nb\") pod \"df9290ff-b191-4f7f-b013-d4bdd2be42db\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.278063 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-dns-swift-storage-0\") pod \"df9290ff-b191-4f7f-b013-d4bdd2be42db\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.278141 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-config\") pod \"df9290ff-b191-4f7f-b013-d4bdd2be42db\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.278200 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn6nh\" (UniqueName: \"kubernetes.io/projected/df9290ff-b191-4f7f-b013-d4bdd2be42db-kube-api-access-zn6nh\") pod \"df9290ff-b191-4f7f-b013-d4bdd2be42db\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.278221 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-dns-svc\") pod \"df9290ff-b191-4f7f-b013-d4bdd2be42db\" (UID: \"df9290ff-b191-4f7f-b013-d4bdd2be42db\") " Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.302987 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df9290ff-b191-4f7f-b013-d4bdd2be42db-kube-api-access-zn6nh" (OuterVolumeSpecName: "kube-api-access-zn6nh") pod "df9290ff-b191-4f7f-b013-d4bdd2be42db" (UID: "df9290ff-b191-4f7f-b013-d4bdd2be42db"). InnerVolumeSpecName "kube-api-access-zn6nh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.380795 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zn6nh\" (UniqueName: \"kubernetes.io/projected/df9290ff-b191-4f7f-b013-d4bdd2be42db-kube-api-access-zn6nh\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.412780 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df9290ff-b191-4f7f-b013-d4bdd2be42db" (UID: "df9290ff-b191-4f7f-b013-d4bdd2be42db"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.413500 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "df9290ff-b191-4f7f-b013-d4bdd2be42db" (UID: "df9290ff-b191-4f7f-b013-d4bdd2be42db"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.449452 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "df9290ff-b191-4f7f-b013-d4bdd2be42db" (UID: "df9290ff-b191-4f7f-b013-d4bdd2be42db"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.463705 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-config" (OuterVolumeSpecName: "config") pod "df9290ff-b191-4f7f-b013-d4bdd2be42db" (UID: "df9290ff-b191-4f7f-b013-d4bdd2be42db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.464251 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "df9290ff-b191-4f7f-b013-d4bdd2be42db" (UID: "df9290ff-b191-4f7f-b013-d4bdd2be42db"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.484923 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.484952 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.484961 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.484969 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.484977 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df9290ff-b191-4f7f-b013-d4bdd2be42db-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.543602 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-phwrx" event={"ID":"cb20acbd-2346-46cd-baba-089a6afed51b","Type":"ContainerStarted","Data":"55d061398e701f686edf01ee875511b4193db71c211fa764da073dddd6514f35"} Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.550064 4841 generic.go:334] "Generic (PLEG): container finished" podID="df9290ff-b191-4f7f-b013-d4bdd2be42db" containerID="f3627f52358b56a53f859ed7cf48322d3e9335c57585dd4021ba3613ebe35b52" exitCode=0 Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 
09:31:30.550122 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" event={"ID":"df9290ff-b191-4f7f-b013-d4bdd2be42db","Type":"ContainerDied","Data":"f3627f52358b56a53f859ed7cf48322d3e9335c57585dd4021ba3613ebe35b52"} Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.550149 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" event={"ID":"df9290ff-b191-4f7f-b013-d4bdd2be42db","Type":"ContainerDied","Data":"bfe701c9c4ab4ae3e095c71be0859440cd7894744baecd5dfc868d314fc89486"} Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.550164 4841 scope.go:117] "RemoveContainer" containerID="f3627f52358b56a53f859ed7cf48322d3e9335c57585dd4021ba3613ebe35b52" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.550286 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-zxvqm" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.555242 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" event={"ID":"0d0fe070-7383-427a-895a-f4116dd15ec3","Type":"ContainerDied","Data":"8895565bcc68436c23d449152aeb18d7a9b1bd09a776a37dc788f3e2effe31bf"} Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.555330 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-s8v8c" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.561894 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816","Type":"ContainerStarted","Data":"27c0b9cebfcd6a92dacb9581d443c6ec7b912bd38e4c03b2a8722e082230c4a5"} Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.573584 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0481806-83ad-4ca8-9dd5-24d42e7a50a5","Type":"ContainerStarted","Data":"b34c9cffdff227149eed30453e40da3173d23f390aec1b0b57dcee53505197d2"} Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.590783 4841 generic.go:334] "Generic (PLEG): container finished" podID="489aeb87-1810-4eab-adbb-a0047e598344" containerID="4a7f6c9f38ac2e98f09d87fb8346377e5bc3202134f1fdbdc915cbd10a5b114e" exitCode=0 Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.591620 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" event={"ID":"489aeb87-1810-4eab-adbb-a0047e598344","Type":"ContainerDied","Data":"4a7f6c9f38ac2e98f09d87fb8346377e5bc3202134f1fdbdc915cbd10a5b114e"} Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.591650 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" event={"ID":"489aeb87-1810-4eab-adbb-a0047e598344","Type":"ContainerStarted","Data":"ff0c58ea729ca5a5c91beaafe247c89a850dbe352ef191bd40bc8f7aa3facd20"} Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.670923 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-s8v8c"] Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.682767 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-s8v8c"] Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.695313 4841 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-zxvqm"] Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.695455 4841 scope.go:117] "RemoveContainer" containerID="25ccf9465768e8d7df84692f3278c200307657facc2671b368e9cea6c3faf721" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.706809 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-zxvqm"] Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.763684 4841 scope.go:117] "RemoveContainer" containerID="f3627f52358b56a53f859ed7cf48322d3e9335c57585dd4021ba3613ebe35b52" Mar 13 09:31:30 crc kubenswrapper[4841]: E0313 09:31:30.766712 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3627f52358b56a53f859ed7cf48322d3e9335c57585dd4021ba3613ebe35b52\": container with ID starting with f3627f52358b56a53f859ed7cf48322d3e9335c57585dd4021ba3613ebe35b52 not found: ID does not exist" containerID="f3627f52358b56a53f859ed7cf48322d3e9335c57585dd4021ba3613ebe35b52" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.766751 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3627f52358b56a53f859ed7cf48322d3e9335c57585dd4021ba3613ebe35b52"} err="failed to get container status \"f3627f52358b56a53f859ed7cf48322d3e9335c57585dd4021ba3613ebe35b52\": rpc error: code = NotFound desc = could not find container \"f3627f52358b56a53f859ed7cf48322d3e9335c57585dd4021ba3613ebe35b52\": container with ID starting with f3627f52358b56a53f859ed7cf48322d3e9335c57585dd4021ba3613ebe35b52 not found: ID does not exist" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.766774 4841 scope.go:117] "RemoveContainer" containerID="25ccf9465768e8d7df84692f3278c200307657facc2671b368e9cea6c3faf721" Mar 13 09:31:30 crc kubenswrapper[4841]: E0313 09:31:30.767300 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"25ccf9465768e8d7df84692f3278c200307657facc2671b368e9cea6c3faf721\": container with ID starting with 25ccf9465768e8d7df84692f3278c200307657facc2671b368e9cea6c3faf721 not found: ID does not exist" containerID="25ccf9465768e8d7df84692f3278c200307657facc2671b368e9cea6c3faf721" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.767319 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25ccf9465768e8d7df84692f3278c200307657facc2671b368e9cea6c3faf721"} err="failed to get container status \"25ccf9465768e8d7df84692f3278c200307657facc2671b368e9cea6c3faf721\": rpc error: code = NotFound desc = could not find container \"25ccf9465768e8d7df84692f3278c200307657facc2671b368e9cea6c3faf721\": container with ID starting with 25ccf9465768e8d7df84692f3278c200307657facc2671b368e9cea6c3faf721 not found: ID does not exist" Mar 13 09:31:30 crc kubenswrapper[4841]: I0313 09:31:30.767331 4841 scope.go:117] "RemoveContainer" containerID="a5ddcccf9c86651dfbd2a4919398459d925f127de0a054d01d02b534b311666c" Mar 13 09:31:31 crc kubenswrapper[4841]: I0313 09:31:31.577830 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 09:31:31 crc kubenswrapper[4841]: I0313 09:31:31.641585 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:31:31 crc kubenswrapper[4841]: I0313 09:31:31.650088 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0481806-83ad-4ca8-9dd5-24d42e7a50a5","Type":"ContainerStarted","Data":"bdf4d53a921adb8079df99f0ca74217c0ac70d9d3df664b0f5ff9a58341e2d1b"} Mar 13 09:31:31 crc kubenswrapper[4841]: I0313 09:31:31.653384 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" 
event={"ID":"489aeb87-1810-4eab-adbb-a0047e598344","Type":"ContainerStarted","Data":"a2361b6a34a9cfff36a07843d7a39d655ab2abe7d7c2668e0a687c1216842c60"} Mar 13 09:31:31 crc kubenswrapper[4841]: I0313 09:31:31.653518 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" Mar 13 09:31:31 crc kubenswrapper[4841]: I0313 09:31:31.653868 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 09:31:31 crc kubenswrapper[4841]: I0313 09:31:31.669473 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816","Type":"ContainerStarted","Data":"af4045b148d6e0d91645e878fe95ff87e1b0e01937703e0579c6bb0d618dc613"} Mar 13 09:31:32 crc kubenswrapper[4841]: I0313 09:31:32.011198 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d0fe070-7383-427a-895a-f4116dd15ec3" path="/var/lib/kubelet/pods/0d0fe070-7383-427a-895a-f4116dd15ec3/volumes" Mar 13 09:31:32 crc kubenswrapper[4841]: I0313 09:31:32.011747 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df9290ff-b191-4f7f-b013-d4bdd2be42db" path="/var/lib/kubelet/pods/df9290ff-b191-4f7f-b013-d4bdd2be42db/volumes" Mar 13 09:31:32 crc kubenswrapper[4841]: I0313 09:31:32.682548 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816","Type":"ContainerStarted","Data":"3b468631591614af9bd493bfe06ffea66de2f415e97c03ea9b1d579d93349050"} Mar 13 09:31:32 crc kubenswrapper[4841]: I0313 09:31:32.682570 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816" containerName="glance-log" containerID="cri-o://af4045b148d6e0d91645e878fe95ff87e1b0e01937703e0579c6bb0d618dc613" gracePeriod=30 Mar 13 
09:31:32 crc kubenswrapper[4841]: I0313 09:31:32.682607 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816" containerName="glance-httpd" containerID="cri-o://3b468631591614af9bd493bfe06ffea66de2f415e97c03ea9b1d579d93349050" gracePeriod=30 Mar 13 09:31:32 crc kubenswrapper[4841]: I0313 09:31:32.685231 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0481806-83ad-4ca8-9dd5-24d42e7a50a5","Type":"ContainerStarted","Data":"55e9cfeeed408749abd1007c1a3ac1b5a40b9e02543581d26e4da52ed86297b8"} Mar 13 09:31:32 crc kubenswrapper[4841]: I0313 09:31:32.685372 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a0481806-83ad-4ca8-9dd5-24d42e7a50a5" containerName="glance-httpd" containerID="cri-o://55e9cfeeed408749abd1007c1a3ac1b5a40b9e02543581d26e4da52ed86297b8" gracePeriod=30 Mar 13 09:31:32 crc kubenswrapper[4841]: I0313 09:31:32.685336 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a0481806-83ad-4ca8-9dd5-24d42e7a50a5" containerName="glance-log" containerID="cri-o://bdf4d53a921adb8079df99f0ca74217c0ac70d9d3df664b0f5ff9a58341e2d1b" gracePeriod=30 Mar 13 09:31:32 crc kubenswrapper[4841]: I0313 09:31:32.705216 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.705198989 podStartE2EDuration="5.705198989s" podCreationTimestamp="2026-03-13 09:31:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:31:32.703806856 +0000 UTC m=+1175.433707047" watchObservedRunningTime="2026-03-13 09:31:32.705198989 +0000 UTC m=+1175.435099180" Mar 13 09:31:32 crc 
kubenswrapper[4841]: I0313 09:31:32.705858 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" podStartSLOduration=4.70585448 podStartE2EDuration="4.70585448s" podCreationTimestamp="2026-03-13 09:31:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:31:31.69071703 +0000 UTC m=+1174.420617231" watchObservedRunningTime="2026-03-13 09:31:32.70585448 +0000 UTC m=+1175.435754671" Mar 13 09:31:32 crc kubenswrapper[4841]: I0313 09:31:32.730498 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.730480432 podStartE2EDuration="5.730480432s" podCreationTimestamp="2026-03-13 09:31:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:31:32.72745866 +0000 UTC m=+1175.457358861" watchObservedRunningTime="2026-03-13 09:31:32.730480432 +0000 UTC m=+1175.460380623" Mar 13 09:31:32 crc kubenswrapper[4841]: E0313 09:31:32.988415 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0481806_83ad_4ca8_9dd5_24d42e7a50a5.slice/crio-55e9cfeeed408749abd1007c1a3ac1b5a40b9e02543581d26e4da52ed86297b8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0481806_83ad_4ca8_9dd5_24d42e7a50a5.slice/crio-conmon-55e9cfeeed408749abd1007c1a3ac1b5a40b9e02543581d26e4da52ed86297b8.scope\": RecentStats: unable to find data in memory cache]" Mar 13 09:31:33 crc kubenswrapper[4841]: I0313 09:31:33.700576 4841 generic.go:334] "Generic (PLEG): container finished" podID="a0481806-83ad-4ca8-9dd5-24d42e7a50a5" 
containerID="55e9cfeeed408749abd1007c1a3ac1b5a40b9e02543581d26e4da52ed86297b8" exitCode=0 Mar 13 09:31:33 crc kubenswrapper[4841]: I0313 09:31:33.700608 4841 generic.go:334] "Generic (PLEG): container finished" podID="a0481806-83ad-4ca8-9dd5-24d42e7a50a5" containerID="bdf4d53a921adb8079df99f0ca74217c0ac70d9d3df664b0f5ff9a58341e2d1b" exitCode=143 Mar 13 09:31:33 crc kubenswrapper[4841]: I0313 09:31:33.700618 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0481806-83ad-4ca8-9dd5-24d42e7a50a5","Type":"ContainerDied","Data":"55e9cfeeed408749abd1007c1a3ac1b5a40b9e02543581d26e4da52ed86297b8"} Mar 13 09:31:33 crc kubenswrapper[4841]: I0313 09:31:33.700669 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0481806-83ad-4ca8-9dd5-24d42e7a50a5","Type":"ContainerDied","Data":"bdf4d53a921adb8079df99f0ca74217c0ac70d9d3df664b0f5ff9a58341e2d1b"} Mar 13 09:31:33 crc kubenswrapper[4841]: I0313 09:31:33.703830 4841 generic.go:334] "Generic (PLEG): container finished" podID="16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816" containerID="3b468631591614af9bd493bfe06ffea66de2f415e97c03ea9b1d579d93349050" exitCode=0 Mar 13 09:31:33 crc kubenswrapper[4841]: I0313 09:31:33.703850 4841 generic.go:334] "Generic (PLEG): container finished" podID="16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816" containerID="af4045b148d6e0d91645e878fe95ff87e1b0e01937703e0579c6bb0d618dc613" exitCode=143 Mar 13 09:31:33 crc kubenswrapper[4841]: I0313 09:31:33.703866 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816","Type":"ContainerDied","Data":"3b468631591614af9bd493bfe06ffea66de2f415e97c03ea9b1d579d93349050"} Mar 13 09:31:33 crc kubenswrapper[4841]: I0313 09:31:33.703881 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816","Type":"ContainerDied","Data":"af4045b148d6e0d91645e878fe95ff87e1b0e01937703e0579c6bb0d618dc613"} Mar 13 09:31:34 crc kubenswrapper[4841]: I0313 09:31:34.407648 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:31:34 crc kubenswrapper[4841]: I0313 09:31:34.407708 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:31:34 crc kubenswrapper[4841]: I0313 09:31:34.720101 4841 generic.go:334] "Generic (PLEG): container finished" podID="2198213d-5d6d-4438-a171-82745eb1f35f" containerID="fcc0f5be86f2b0b69e3cb23458c7bfde9de25b14095a0935003dd99ab7c1dc29" exitCode=0 Mar 13 09:31:34 crc kubenswrapper[4841]: I0313 09:31:34.720142 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bqgds" event={"ID":"2198213d-5d6d-4438-a171-82745eb1f35f","Type":"ContainerDied","Data":"fcc0f5be86f2b0b69e3cb23458c7bfde9de25b14095a0935003dd99ab7c1dc29"} Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.286148 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.326272 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-scripts\") pod \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.326348 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbqcr\" (UniqueName: \"kubernetes.io/projected/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-kube-api-access-dbqcr\") pod \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.326391 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.326414 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-combined-ca-bundle\") pod \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.326479 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-config-data\") pod \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.326507 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-logs\") pod \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.326544 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-httpd-run\") pod \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\" (UID: \"a0481806-83ad-4ca8-9dd5-24d42e7a50a5\") " Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.327155 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a0481806-83ad-4ca8-9dd5-24d42e7a50a5" (UID: "a0481806-83ad-4ca8-9dd5-24d42e7a50a5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.327280 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-logs" (OuterVolumeSpecName: "logs") pod "a0481806-83ad-4ca8-9dd5-24d42e7a50a5" (UID: "a0481806-83ad-4ca8-9dd5-24d42e7a50a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.333362 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "a0481806-83ad-4ca8-9dd5-24d42e7a50a5" (UID: "a0481806-83ad-4ca8-9dd5-24d42e7a50a5"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.334432 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-scripts" (OuterVolumeSpecName: "scripts") pod "a0481806-83ad-4ca8-9dd5-24d42e7a50a5" (UID: "a0481806-83ad-4ca8-9dd5-24d42e7a50a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.334582 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-kube-api-access-dbqcr" (OuterVolumeSpecName: "kube-api-access-dbqcr") pod "a0481806-83ad-4ca8-9dd5-24d42e7a50a5" (UID: "a0481806-83ad-4ca8-9dd5-24d42e7a50a5"). InnerVolumeSpecName "kube-api-access-dbqcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.355688 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0481806-83ad-4ca8-9dd5-24d42e7a50a5" (UID: "a0481806-83ad-4ca8-9dd5-24d42e7a50a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.379253 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-config-data" (OuterVolumeSpecName: "config-data") pod "a0481806-83ad-4ca8-9dd5-24d42e7a50a5" (UID: "a0481806-83ad-4ca8-9dd5-24d42e7a50a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.430949 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.434386 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbqcr\" (UniqueName: \"kubernetes.io/projected/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-kube-api-access-dbqcr\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.434551 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.434645 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.434745 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.434882 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-logs\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.434986 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0481806-83ad-4ca8-9dd5-24d42e7a50a5-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.456319 4841 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.536657 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.733188 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.733192 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0481806-83ad-4ca8-9dd5-24d42e7a50a5","Type":"ContainerDied","Data":"b34c9cffdff227149eed30453e40da3173d23f390aec1b0b57dcee53505197d2"} Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.733315 4841 scope.go:117] "RemoveContainer" containerID="55e9cfeeed408749abd1007c1a3ac1b5a40b9e02543581d26e4da52ed86297b8" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.780044 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.794542 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.810105 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 09:31:35 crc kubenswrapper[4841]: E0313 09:31:35.810672 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df9290ff-b191-4f7f-b013-d4bdd2be42db" containerName="dnsmasq-dns" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.810697 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="df9290ff-b191-4f7f-b013-d4bdd2be42db" containerName="dnsmasq-dns" Mar 13 09:31:35 crc kubenswrapper[4841]: E0313 09:31:35.810725 4841 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0481806-83ad-4ca8-9dd5-24d42e7a50a5" containerName="glance-log" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.810735 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0481806-83ad-4ca8-9dd5-24d42e7a50a5" containerName="glance-log" Mar 13 09:31:35 crc kubenswrapper[4841]: E0313 09:31:35.810744 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0481806-83ad-4ca8-9dd5-24d42e7a50a5" containerName="glance-httpd" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.810752 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0481806-83ad-4ca8-9dd5-24d42e7a50a5" containerName="glance-httpd" Mar 13 09:31:35 crc kubenswrapper[4841]: E0313 09:31:35.810787 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df9290ff-b191-4f7f-b013-d4bdd2be42db" containerName="init" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.810794 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="df9290ff-b191-4f7f-b013-d4bdd2be42db" containerName="init" Mar 13 09:31:35 crc kubenswrapper[4841]: E0313 09:31:35.810812 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d0fe070-7383-427a-895a-f4116dd15ec3" containerName="init" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.810820 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d0fe070-7383-427a-895a-f4116dd15ec3" containerName="init" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.811017 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0481806-83ad-4ca8-9dd5-24d42e7a50a5" containerName="glance-httpd" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.811038 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="df9290ff-b191-4f7f-b013-d4bdd2be42db" containerName="dnsmasq-dns" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.811048 4841 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0d0fe070-7383-427a-895a-f4116dd15ec3" containerName="init" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.811056 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0481806-83ad-4ca8-9dd5-24d42e7a50a5" containerName="glance-log" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.812155 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.815108 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.830098 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.858362 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29852c1c-4596-43eb-b4ee-3e08ba4763e6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.858439 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29852c1c-4596-43eb-b4ee-3e08ba4763e6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.858626 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96jpn\" (UniqueName: \"kubernetes.io/projected/29852c1c-4596-43eb-b4ee-3e08ba4763e6-kube-api-access-96jpn\") pod \"glance-default-internal-api-0\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " 
pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.858661 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.858850 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29852c1c-4596-43eb-b4ee-3e08ba4763e6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.860475 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/29852c1c-4596-43eb-b4ee-3e08ba4763e6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.860516 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29852c1c-4596-43eb-b4ee-3e08ba4763e6-logs\") pod \"glance-default-internal-api-0\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.926473 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 09:31:35 crc kubenswrapper[4841]: E0313 09:31:35.927258 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data glance httpd-run 
kube-api-access-96jpn logs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-internal-api-0" podUID="29852c1c-4596-43eb-b4ee-3e08ba4763e6" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.962049 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29852c1c-4596-43eb-b4ee-3e08ba4763e6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.962106 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/29852c1c-4596-43eb-b4ee-3e08ba4763e6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.962123 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29852c1c-4596-43eb-b4ee-3e08ba4763e6-logs\") pod \"glance-default-internal-api-0\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.962158 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29852c1c-4596-43eb-b4ee-3e08ba4763e6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.962176 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29852c1c-4596-43eb-b4ee-3e08ba4763e6-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.962228 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96jpn\" (UniqueName: \"kubernetes.io/projected/29852c1c-4596-43eb-b4ee-3e08ba4763e6-kube-api-access-96jpn\") pod \"glance-default-internal-api-0\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.962246 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.962450 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.962976 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29852c1c-4596-43eb-b4ee-3e08ba4763e6-logs\") pod \"glance-default-internal-api-0\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.963120 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/29852c1c-4596-43eb-b4ee-3e08ba4763e6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " 
pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.971103 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29852c1c-4596-43eb-b4ee-3e08ba4763e6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.971533 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29852c1c-4596-43eb-b4ee-3e08ba4763e6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.972591 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29852c1c-4596-43eb-b4ee-3e08ba4763e6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:35 crc kubenswrapper[4841]: I0313 09:31:35.979349 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96jpn\" (UniqueName: \"kubernetes.io/projected/29852c1c-4596-43eb-b4ee-3e08ba4763e6-kube-api-access-96jpn\") pod \"glance-default-internal-api-0\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.009030 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0481806-83ad-4ca8-9dd5-24d42e7a50a5" path="/var/lib/kubelet/pods/a0481806-83ad-4ca8-9dd5-24d42e7a50a5/volumes" Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.020154 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.742168 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.762577 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.788106 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96jpn\" (UniqueName: \"kubernetes.io/projected/29852c1c-4596-43eb-b4ee-3e08ba4763e6-kube-api-access-96jpn\") pod \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.788166 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29852c1c-4596-43eb-b4ee-3e08ba4763e6-combined-ca-bundle\") pod \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.788211 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29852c1c-4596-43eb-b4ee-3e08ba4763e6-logs\") pod \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.788247 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/29852c1c-4596-43eb-b4ee-3e08ba4763e6-httpd-run\") pod \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " Mar 13 09:31:36 crc 
kubenswrapper[4841]: I0313 09:31:36.788348 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29852c1c-4596-43eb-b4ee-3e08ba4763e6-config-data\") pod \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.788411 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29852c1c-4596-43eb-b4ee-3e08ba4763e6-scripts\") pod \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.788477 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\" (UID: \"29852c1c-4596-43eb-b4ee-3e08ba4763e6\") " Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.788863 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29852c1c-4596-43eb-b4ee-3e08ba4763e6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "29852c1c-4596-43eb-b4ee-3e08ba4763e6" (UID: "29852c1c-4596-43eb-b4ee-3e08ba4763e6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.789178 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29852c1c-4596-43eb-b4ee-3e08ba4763e6-logs" (OuterVolumeSpecName: "logs") pod "29852c1c-4596-43eb-b4ee-3e08ba4763e6" (UID: "29852c1c-4596-43eb-b4ee-3e08ba4763e6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.789512 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29852c1c-4596-43eb-b4ee-3e08ba4763e6-logs\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.789529 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/29852c1c-4596-43eb-b4ee-3e08ba4763e6-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.791774 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "29852c1c-4596-43eb-b4ee-3e08ba4763e6" (UID: "29852c1c-4596-43eb-b4ee-3e08ba4763e6"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.792224 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29852c1c-4596-43eb-b4ee-3e08ba4763e6-scripts" (OuterVolumeSpecName: "scripts") pod "29852c1c-4596-43eb-b4ee-3e08ba4763e6" (UID: "29852c1c-4596-43eb-b4ee-3e08ba4763e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.794295 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29852c1c-4596-43eb-b4ee-3e08ba4763e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29852c1c-4596-43eb-b4ee-3e08ba4763e6" (UID: "29852c1c-4596-43eb-b4ee-3e08ba4763e6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.794248 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29852c1c-4596-43eb-b4ee-3e08ba4763e6-config-data" (OuterVolumeSpecName: "config-data") pod "29852c1c-4596-43eb-b4ee-3e08ba4763e6" (UID: "29852c1c-4596-43eb-b4ee-3e08ba4763e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.794367 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29852c1c-4596-43eb-b4ee-3e08ba4763e6-kube-api-access-96jpn" (OuterVolumeSpecName: "kube-api-access-96jpn") pod "29852c1c-4596-43eb-b4ee-3e08ba4763e6" (UID: "29852c1c-4596-43eb-b4ee-3e08ba4763e6"). InnerVolumeSpecName "kube-api-access-96jpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.891618 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96jpn\" (UniqueName: \"kubernetes.io/projected/29852c1c-4596-43eb-b4ee-3e08ba4763e6-kube-api-access-96jpn\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.891855 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29852c1c-4596-43eb-b4ee-3e08ba4763e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.891865 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29852c1c-4596-43eb-b4ee-3e08ba4763e6-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.891875 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29852c1c-4596-43eb-b4ee-3e08ba4763e6-scripts\") on node \"crc\" 
DevicePath \"\"" Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.891901 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.916973 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 13 09:31:36 crc kubenswrapper[4841]: I0313 09:31:36.993602 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:37 crc kubenswrapper[4841]: I0313 09:31:37.752497 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 09:31:37 crc kubenswrapper[4841]: I0313 09:31:37.826277 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 09:31:37 crc kubenswrapper[4841]: I0313 09:31:37.834971 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 09:31:37 crc kubenswrapper[4841]: I0313 09:31:37.844910 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 09:31:37 crc kubenswrapper[4841]: I0313 09:31:37.847212 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 09:31:37 crc kubenswrapper[4841]: I0313 09:31:37.852421 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 09:31:37 crc kubenswrapper[4841]: I0313 09:31:37.852812 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 13 09:31:37 crc kubenswrapper[4841]: I0313 09:31:37.853188 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.013217 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.013330 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fr49\" (UniqueName: \"kubernetes.io/projected/300a1817-5e08-4edc-afec-829b69b0e7e9-kube-api-access-5fr49\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.013371 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/300a1817-5e08-4edc-afec-829b69b0e7e9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.013396 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.013437 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.013480 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.013511 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/300a1817-5e08-4edc-afec-829b69b0e7e9-logs\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.013546 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.015018 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29852c1c-4596-43eb-b4ee-3e08ba4763e6" 
path="/var/lib/kubelet/pods/29852c1c-4596-43eb-b4ee-3e08ba4763e6/volumes" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.115510 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.116342 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/300a1817-5e08-4edc-afec-829b69b0e7e9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.116445 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fr49\" (UniqueName: \"kubernetes.io/projected/300a1817-5e08-4edc-afec-829b69b0e7e9-kube-api-access-5fr49\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.116555 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.116891 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 
crc kubenswrapper[4841]: I0313 09:31:38.116893 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/300a1817-5e08-4edc-afec-829b69b0e7e9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.116842 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.117303 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.123408 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/300a1817-5e08-4edc-afec-829b69b0e7e9-logs\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.123651 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.124124 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.124528 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.124868 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.126632 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/300a1817-5e08-4edc-afec-829b69b0e7e9-logs\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.131083 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.147650 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.147925 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fr49\" (UniqueName: \"kubernetes.io/projected/300a1817-5e08-4edc-afec-829b69b0e7e9-kube-api-access-5fr49\") pod \"glance-default-internal-api-0\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.168807 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.621577 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.719382 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2g4hc"] Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.719707 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-2g4hc" podUID="900f1d10-4a56-461f-81cc-caea5f1b88c8" containerName="dnsmasq-dns" containerID="cri-o://cee69fe8c28d2b81682e9ba0d48495a9116e0d5a3d1c6c11623b5e353e338ddb" gracePeriod=10 Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.841729 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.942868 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.942948 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-httpd-run\") pod \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.943078 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-config-data\") pod \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.943129 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxp8s\" (UniqueName: \"kubernetes.io/projected/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-kube-api-access-cxp8s\") pod \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.943157 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-combined-ca-bundle\") pod \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.943201 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-logs\") pod \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.943239 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-scripts\") pod \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\" (UID: \"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816\") " Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.944771 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816" (UID: "16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.944920 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-logs" (OuterVolumeSpecName: "logs") pod "16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816" (UID: "16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.947899 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-kube-api-access-cxp8s" (OuterVolumeSpecName: "kube-api-access-cxp8s") pod "16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816" (UID: "16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816"). InnerVolumeSpecName "kube-api-access-cxp8s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.948331 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-scripts" (OuterVolumeSpecName: "scripts") pod "16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816" (UID: "16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.948388 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816" (UID: "16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 09:31:38 crc kubenswrapper[4841]: I0313 09:31:38.995500 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816" (UID: "16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.009244 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-config-data" (OuterVolumeSpecName: "config-data") pod "16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816" (UID: "16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.047557 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.047599 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.047610 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.047619 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxp8s\" (UniqueName: \"kubernetes.io/projected/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-kube-api-access-cxp8s\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.047627 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.047636 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-logs\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.047643 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.066304 4841 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.149053 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.769420 4841 generic.go:334] "Generic (PLEG): container finished" podID="900f1d10-4a56-461f-81cc-caea5f1b88c8" containerID="cee69fe8c28d2b81682e9ba0d48495a9116e0d5a3d1c6c11623b5e353e338ddb" exitCode=0 Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.769729 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2g4hc" event={"ID":"900f1d10-4a56-461f-81cc-caea5f1b88c8","Type":"ContainerDied","Data":"cee69fe8c28d2b81682e9ba0d48495a9116e0d5a3d1c6c11623b5e353e338ddb"} Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.772317 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816","Type":"ContainerDied","Data":"27c0b9cebfcd6a92dacb9581d443c6ec7b912bd38e4c03b2a8722e082230c4a5"} Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.772352 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.824470 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.834661 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.847136 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 09:31:39 crc kubenswrapper[4841]: E0313 09:31:39.847551 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816" containerName="glance-log" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.847568 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816" containerName="glance-log" Mar 13 09:31:39 crc kubenswrapper[4841]: E0313 09:31:39.847577 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816" containerName="glance-httpd" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.847583 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816" containerName="glance-httpd" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.847730 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816" containerName="glance-log" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.847745 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816" containerName="glance-httpd" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.848737 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.851800 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.852033 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.868227 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.960354 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-config-data\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.960409 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef6730be-d6cf-42b1-b356-fa08748e42ef-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.960450 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.960612 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdmkf\" (UniqueName: 
\"kubernetes.io/projected/ef6730be-d6cf-42b1-b356-fa08748e42ef-kube-api-access-bdmkf\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.960677 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-scripts\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.960756 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.960828 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef6730be-d6cf-42b1-b356-fa08748e42ef-logs\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:39 crc kubenswrapper[4841]: I0313 09:31:39.960869 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:40 crc kubenswrapper[4841]: I0313 09:31:40.004945 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816" path="/var/lib/kubelet/pods/16f8e6b5-ba53-4cbb-b5b6-cd7fc268b816/volumes" Mar 13 09:31:40 crc kubenswrapper[4841]: I0313 09:31:40.062665 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdmkf\" (UniqueName: \"kubernetes.io/projected/ef6730be-d6cf-42b1-b356-fa08748e42ef-kube-api-access-bdmkf\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:40 crc kubenswrapper[4841]: I0313 09:31:40.062710 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-scripts\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:40 crc kubenswrapper[4841]: I0313 09:31:40.062751 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:40 crc kubenswrapper[4841]: I0313 09:31:40.062803 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef6730be-d6cf-42b1-b356-fa08748e42ef-logs\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:40 crc kubenswrapper[4841]: I0313 09:31:40.062820 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:40 crc kubenswrapper[4841]: I0313 09:31:40.063834 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-config-data\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:40 crc kubenswrapper[4841]: I0313 09:31:40.063871 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef6730be-d6cf-42b1-b356-fa08748e42ef-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:40 crc kubenswrapper[4841]: I0313 09:31:40.063924 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:40 crc kubenswrapper[4841]: I0313 09:31:40.064408 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Mar 13 09:31:40 crc kubenswrapper[4841]: I0313 09:31:40.064477 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef6730be-d6cf-42b1-b356-fa08748e42ef-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" 
Mar 13 09:31:40 crc kubenswrapper[4841]: I0313 09:31:40.064870 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef6730be-d6cf-42b1-b356-fa08748e42ef-logs\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:40 crc kubenswrapper[4841]: I0313 09:31:40.067828 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-scripts\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:40 crc kubenswrapper[4841]: I0313 09:31:40.068014 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:40 crc kubenswrapper[4841]: I0313 09:31:40.068934 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:40 crc kubenswrapper[4841]: I0313 09:31:40.074733 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-config-data\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:40 crc kubenswrapper[4841]: I0313 09:31:40.086196 4841 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-bdmkf\" (UniqueName: \"kubernetes.io/projected/ef6730be-d6cf-42b1-b356-fa08748e42ef-kube-api-access-bdmkf\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:40 crc kubenswrapper[4841]: I0313 09:31:40.093922 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " pod="openstack/glance-default-external-api-0" Mar 13 09:31:40 crc kubenswrapper[4841]: I0313 09:31:40.225794 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 09:31:45 crc kubenswrapper[4841]: E0313 09:31:45.754906 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 13 09:31:45 crc kubenswrapper[4841]: E0313 09:31:45.755670 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-plcp4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-gg7dx_openstack(9132ca8c-f2de-4025-8462-4899276a8678): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 13 09:31:45 crc kubenswrapper[4841]: E0313 09:31:45.757804 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-gg7dx" podUID="9132ca8c-f2de-4025-8462-4899276a8678"
Mar 13 09:31:45 crc kubenswrapper[4841]: I0313 09:31:45.823008 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bqgds" event={"ID":"2198213d-5d6d-4438-a171-82745eb1f35f","Type":"ContainerDied","Data":"a11e67e228d8811fbcd4a914fcb1bf80e261025c6ddae055248a8b68e1b25166"}
Mar 13 09:31:45 crc kubenswrapper[4841]: I0313 09:31:45.823397 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a11e67e228d8811fbcd4a914fcb1bf80e261025c6ddae055248a8b68e1b25166"
Mar 13 09:31:45 crc kubenswrapper[4841]: E0313 09:31:45.825062 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-gg7dx" podUID="9132ca8c-f2de-4025-8462-4899276a8678"
Mar 13 09:31:45 crc kubenswrapper[4841]: I0313 09:31:45.828579 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bqgds"
Mar 13 09:31:45 crc kubenswrapper[4841]: I0313 09:31:45.974103 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-scripts\") pod \"2198213d-5d6d-4438-a171-82745eb1f35f\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") "
Mar 13 09:31:45 crc kubenswrapper[4841]: I0313 09:31:45.974180 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq8d9\" (UniqueName: \"kubernetes.io/projected/2198213d-5d6d-4438-a171-82745eb1f35f-kube-api-access-cq8d9\") pod \"2198213d-5d6d-4438-a171-82745eb1f35f\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") "
Mar 13 09:31:45 crc kubenswrapper[4841]: I0313 09:31:45.974203 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-config-data\") pod \"2198213d-5d6d-4438-a171-82745eb1f35f\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") "
Mar 13 09:31:45 crc kubenswrapper[4841]: I0313 09:31:45.974251 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-fernet-keys\") pod \"2198213d-5d6d-4438-a171-82745eb1f35f\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") "
Mar 13 09:31:45 crc kubenswrapper[4841]: I0313 09:31:45.974295 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-credential-keys\") pod \"2198213d-5d6d-4438-a171-82745eb1f35f\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") "
Mar 13 09:31:45 crc kubenswrapper[4841]: I0313 09:31:45.974342 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-combined-ca-bundle\") pod \"2198213d-5d6d-4438-a171-82745eb1f35f\" (UID: \"2198213d-5d6d-4438-a171-82745eb1f35f\") "
Mar 13 09:31:45 crc kubenswrapper[4841]: I0313 09:31:45.981392 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2198213d-5d6d-4438-a171-82745eb1f35f" (UID: "2198213d-5d6d-4438-a171-82745eb1f35f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:31:45 crc kubenswrapper[4841]: I0313 09:31:45.983825 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2198213d-5d6d-4438-a171-82745eb1f35f" (UID: "2198213d-5d6d-4438-a171-82745eb1f35f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:31:45 crc kubenswrapper[4841]: I0313 09:31:45.984525 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-scripts" (OuterVolumeSpecName: "scripts") pod "2198213d-5d6d-4438-a171-82745eb1f35f" (UID: "2198213d-5d6d-4438-a171-82745eb1f35f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:31:45 crc kubenswrapper[4841]: I0313 09:31:45.985412 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2198213d-5d6d-4438-a171-82745eb1f35f-kube-api-access-cq8d9" (OuterVolumeSpecName: "kube-api-access-cq8d9") pod "2198213d-5d6d-4438-a171-82745eb1f35f" (UID: "2198213d-5d6d-4438-a171-82745eb1f35f"). InnerVolumeSpecName "kube-api-access-cq8d9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 09:31:46 crc kubenswrapper[4841]: I0313 09:31:46.005286 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2198213d-5d6d-4438-a171-82745eb1f35f" (UID: "2198213d-5d6d-4438-a171-82745eb1f35f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:31:46 crc kubenswrapper[4841]: I0313 09:31:46.016005 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-config-data" (OuterVolumeSpecName: "config-data") pod "2198213d-5d6d-4438-a171-82745eb1f35f" (UID: "2198213d-5d6d-4438-a171-82745eb1f35f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:31:46 crc kubenswrapper[4841]: I0313 09:31:46.077877 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 09:31:46 crc kubenswrapper[4841]: I0313 09:31:46.077951 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq8d9\" (UniqueName: \"kubernetes.io/projected/2198213d-5d6d-4438-a171-82745eb1f35f-kube-api-access-cq8d9\") on node \"crc\" DevicePath \"\""
Mar 13 09:31:46 crc kubenswrapper[4841]: I0313 09:31:46.077969 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 09:31:46 crc kubenswrapper[4841]: I0313 09:31:46.078002 4841 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 13 09:31:46 crc kubenswrapper[4841]: I0313 09:31:46.078025 4841 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 13 09:31:46 crc kubenswrapper[4841]: I0313 09:31:46.078048 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2198213d-5d6d-4438-a171-82745eb1f35f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 09:31:46 crc kubenswrapper[4841]: E0313 09:31:46.192573 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified"
Mar 13 09:31:46 crc kubenswrapper[4841]: E0313 09:31:46.192810 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndbh5bfh646h8bh5f4h5dh75h685h574h697h5bfh576h578h644hf8h4hbdh66ch8bh544h5cbh546h5dfh5f7h5b9hcfh6h5fch87hbdh546h5d9q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vvqt9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(72841340-b4e1-4283-8eb0-10641fb61f62): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 13 09:31:46 crc kubenswrapper[4841]: I0313 09:31:46.835604 4841 generic.go:334] "Generic (PLEG): container finished" podID="2e627e4b-0a39-42af-be45-147ff230fd13" containerID="6b6a3e351fb7f6d9a0b8d502a1bdcd8091852d33875656e47426bdce532fa49c" exitCode=0
Mar 13 09:31:46 crc kubenswrapper[4841]: I0313 09:31:46.835694 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bqgds"
Mar 13 09:31:46 crc kubenswrapper[4841]: I0313 09:31:46.836350 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ppvgx" event={"ID":"2e627e4b-0a39-42af-be45-147ff230fd13","Type":"ContainerDied","Data":"6b6a3e351fb7f6d9a0b8d502a1bdcd8091852d33875656e47426bdce532fa49c"}
Mar 13 09:31:46 crc kubenswrapper[4841]: I0313 09:31:46.921752 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-bqgds"]
Mar 13 09:31:46 crc kubenswrapper[4841]: I0313 09:31:46.929106 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-bqgds"]
Mar 13 09:31:46 crc kubenswrapper[4841]: I0313 09:31:46.983206 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-2g4hc" podUID="900f1d10-4a56-461f-81cc-caea5f1b88c8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: i/o timeout"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.018795 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ppjwg"]
Mar 13 09:31:47 crc kubenswrapper[4841]: E0313 09:31:47.019232 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2198213d-5d6d-4438-a171-82745eb1f35f" containerName="keystone-bootstrap"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.019247 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="2198213d-5d6d-4438-a171-82745eb1f35f" containerName="keystone-bootstrap"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.019427 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="2198213d-5d6d-4438-a171-82745eb1f35f" containerName="keystone-bootstrap"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.019937 4841 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/keystone-bootstrap-ppjwg"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.021733 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.023938 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.024233 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.024484 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-88dmb"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.024650 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.034501 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ppjwg"]
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.098913 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-config-data\") pod \"keystone-bootstrap-ppjwg\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") " pod="openstack/keystone-bootstrap-ppjwg"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.098960 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-credential-keys\") pod \"keystone-bootstrap-ppjwg\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") " pod="openstack/keystone-bootstrap-ppjwg"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.099063 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-combined-ca-bundle\") pod \"keystone-bootstrap-ppjwg\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") " pod="openstack/keystone-bootstrap-ppjwg"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.099187 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-fernet-keys\") pod \"keystone-bootstrap-ppjwg\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") " pod="openstack/keystone-bootstrap-ppjwg"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.099212 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-scripts\") pod \"keystone-bootstrap-ppjwg\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") " pod="openstack/keystone-bootstrap-ppjwg"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.099245 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdnsw\" (UniqueName: \"kubernetes.io/projected/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-kube-api-access-mdnsw\") pod \"keystone-bootstrap-ppjwg\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") " pod="openstack/keystone-bootstrap-ppjwg"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.200652 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-fernet-keys\") pod \"keystone-bootstrap-ppjwg\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") " pod="openstack/keystone-bootstrap-ppjwg"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.200694 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-scripts\") pod \"keystone-bootstrap-ppjwg\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") " pod="openstack/keystone-bootstrap-ppjwg"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.200724 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdnsw\" (UniqueName: \"kubernetes.io/projected/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-kube-api-access-mdnsw\") pod \"keystone-bootstrap-ppjwg\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") " pod="openstack/keystone-bootstrap-ppjwg"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.200780 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-config-data\") pod \"keystone-bootstrap-ppjwg\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") " pod="openstack/keystone-bootstrap-ppjwg"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.200798 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-credential-keys\") pod \"keystone-bootstrap-ppjwg\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") " pod="openstack/keystone-bootstrap-ppjwg"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.200834 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-combined-ca-bundle\") pod \"keystone-bootstrap-ppjwg\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") " pod="openstack/keystone-bootstrap-ppjwg"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.206087 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-config-data\") pod \"keystone-bootstrap-ppjwg\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") " pod="openstack/keystone-bootstrap-ppjwg"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.206675 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-combined-ca-bundle\") pod \"keystone-bootstrap-ppjwg\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") " pod="openstack/keystone-bootstrap-ppjwg"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.206805 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-credential-keys\") pod \"keystone-bootstrap-ppjwg\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") " pod="openstack/keystone-bootstrap-ppjwg"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.213743 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-scripts\") pod \"keystone-bootstrap-ppjwg\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") " pod="openstack/keystone-bootstrap-ppjwg"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.216641 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdnsw\" (UniqueName: \"kubernetes.io/projected/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-kube-api-access-mdnsw\") pod \"keystone-bootstrap-ppjwg\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") " pod="openstack/keystone-bootstrap-ppjwg"
Mar 13 09:31:47 crc kubenswrapper[4841]: I0313 09:31:47.224991 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-fernet-keys\") pod \"keystone-bootstrap-ppjwg\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") " pod="openstack/keystone-bootstrap-ppjwg"
Mar 13 09:31:47 crc
kubenswrapper[4841]: I0313 09:31:47.343598 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ppjwg"
Mar 13 09:31:48 crc kubenswrapper[4841]: I0313 09:31:48.005290 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2198213d-5d6d-4438-a171-82745eb1f35f" path="/var/lib/kubelet/pods/2198213d-5d6d-4438-a171-82745eb1f35f/volumes"
Mar 13 09:31:51 crc kubenswrapper[4841]: I0313 09:31:51.985914 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-2g4hc" podUID="900f1d10-4a56-461f-81cc-caea5f1b88c8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: i/o timeout"
Mar 13 09:31:53 crc kubenswrapper[4841]: I0313 09:31:53.813395 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2g4hc"
Mar 13 09:31:53 crc kubenswrapper[4841]: I0313 09:31:53.896401 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2g4hc" event={"ID":"900f1d10-4a56-461f-81cc-caea5f1b88c8","Type":"ContainerDied","Data":"a6dedda5a1785c6e443bedeb940f4cd9369de74476c9d7095912967fd5410328"}
Mar 13 09:31:53 crc kubenswrapper[4841]: I0313 09:31:53.896487 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2g4hc"
Mar 13 09:31:53 crc kubenswrapper[4841]: I0313 09:31:53.943720 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-config\") pod \"900f1d10-4a56-461f-81cc-caea5f1b88c8\" (UID: \"900f1d10-4a56-461f-81cc-caea5f1b88c8\") "
Mar 13 09:31:53 crc kubenswrapper[4841]: I0313 09:31:53.943769 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7j9v\" (UniqueName: \"kubernetes.io/projected/900f1d10-4a56-461f-81cc-caea5f1b88c8-kube-api-access-p7j9v\") pod \"900f1d10-4a56-461f-81cc-caea5f1b88c8\" (UID: \"900f1d10-4a56-461f-81cc-caea5f1b88c8\") "
Mar 13 09:31:53 crc kubenswrapper[4841]: I0313 09:31:53.943843 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-ovsdbserver-sb\") pod \"900f1d10-4a56-461f-81cc-caea5f1b88c8\" (UID: \"900f1d10-4a56-461f-81cc-caea5f1b88c8\") "
Mar 13 09:31:53 crc kubenswrapper[4841]: I0313 09:31:53.943882 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-ovsdbserver-nb\") pod \"900f1d10-4a56-461f-81cc-caea5f1b88c8\" (UID: \"900f1d10-4a56-461f-81cc-caea5f1b88c8\") "
Mar 13 09:31:53 crc kubenswrapper[4841]: I0313 09:31:53.944023 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-dns-svc\") pod \"900f1d10-4a56-461f-81cc-caea5f1b88c8\" (UID: \"900f1d10-4a56-461f-81cc-caea5f1b88c8\") "
Mar 13 09:31:53 crc kubenswrapper[4841]: I0313 09:31:53.967345 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/900f1d10-4a56-461f-81cc-caea5f1b88c8-kube-api-access-p7j9v" (OuterVolumeSpecName: "kube-api-access-p7j9v") pod "900f1d10-4a56-461f-81cc-caea5f1b88c8" (UID: "900f1d10-4a56-461f-81cc-caea5f1b88c8"). InnerVolumeSpecName "kube-api-access-p7j9v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 09:31:53 crc kubenswrapper[4841]: I0313 09:31:53.985588 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "900f1d10-4a56-461f-81cc-caea5f1b88c8" (UID: "900f1d10-4a56-461f-81cc-caea5f1b88c8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 09:31:53 crc kubenswrapper[4841]: I0313 09:31:53.992369 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "900f1d10-4a56-461f-81cc-caea5f1b88c8" (UID: "900f1d10-4a56-461f-81cc-caea5f1b88c8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 09:31:53 crc kubenswrapper[4841]: I0313 09:31:53.997704 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-config" (OuterVolumeSpecName: "config") pod "900f1d10-4a56-461f-81cc-caea5f1b88c8" (UID: "900f1d10-4a56-461f-81cc-caea5f1b88c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 09:31:54 crc kubenswrapper[4841]: I0313 09:31:54.001719 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "900f1d10-4a56-461f-81cc-caea5f1b88c8" (UID: "900f1d10-4a56-461f-81cc-caea5f1b88c8"). InnerVolumeSpecName "dns-svc".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 09:31:54 crc kubenswrapper[4841]: I0313 09:31:54.046222 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-config\") on node \"crc\" DevicePath \"\""
Mar 13 09:31:54 crc kubenswrapper[4841]: I0313 09:31:54.046564 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7j9v\" (UniqueName: \"kubernetes.io/projected/900f1d10-4a56-461f-81cc-caea5f1b88c8-kube-api-access-p7j9v\") on node \"crc\" DevicePath \"\""
Mar 13 09:31:54 crc kubenswrapper[4841]: I0313 09:31:54.046579 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 13 09:31:54 crc kubenswrapper[4841]: I0313 09:31:54.046592 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 13 09:31:54 crc kubenswrapper[4841]: I0313 09:31:54.046603 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/900f1d10-4a56-461f-81cc-caea5f1b88c8-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 13 09:31:54 crc kubenswrapper[4841]: E0313 09:31:54.162829 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified"
Mar 13 09:31:54 crc kubenswrapper[4841]: E0313 09:31:54.163368 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6gdks,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-q6zfh_openstack(2038e7ba-1de4-49b4-95dd-b2f3cde7be45): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 13 09:31:54 crc kubenswrapper[4841]: E0313 09:31:54.164568 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-q6zfh" podUID="2038e7ba-1de4-49b4-95dd-b2f3cde7be45"
Mar 13 09:31:54 crc kubenswrapper[4841]: I0313 09:31:54.170833 4841 scope.go:117] "RemoveContainer" containerID="bdf4d53a921adb8079df99f0ca74217c0ac70d9d3df664b0f5ff9a58341e2d1b"
Mar 13 09:31:54 crc kubenswrapper[4841]: I0313 09:31:54.185559 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ppvgx"
Mar 13 09:31:54 crc kubenswrapper[4841]: I0313 09:31:54.224973 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2g4hc"]
Mar 13 09:31:54 crc kubenswrapper[4841]: I0313 09:31:54.232632 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2g4hc"]
Mar 13 09:31:54 crc kubenswrapper[4841]: I0313 09:31:54.352369 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2e627e4b-0a39-42af-be45-147ff230fd13-config\") pod \"2e627e4b-0a39-42af-be45-147ff230fd13\" (UID: \"2e627e4b-0a39-42af-be45-147ff230fd13\") "
Mar 13 09:31:54 crc kubenswrapper[4841]: I0313 09:31:54.352441 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e627e4b-0a39-42af-be45-147ff230fd13-combined-ca-bundle\") pod \"2e627e4b-0a39-42af-be45-147ff230fd13\" (UID: \"2e627e4b-0a39-42af-be45-147ff230fd13\") "
Mar 13 09:31:54 crc kubenswrapper[4841]: I0313 09:31:54.352563 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fx9c\" (UniqueName: \"kubernetes.io/projected/2e627e4b-0a39-42af-be45-147ff230fd13-kube-api-access-9fx9c\")
pod \"2e627e4b-0a39-42af-be45-147ff230fd13\" (UID: \"2e627e4b-0a39-42af-be45-147ff230fd13\") "
Mar 13 09:31:54 crc kubenswrapper[4841]: I0313 09:31:54.356465 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e627e4b-0a39-42af-be45-147ff230fd13-kube-api-access-9fx9c" (OuterVolumeSpecName: "kube-api-access-9fx9c") pod "2e627e4b-0a39-42af-be45-147ff230fd13" (UID: "2e627e4b-0a39-42af-be45-147ff230fd13"). InnerVolumeSpecName "kube-api-access-9fx9c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 09:31:54 crc kubenswrapper[4841]: E0313 09:31:54.371105 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e627e4b-0a39-42af-be45-147ff230fd13-config podName:2e627e4b-0a39-42af-be45-147ff230fd13 nodeName:}" failed. No retries permitted until 2026-03-13 09:31:54.871076317 +0000 UTC m=+1197.600976568 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config" (UniqueName: "kubernetes.io/secret/2e627e4b-0a39-42af-be45-147ff230fd13-config") pod "2e627e4b-0a39-42af-be45-147ff230fd13" (UID: "2e627e4b-0a39-42af-be45-147ff230fd13") : error deleting /var/lib/kubelet/pods/2e627e4b-0a39-42af-be45-147ff230fd13/volume-subpaths: remove /var/lib/kubelet/pods/2e627e4b-0a39-42af-be45-147ff230fd13/volume-subpaths: no such file or directory
Mar 13 09:31:54 crc kubenswrapper[4841]: I0313 09:31:54.374021 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e627e4b-0a39-42af-be45-147ff230fd13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e627e4b-0a39-42af-be45-147ff230fd13" (UID: "2e627e4b-0a39-42af-be45-147ff230fd13"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:31:54 crc kubenswrapper[4841]: I0313 09:31:54.454036 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e627e4b-0a39-42af-be45-147ff230fd13-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 09:31:54 crc kubenswrapper[4841]: I0313 09:31:54.454062 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fx9c\" (UniqueName: \"kubernetes.io/projected/2e627e4b-0a39-42af-be45-147ff230fd13-kube-api-access-9fx9c\") on node \"crc\" DevicePath \"\""
Mar 13 09:31:54 crc kubenswrapper[4841]: I0313 09:31:54.911430 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ppvgx" event={"ID":"2e627e4b-0a39-42af-be45-147ff230fd13","Type":"ContainerDied","Data":"dce3abeb728846b745aabf912a55f32dd8b79a4c4388a679b7bb2eaa43b3e4fe"}
Mar 13 09:31:54 crc kubenswrapper[4841]: I0313 09:31:54.911474 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dce3abeb728846b745aabf912a55f32dd8b79a4c4388a679b7bb2eaa43b3e4fe"
Mar 13 09:31:54 crc kubenswrapper[4841]: I0313 09:31:54.911549 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ppvgx"
Mar 13 09:31:54 crc kubenswrapper[4841]: E0313 09:31:54.912897 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-q6zfh" podUID="2038e7ba-1de4-49b4-95dd-b2f3cde7be45"
Mar 13 09:31:54 crc kubenswrapper[4841]: I0313 09:31:54.962997 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2e627e4b-0a39-42af-be45-147ff230fd13-config\") pod \"2e627e4b-0a39-42af-be45-147ff230fd13\" (UID: \"2e627e4b-0a39-42af-be45-147ff230fd13\") "
Mar 13 09:31:54 crc kubenswrapper[4841]: I0313 09:31:54.967480 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e627e4b-0a39-42af-be45-147ff230fd13-config" (OuterVolumeSpecName: "config") pod "2e627e4b-0a39-42af-be45-147ff230fd13" (UID: "2e627e4b-0a39-42af-be45-147ff230fd13"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.064947 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2e627e4b-0a39-42af-be45-147ff230fd13-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.379172 4841 scope.go:117] "RemoveContainer" containerID="3b468631591614af9bd493bfe06ffea66de2f415e97c03ea9b1d579d93349050" Mar 13 09:31:55 crc kubenswrapper[4841]: E0313 09:31:55.379823 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 13 09:31:55 crc kubenswrapper[4841]: E0313 09:31:55.379942 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qhm7f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-9nnr7_openstack(5d602e6f-77e5-4496-b426-2c003dad63e4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 09:31:55 crc kubenswrapper[4841]: E0313 09:31:55.382660 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-9nnr7" podUID="5d602e6f-77e5-4496-b426-2c003dad63e4" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.463789 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-9svmt"] Mar 13 09:31:55 crc kubenswrapper[4841]: E0313 09:31:55.464205 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="900f1d10-4a56-461f-81cc-caea5f1b88c8" containerName="dnsmasq-dns" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.464220 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="900f1d10-4a56-461f-81cc-caea5f1b88c8" containerName="dnsmasq-dns" Mar 13 09:31:55 crc kubenswrapper[4841]: E0313 09:31:55.464241 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e627e4b-0a39-42af-be45-147ff230fd13" containerName="neutron-db-sync" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.464250 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e627e4b-0a39-42af-be45-147ff230fd13" containerName="neutron-db-sync" Mar 13 09:31:55 crc kubenswrapper[4841]: E0313 09:31:55.464321 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="900f1d10-4a56-461f-81cc-caea5f1b88c8" containerName="init" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.464331 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="900f1d10-4a56-461f-81cc-caea5f1b88c8" containerName="init" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.464515 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="900f1d10-4a56-461f-81cc-caea5f1b88c8" containerName="dnsmasq-dns" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.464540 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e627e4b-0a39-42af-be45-147ff230fd13" containerName="neutron-db-sync" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.465473 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.472966 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-9svmt"] Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.543828 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5864fb6bd-dr6nw"] Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.545208 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5864fb6bd-dr6nw" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.548848 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-75sb5" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.548956 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.554069 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.554239 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.558125 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5864fb6bd-dr6nw"] Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.574652 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-9svmt\" (UID: \"f24c7226-6207-4858-8369-e1496280d721\") " pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.574697 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-config\") pod \"dnsmasq-dns-55f844cf75-9svmt\" (UID: \"f24c7226-6207-4858-8369-e1496280d721\") " pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.574946 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-dns-svc\") pod \"dnsmasq-dns-55f844cf75-9svmt\" (UID: 
\"f24c7226-6207-4858-8369-e1496280d721\") " pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.574979 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-9svmt\" (UID: \"f24c7226-6207-4858-8369-e1496280d721\") " pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.574997 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msmf4\" (UniqueName: \"kubernetes.io/projected/f24c7226-6207-4858-8369-e1496280d721-kube-api-access-msmf4\") pod \"dnsmasq-dns-55f844cf75-9svmt\" (UID: \"f24c7226-6207-4858-8369-e1496280d721\") " pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.575046 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-9svmt\" (UID: \"f24c7226-6207-4858-8369-e1496280d721\") " pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.676712 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-combined-ca-bundle\") pod \"neutron-5864fb6bd-dr6nw\" (UID: \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\") " pod="openstack/neutron-5864fb6bd-dr6nw" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.677076 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-config\") 
pod \"neutron-5864fb6bd-dr6nw\" (UID: \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\") " pod="openstack/neutron-5864fb6bd-dr6nw" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.677126 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-9svmt\" (UID: \"f24c7226-6207-4858-8369-e1496280d721\") " pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.677160 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-config\") pod \"dnsmasq-dns-55f844cf75-9svmt\" (UID: \"f24c7226-6207-4858-8369-e1496280d721\") " pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.677184 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jndnp\" (UniqueName: \"kubernetes.io/projected/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-kube-api-access-jndnp\") pod \"neutron-5864fb6bd-dr6nw\" (UID: \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\") " pod="openstack/neutron-5864fb6bd-dr6nw" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.677254 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-httpd-config\") pod \"neutron-5864fb6bd-dr6nw\" (UID: \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\") " pod="openstack/neutron-5864fb6bd-dr6nw" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.677317 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-dns-svc\") pod \"dnsmasq-dns-55f844cf75-9svmt\" (UID: 
\"f24c7226-6207-4858-8369-e1496280d721\") " pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.677343 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-9svmt\" (UID: \"f24c7226-6207-4858-8369-e1496280d721\") " pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.677366 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msmf4\" (UniqueName: \"kubernetes.io/projected/f24c7226-6207-4858-8369-e1496280d721-kube-api-access-msmf4\") pod \"dnsmasq-dns-55f844cf75-9svmt\" (UID: \"f24c7226-6207-4858-8369-e1496280d721\") " pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.677453 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-9svmt\" (UID: \"f24c7226-6207-4858-8369-e1496280d721\") " pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.677498 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-ovndb-tls-certs\") pod \"neutron-5864fb6bd-dr6nw\" (UID: \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\") " pod="openstack/neutron-5864fb6bd-dr6nw" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.678105 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-9svmt\" (UID: 
\"f24c7226-6207-4858-8369-e1496280d721\") " pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.678110 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-9svmt\" (UID: \"f24c7226-6207-4858-8369-e1496280d721\") " pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.678248 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-config\") pod \"dnsmasq-dns-55f844cf75-9svmt\" (UID: \"f24c7226-6207-4858-8369-e1496280d721\") " pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.678863 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-dns-svc\") pod \"dnsmasq-dns-55f844cf75-9svmt\" (UID: \"f24c7226-6207-4858-8369-e1496280d721\") " pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.678894 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-9svmt\" (UID: \"f24c7226-6207-4858-8369-e1496280d721\") " pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.695532 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msmf4\" (UniqueName: \"kubernetes.io/projected/f24c7226-6207-4858-8369-e1496280d721-kube-api-access-msmf4\") pod \"dnsmasq-dns-55f844cf75-9svmt\" (UID: \"f24c7226-6207-4858-8369-e1496280d721\") " pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 
09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.778498 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-ovndb-tls-certs\") pod \"neutron-5864fb6bd-dr6nw\" (UID: \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\") " pod="openstack/neutron-5864fb6bd-dr6nw" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.778539 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-combined-ca-bundle\") pod \"neutron-5864fb6bd-dr6nw\" (UID: \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\") " pod="openstack/neutron-5864fb6bd-dr6nw" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.778559 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-config\") pod \"neutron-5864fb6bd-dr6nw\" (UID: \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\") " pod="openstack/neutron-5864fb6bd-dr6nw" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.778598 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jndnp\" (UniqueName: \"kubernetes.io/projected/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-kube-api-access-jndnp\") pod \"neutron-5864fb6bd-dr6nw\" (UID: \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\") " pod="openstack/neutron-5864fb6bd-dr6nw" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.778653 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-httpd-config\") pod \"neutron-5864fb6bd-dr6nw\" (UID: \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\") " pod="openstack/neutron-5864fb6bd-dr6nw" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.782966 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-httpd-config\") pod \"neutron-5864fb6bd-dr6nw\" (UID: \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\") " pod="openstack/neutron-5864fb6bd-dr6nw" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.783625 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-ovndb-tls-certs\") pod \"neutron-5864fb6bd-dr6nw\" (UID: \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\") " pod="openstack/neutron-5864fb6bd-dr6nw" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.785953 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-config\") pod \"neutron-5864fb6bd-dr6nw\" (UID: \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\") " pod="openstack/neutron-5864fb6bd-dr6nw" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.791914 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-combined-ca-bundle\") pod \"neutron-5864fb6bd-dr6nw\" (UID: \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\") " pod="openstack/neutron-5864fb6bd-dr6nw" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.796741 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jndnp\" (UniqueName: \"kubernetes.io/projected/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-kube-api-access-jndnp\") pod \"neutron-5864fb6bd-dr6nw\" (UID: \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\") " pod="openstack/neutron-5864fb6bd-dr6nw" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.799508 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 09:31:55 crc kubenswrapper[4841]: I0313 09:31:55.899940 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5864fb6bd-dr6nw" Mar 13 09:31:55 crc kubenswrapper[4841]: E0313 09:31:55.922567 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-9nnr7" podUID="5d602e6f-77e5-4496-b426-2c003dad63e4" Mar 13 09:31:56 crc kubenswrapper[4841]: I0313 09:31:56.005458 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="900f1d10-4a56-461f-81cc-caea5f1b88c8" path="/var/lib/kubelet/pods/900f1d10-4a56-461f-81cc-caea5f1b88c8/volumes" Mar 13 09:31:56 crc kubenswrapper[4841]: I0313 09:31:56.033330 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 09:31:56 crc kubenswrapper[4841]: I0313 09:31:56.153442 4841 scope.go:117] "RemoveContainer" containerID="af4045b148d6e0d91645e878fe95ff87e1b0e01937703e0579c6bb0d618dc613" Mar 13 09:31:56 crc kubenswrapper[4841]: I0313 09:31:56.159010 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 09:31:56 crc kubenswrapper[4841]: I0313 09:31:56.216019 4841 scope.go:117] "RemoveContainer" containerID="cee69fe8c28d2b81682e9ba0d48495a9116e0d5a3d1c6c11623b5e353e338ddb" Mar 13 09:31:56 crc kubenswrapper[4841]: I0313 09:31:56.268512 4841 scope.go:117] "RemoveContainer" containerID="0cfaa38915f28a779a65cadb53ed4cc3bfa3be02817bb08c8e2a4adf625aee83" Mar 13 09:31:56 crc kubenswrapper[4841]: I0313 09:31:56.508632 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ppjwg"] Mar 13 09:31:56 crc kubenswrapper[4841]: W0313 09:31:56.517065 4841 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a64aff9_f7c3_48cb_8c68_3d0b2208a53e.slice/crio-5e1db8e902a60f3499bc093e406cf8546da3236a01b0b78a8a7e2af76024d98a WatchSource:0}: Error finding container 5e1db8e902a60f3499bc093e406cf8546da3236a01b0b78a8a7e2af76024d98a: Status 404 returned error can't find the container with id 5e1db8e902a60f3499bc093e406cf8546da3236a01b0b78a8a7e2af76024d98a Mar 13 09:31:56 crc kubenswrapper[4841]: I0313 09:31:56.617308 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-9svmt"] Mar 13 09:31:56 crc kubenswrapper[4841]: I0313 09:31:56.743950 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5864fb6bd-dr6nw"] Mar 13 09:31:56 crc kubenswrapper[4841]: W0313 09:31:56.784406 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf5e75f4_273c_49c5_afda_9483dfcf1ff3.slice/crio-b0aaf2082044be834887166936aece66a364f6b23ad67cc98420425062fd1e90 WatchSource:0}: Error finding container b0aaf2082044be834887166936aece66a364f6b23ad67cc98420425062fd1e90: Status 404 returned error can't find the container with id b0aaf2082044be834887166936aece66a364f6b23ad67cc98420425062fd1e90 Mar 13 09:31:56 crc kubenswrapper[4841]: I0313 09:31:56.945244 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ppjwg" event={"ID":"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e","Type":"ContainerStarted","Data":"4603393b1be8277f434e2375f8309099e7b56ec64f63b0d5b6afa699ed810ff5"} Mar 13 09:31:56 crc kubenswrapper[4841]: I0313 09:31:56.945311 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ppjwg" event={"ID":"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e","Type":"ContainerStarted","Data":"5e1db8e902a60f3499bc093e406cf8546da3236a01b0b78a8a7e2af76024d98a"} Mar 13 09:31:56 crc kubenswrapper[4841]: 
I0313 09:31:56.979075 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ef6730be-d6cf-42b1-b356-fa08748e42ef","Type":"ContainerStarted","Data":"275545c67d37e760b7e7d7af8e82b47daf54424a8ce5cef415bd80e89f140824"} Mar 13 09:31:56 crc kubenswrapper[4841]: I0313 09:31:56.987857 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-2g4hc" podUID="900f1d10-4a56-461f-81cc-caea5f1b88c8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: i/o timeout" Mar 13 09:31:56 crc kubenswrapper[4841]: I0313 09:31:56.989980 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"300a1817-5e08-4edc-afec-829b69b0e7e9","Type":"ContainerStarted","Data":"9e1330bfb437d4bbd0e86a26fb947279cb2d7c6596401c64b436c137ec452db7"} Mar 13 09:31:56 crc kubenswrapper[4841]: I0313 09:31:56.999388 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ppjwg" podStartSLOduration=10.999370154 podStartE2EDuration="10.999370154s" podCreationTimestamp="2026-03-13 09:31:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:31:56.97896642 +0000 UTC m=+1199.708866631" watchObservedRunningTime="2026-03-13 09:31:56.999370154 +0000 UTC m=+1199.729270345" Mar 13 09:31:57 crc kubenswrapper[4841]: I0313 09:31:57.043096 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5864fb6bd-dr6nw" event={"ID":"cf5e75f4-273c-49c5-afda-9483dfcf1ff3","Type":"ContainerStarted","Data":"b0aaf2082044be834887166936aece66a364f6b23ad67cc98420425062fd1e90"} Mar 13 09:31:57 crc kubenswrapper[4841]: I0313 09:31:57.066478 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-9svmt" 
event={"ID":"f24c7226-6207-4858-8369-e1496280d721","Type":"ContainerStarted","Data":"5ed1e5018d8c41ac41624d4bd87f984ce55976c1f718baea73d84c1c6b2ae316"} Mar 13 09:31:57 crc kubenswrapper[4841]: I0313 09:31:57.089440 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72841340-b4e1-4283-8eb0-10641fb61f62","Type":"ContainerStarted","Data":"985af6398752095c1c7e6a1c79b4cc708bcc3b0865cdd0fa2b70a0fc724088d2"} Mar 13 09:31:57 crc kubenswrapper[4841]: I0313 09:31:57.105093 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-phwrx" event={"ID":"cb20acbd-2346-46cd-baba-089a6afed51b","Type":"ContainerStarted","Data":"b8eac2a4b576d9cd640a3e082ae0e384068b44355945524f80c4718c5282858f"} Mar 13 09:31:57 crc kubenswrapper[4841]: I0313 09:31:57.157154 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-phwrx" podStartSLOduration=3.305873738 podStartE2EDuration="29.157132987s" podCreationTimestamp="2026-03-13 09:31:28 +0000 UTC" firstStartedPulling="2026-03-13 09:31:29.478411289 +0000 UTC m=+1172.208311480" lastFinishedPulling="2026-03-13 09:31:55.329670538 +0000 UTC m=+1198.059570729" observedRunningTime="2026-03-13 09:31:57.151960429 +0000 UTC m=+1199.881860620" watchObservedRunningTime="2026-03-13 09:31:57.157132987 +0000 UTC m=+1199.887033178" Mar 13 09:31:57 crc kubenswrapper[4841]: I0313 09:31:57.739253 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-798495c9df-7c5cf"] Mar 13 09:31:57 crc kubenswrapper[4841]: I0313 09:31:57.744370 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:57 crc kubenswrapper[4841]: I0313 09:31:57.753397 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 13 09:31:57 crc kubenswrapper[4841]: I0313 09:31:57.754031 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 13 09:31:57 crc kubenswrapper[4841]: I0313 09:31:57.765474 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-798495c9df-7c5cf"] Mar 13 09:31:57 crc kubenswrapper[4841]: I0313 09:31:57.932319 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-config\") pod \"neutron-798495c9df-7c5cf\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:57 crc kubenswrapper[4841]: I0313 09:31:57.932391 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-public-tls-certs\") pod \"neutron-798495c9df-7c5cf\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:57 crc kubenswrapper[4841]: I0313 09:31:57.932432 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-ovndb-tls-certs\") pod \"neutron-798495c9df-7c5cf\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:57 crc kubenswrapper[4841]: I0313 09:31:57.932500 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-combined-ca-bundle\") pod \"neutron-798495c9df-7c5cf\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:57 crc kubenswrapper[4841]: I0313 09:31:57.932655 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-httpd-config\") pod \"neutron-798495c9df-7c5cf\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:57 crc kubenswrapper[4841]: I0313 09:31:57.932792 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-internal-tls-certs\") pod \"neutron-798495c9df-7c5cf\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:57 crc kubenswrapper[4841]: I0313 09:31:57.932880 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sbg2\" (UniqueName: \"kubernetes.io/projected/2a0f5823-2dff-4614-974e-7ebdc083a570-kube-api-access-9sbg2\") pod \"neutron-798495c9df-7c5cf\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.034867 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-internal-tls-certs\") pod \"neutron-798495c9df-7c5cf\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.034957 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sbg2\" (UniqueName: 
\"kubernetes.io/projected/2a0f5823-2dff-4614-974e-7ebdc083a570-kube-api-access-9sbg2\") pod \"neutron-798495c9df-7c5cf\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.035058 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-config\") pod \"neutron-798495c9df-7c5cf\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.035090 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-public-tls-certs\") pod \"neutron-798495c9df-7c5cf\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.035143 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-ovndb-tls-certs\") pod \"neutron-798495c9df-7c5cf\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.035194 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-combined-ca-bundle\") pod \"neutron-798495c9df-7c5cf\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.035224 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-httpd-config\") pod 
\"neutron-798495c9df-7c5cf\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.052153 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-httpd-config\") pod \"neutron-798495c9df-7c5cf\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.054135 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.056815 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.057088 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-combined-ca-bundle\") pod \"neutron-798495c9df-7c5cf\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.060833 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-ovndb-tls-certs\") pod \"neutron-798495c9df-7c5cf\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.070279 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-config\") pod \"neutron-798495c9df-7c5cf\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.086958 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sbg2\" (UniqueName: \"kubernetes.io/projected/2a0f5823-2dff-4614-974e-7ebdc083a570-kube-api-access-9sbg2\") pod \"neutron-798495c9df-7c5cf\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.087505 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-internal-tls-certs\") pod \"neutron-798495c9df-7c5cf\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.091593 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-public-tls-certs\") pod \"neutron-798495c9df-7c5cf\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.113060 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.136522 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ef6730be-d6cf-42b1-b356-fa08748e42ef","Type":"ContainerStarted","Data":"390f7c8080013656a2d8669a107436cce0d8b4d4ef8b4bcf595508ac814735b2"} Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.136568 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ef6730be-d6cf-42b1-b356-fa08748e42ef","Type":"ContainerStarted","Data":"08707d05ffe742c3f5c6c21e5f25a34d24910becd2d02ad5be4eab681eb31658"} Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.154944 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"300a1817-5e08-4edc-afec-829b69b0e7e9","Type":"ContainerStarted","Data":"54dfd791df3aa8deab0251901ef093886d75e89f47a1e6d147dc35492a1070f9"} Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.155350 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"300a1817-5e08-4edc-afec-829b69b0e7e9","Type":"ContainerStarted","Data":"2255d926cf9493c05b71b77ba5a003c184fe7a6d72800a5472d5805356137e69"} Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.169724 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.169764 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.178245 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=19.178217047 podStartE2EDuration="19.178217047s" podCreationTimestamp="2026-03-13 09:31:39 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:31:58.168241683 +0000 UTC m=+1200.898141884" watchObservedRunningTime="2026-03-13 09:31:58.178217047 +0000 UTC m=+1200.908117238" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.181172 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5864fb6bd-dr6nw" event={"ID":"cf5e75f4-273c-49c5-afda-9483dfcf1ff3","Type":"ContainerStarted","Data":"0fcd783f0b79c7ed83ed485507c30a3aaeeb91acfbc08b832607177884e12c31"} Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.181210 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5864fb6bd-dr6nw" event={"ID":"cf5e75f4-273c-49c5-afda-9483dfcf1ff3","Type":"ContainerStarted","Data":"529a9c9ded573845cf459d9f5c8b035f9c2483362f12bb2aac07150f813ae2f1"} Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.182038 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5864fb6bd-dr6nw" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.225606 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=21.225589235 podStartE2EDuration="21.225589235s" podCreationTimestamp="2026-03-13 09:31:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:31:58.217744596 +0000 UTC m=+1200.947644787" watchObservedRunningTime="2026-03-13 09:31:58.225589235 +0000 UTC m=+1200.955489426" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.229603 4841 generic.go:334] "Generic (PLEG): container finished" podID="f24c7226-6207-4858-8369-e1496280d721" containerID="9eb34aa0106d0f8b6c5ef4b6d9becaaaf41a863a9f963b82bde7b49296235814" exitCode=0 Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.229714 4841 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-55f844cf75-9svmt" event={"ID":"f24c7226-6207-4858-8369-e1496280d721","Type":"ContainerDied","Data":"9eb34aa0106d0f8b6c5ef4b6d9becaaaf41a863a9f963b82bde7b49296235814"} Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.281945 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5864fb6bd-dr6nw" podStartSLOduration=3.281921087 podStartE2EDuration="3.281921087s" podCreationTimestamp="2026-03-13 09:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:31:58.252844558 +0000 UTC m=+1200.982744769" watchObservedRunningTime="2026-03-13 09:31:58.281921087 +0000 UTC m=+1201.011821278" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.290256 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.367437 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 09:31:58 crc kubenswrapper[4841]: I0313 09:31:58.793783 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-798495c9df-7c5cf"] Mar 13 09:31:59 crc kubenswrapper[4841]: I0313 09:31:59.262510 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-9svmt" event={"ID":"f24c7226-6207-4858-8369-e1496280d721","Type":"ContainerStarted","Data":"a568b41d415d7d2d78a4d3d15e9ece9c4cbf418823c37217498923afa94cb0d3"} Mar 13 09:31:59 crc kubenswrapper[4841]: I0313 09:31:59.262826 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 09:31:59 crc kubenswrapper[4841]: I0313 09:31:59.270186 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-798495c9df-7c5cf" 
event={"ID":"2a0f5823-2dff-4614-974e-7ebdc083a570","Type":"ContainerStarted","Data":"6b7b3f5d819e23d5151c75c043b419ac10c089047b3e246534c2b6c68881ef5a"} Mar 13 09:31:59 crc kubenswrapper[4841]: I0313 09:31:59.270228 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-798495c9df-7c5cf" event={"ID":"2a0f5823-2dff-4614-974e-7ebdc083a570","Type":"ContainerStarted","Data":"52441eb6173774442e65a08ceeeb90ba11936b4e9ddda6a2fb0eb4b9f97a4ff7"} Mar 13 09:31:59 crc kubenswrapper[4841]: I0313 09:31:59.272520 4841 generic.go:334] "Generic (PLEG): container finished" podID="cb20acbd-2346-46cd-baba-089a6afed51b" containerID="b8eac2a4b576d9cd640a3e082ae0e384068b44355945524f80c4718c5282858f" exitCode=0 Mar 13 09:31:59 crc kubenswrapper[4841]: I0313 09:31:59.272616 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-phwrx" event={"ID":"cb20acbd-2346-46cd-baba-089a6afed51b","Type":"ContainerDied","Data":"b8eac2a4b576d9cd640a3e082ae0e384068b44355945524f80c4718c5282858f"} Mar 13 09:31:59 crc kubenswrapper[4841]: I0313 09:31:59.273475 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 09:31:59 crc kubenswrapper[4841]: I0313 09:31:59.273502 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 09:31:59 crc kubenswrapper[4841]: I0313 09:31:59.280521 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-9svmt" podStartSLOduration=4.28050626 podStartE2EDuration="4.28050626s" podCreationTimestamp="2026-03-13 09:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:31:59.278965903 +0000 UTC m=+1202.008866094" watchObservedRunningTime="2026-03-13 09:31:59.28050626 +0000 UTC m=+1202.010406451" Mar 13 09:32:00 crc kubenswrapper[4841]: I0313 
09:32:00.136616 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556572-j79gt"] Mar 13 09:32:00 crc kubenswrapper[4841]: I0313 09:32:00.137976 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556572-j79gt" Mar 13 09:32:00 crc kubenswrapper[4841]: I0313 09:32:00.142209 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 09:32:00 crc kubenswrapper[4841]: I0313 09:32:00.142385 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 09:32:00 crc kubenswrapper[4841]: I0313 09:32:00.142401 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 09:32:00 crc kubenswrapper[4841]: I0313 09:32:00.144218 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556572-j79gt"] Mar 13 09:32:00 crc kubenswrapper[4841]: I0313 09:32:00.185724 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz4sh\" (UniqueName: \"kubernetes.io/projected/b49b5f8c-aca3-4a6f-9edc-cb8485f40c9b-kube-api-access-pz4sh\") pod \"auto-csr-approver-29556572-j79gt\" (UID: \"b49b5f8c-aca3-4a6f-9edc-cb8485f40c9b\") " pod="openshift-infra/auto-csr-approver-29556572-j79gt" Mar 13 09:32:00 crc kubenswrapper[4841]: I0313 09:32:00.226975 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 09:32:00 crc kubenswrapper[4841]: I0313 09:32:00.227032 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 09:32:00 crc kubenswrapper[4841]: I0313 09:32:00.252974 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-external-api-0" Mar 13 09:32:00 crc kubenswrapper[4841]: I0313 09:32:00.270248 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 09:32:00 crc kubenswrapper[4841]: I0313 09:32:00.284033 4841 generic.go:334] "Generic (PLEG): container finished" podID="0a64aff9-f7c3-48cb-8c68-3d0b2208a53e" containerID="4603393b1be8277f434e2375f8309099e7b56ec64f63b0d5b6afa699ed810ff5" exitCode=0 Mar 13 09:32:00 crc kubenswrapper[4841]: I0313 09:32:00.284176 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ppjwg" event={"ID":"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e","Type":"ContainerDied","Data":"4603393b1be8277f434e2375f8309099e7b56ec64f63b0d5b6afa699ed810ff5"} Mar 13 09:32:00 crc kubenswrapper[4841]: I0313 09:32:00.286993 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz4sh\" (UniqueName: \"kubernetes.io/projected/b49b5f8c-aca3-4a6f-9edc-cb8485f40c9b-kube-api-access-pz4sh\") pod \"auto-csr-approver-29556572-j79gt\" (UID: \"b49b5f8c-aca3-4a6f-9edc-cb8485f40c9b\") " pod="openshift-infra/auto-csr-approver-29556572-j79gt" Mar 13 09:32:00 crc kubenswrapper[4841]: I0313 09:32:00.287054 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-798495c9df-7c5cf" event={"ID":"2a0f5823-2dff-4614-974e-7ebdc083a570","Type":"ContainerStarted","Data":"d99831d4df233f3a2e7ac3ef1daebd6d22819dd557782d994774e5f87b3469ff"} Mar 13 09:32:00 crc kubenswrapper[4841]: I0313 09:32:00.287947 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:32:00 crc kubenswrapper[4841]: I0313 09:32:00.288777 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 09:32:00 crc kubenswrapper[4841]: I0313 09:32:00.288794 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/glance-default-external-api-0" Mar 13 09:32:00 crc kubenswrapper[4841]: I0313 09:32:00.318550 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz4sh\" (UniqueName: \"kubernetes.io/projected/b49b5f8c-aca3-4a6f-9edc-cb8485f40c9b-kube-api-access-pz4sh\") pod \"auto-csr-approver-29556572-j79gt\" (UID: \"b49b5f8c-aca3-4a6f-9edc-cb8485f40c9b\") " pod="openshift-infra/auto-csr-approver-29556572-j79gt" Mar 13 09:32:00 crc kubenswrapper[4841]: I0313 09:32:00.330707 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-798495c9df-7c5cf" podStartSLOduration=3.33068424 podStartE2EDuration="3.33068424s" podCreationTimestamp="2026-03-13 09:31:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:32:00.318477328 +0000 UTC m=+1203.048377519" watchObservedRunningTime="2026-03-13 09:32:00.33068424 +0000 UTC m=+1203.060584431" Mar 13 09:32:00 crc kubenswrapper[4841]: I0313 09:32:00.464414 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556572-j79gt" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.200676 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-phwrx" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.294683 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-phwrx" event={"ID":"cb20acbd-2346-46cd-baba-089a6afed51b","Type":"ContainerDied","Data":"55d061398e701f686edf01ee875511b4193db71c211fa764da073dddd6514f35"} Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.294708 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-phwrx" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.294725 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55d061398e701f686edf01ee875511b4193db71c211fa764da073dddd6514f35" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.323671 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb20acbd-2346-46cd-baba-089a6afed51b-combined-ca-bundle\") pod \"cb20acbd-2346-46cd-baba-089a6afed51b\" (UID: \"cb20acbd-2346-46cd-baba-089a6afed51b\") " Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.323905 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb20acbd-2346-46cd-baba-089a6afed51b-config-data\") pod \"cb20acbd-2346-46cd-baba-089a6afed51b\" (UID: \"cb20acbd-2346-46cd-baba-089a6afed51b\") " Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.323968 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb20acbd-2346-46cd-baba-089a6afed51b-scripts\") pod \"cb20acbd-2346-46cd-baba-089a6afed51b\" (UID: \"cb20acbd-2346-46cd-baba-089a6afed51b\") " Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.324008 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb20acbd-2346-46cd-baba-089a6afed51b-logs\") pod \"cb20acbd-2346-46cd-baba-089a6afed51b\" (UID: \"cb20acbd-2346-46cd-baba-089a6afed51b\") " Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.324054 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzqbd\" (UniqueName: \"kubernetes.io/projected/cb20acbd-2346-46cd-baba-089a6afed51b-kube-api-access-tzqbd\") pod \"cb20acbd-2346-46cd-baba-089a6afed51b\" (UID: 
\"cb20acbd-2346-46cd-baba-089a6afed51b\") " Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.327531 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb20acbd-2346-46cd-baba-089a6afed51b-logs" (OuterVolumeSpecName: "logs") pod "cb20acbd-2346-46cd-baba-089a6afed51b" (UID: "cb20acbd-2346-46cd-baba-089a6afed51b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.334215 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb20acbd-2346-46cd-baba-089a6afed51b-kube-api-access-tzqbd" (OuterVolumeSpecName: "kube-api-access-tzqbd") pod "cb20acbd-2346-46cd-baba-089a6afed51b" (UID: "cb20acbd-2346-46cd-baba-089a6afed51b"). InnerVolumeSpecName "kube-api-access-tzqbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.342505 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb20acbd-2346-46cd-baba-089a6afed51b-scripts" (OuterVolumeSpecName: "scripts") pod "cb20acbd-2346-46cd-baba-089a6afed51b" (UID: "cb20acbd-2346-46cd-baba-089a6afed51b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.377515 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb20acbd-2346-46cd-baba-089a6afed51b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb20acbd-2346-46cd-baba-089a6afed51b" (UID: "cb20acbd-2346-46cd-baba-089a6afed51b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.403291 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6969d7c4d8-xrfbc"] Mar 13 09:32:01 crc kubenswrapper[4841]: E0313 09:32:01.403973 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb20acbd-2346-46cd-baba-089a6afed51b" containerName="placement-db-sync" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.403994 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb20acbd-2346-46cd-baba-089a6afed51b" containerName="placement-db-sync" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.404158 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb20acbd-2346-46cd-baba-089a6afed51b" containerName="placement-db-sync" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.405077 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6969d7c4d8-xrfbc" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.411251 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.411486 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.452883 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb20acbd-2346-46cd-baba-089a6afed51b-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.452936 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb20acbd-2346-46cd-baba-089a6afed51b-logs\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.452950 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzqbd\" 
(UniqueName: \"kubernetes.io/projected/cb20acbd-2346-46cd-baba-089a6afed51b-kube-api-access-tzqbd\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.452964 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb20acbd-2346-46cd-baba-089a6afed51b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.454636 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6969d7c4d8-xrfbc"] Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.460415 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb20acbd-2346-46cd-baba-089a6afed51b-config-data" (OuterVolumeSpecName: "config-data") pod "cb20acbd-2346-46cd-baba-089a6afed51b" (UID: "cb20acbd-2346-46cd-baba-089a6afed51b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.562133 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b97q7\" (UniqueName: \"kubernetes.io/projected/eb74819f-9ae9-498b-88b5-f0fcaf598409-kube-api-access-b97q7\") pod \"placement-6969d7c4d8-xrfbc\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " pod="openstack/placement-6969d7c4d8-xrfbc" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.562178 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-internal-tls-certs\") pod \"placement-6969d7c4d8-xrfbc\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " pod="openstack/placement-6969d7c4d8-xrfbc" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.562244 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/eb74819f-9ae9-498b-88b5-f0fcaf598409-logs\") pod \"placement-6969d7c4d8-xrfbc\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " pod="openstack/placement-6969d7c4d8-xrfbc" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.562292 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-public-tls-certs\") pod \"placement-6969d7c4d8-xrfbc\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " pod="openstack/placement-6969d7c4d8-xrfbc" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.562339 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-combined-ca-bundle\") pod \"placement-6969d7c4d8-xrfbc\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " pod="openstack/placement-6969d7c4d8-xrfbc" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.562407 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-scripts\") pod \"placement-6969d7c4d8-xrfbc\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " pod="openstack/placement-6969d7c4d8-xrfbc" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.562434 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-config-data\") pod \"placement-6969d7c4d8-xrfbc\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " pod="openstack/placement-6969d7c4d8-xrfbc" Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.562483 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cb20acbd-2346-46cd-baba-089a6afed51b-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.663561 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-public-tls-certs\") pod \"placement-6969d7c4d8-xrfbc\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " pod="openstack/placement-6969d7c4d8-xrfbc"
Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.663897 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-combined-ca-bundle\") pod \"placement-6969d7c4d8-xrfbc\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " pod="openstack/placement-6969d7c4d8-xrfbc"
Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.663965 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-scripts\") pod \"placement-6969d7c4d8-xrfbc\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " pod="openstack/placement-6969d7c4d8-xrfbc"
Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.663997 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-config-data\") pod \"placement-6969d7c4d8-xrfbc\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " pod="openstack/placement-6969d7c4d8-xrfbc"
Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.664033 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b97q7\" (UniqueName: \"kubernetes.io/projected/eb74819f-9ae9-498b-88b5-f0fcaf598409-kube-api-access-b97q7\") pod \"placement-6969d7c4d8-xrfbc\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " pod="openstack/placement-6969d7c4d8-xrfbc"
Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.664051 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-internal-tls-certs\") pod \"placement-6969d7c4d8-xrfbc\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " pod="openstack/placement-6969d7c4d8-xrfbc"
Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.664075 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb74819f-9ae9-498b-88b5-f0fcaf598409-logs\") pod \"placement-6969d7c4d8-xrfbc\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " pod="openstack/placement-6969d7c4d8-xrfbc"
Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.664473 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb74819f-9ae9-498b-88b5-f0fcaf598409-logs\") pod \"placement-6969d7c4d8-xrfbc\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " pod="openstack/placement-6969d7c4d8-xrfbc"
Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.669429 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-config-data\") pod \"placement-6969d7c4d8-xrfbc\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " pod="openstack/placement-6969d7c4d8-xrfbc"
Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.669698 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-scripts\") pod \"placement-6969d7c4d8-xrfbc\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " pod="openstack/placement-6969d7c4d8-xrfbc"
Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.671408 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-internal-tls-certs\") pod \"placement-6969d7c4d8-xrfbc\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " pod="openstack/placement-6969d7c4d8-xrfbc"
Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.672415 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-combined-ca-bundle\") pod \"placement-6969d7c4d8-xrfbc\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " pod="openstack/placement-6969d7c4d8-xrfbc"
Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.679285 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b97q7\" (UniqueName: \"kubernetes.io/projected/eb74819f-9ae9-498b-88b5-f0fcaf598409-kube-api-access-b97q7\") pod \"placement-6969d7c4d8-xrfbc\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " pod="openstack/placement-6969d7c4d8-xrfbc"
Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.680381 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-public-tls-certs\") pod \"placement-6969d7c4d8-xrfbc\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " pod="openstack/placement-6969d7c4d8-xrfbc"
Mar 13 09:32:01 crc kubenswrapper[4841]: I0313 09:32:01.779642 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6969d7c4d8-xrfbc"
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.313329 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ppjwg" event={"ID":"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e","Type":"ContainerDied","Data":"5e1db8e902a60f3499bc093e406cf8546da3236a01b0b78a8a7e2af76024d98a"}
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.314014 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e1db8e902a60f3499bc093e406cf8546da3236a01b0b78a8a7e2af76024d98a"
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.440418 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ppjwg"
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.494163 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-credential-keys\") pod \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") "
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.494464 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdnsw\" (UniqueName: \"kubernetes.io/projected/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-kube-api-access-mdnsw\") pod \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") "
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.494506 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-fernet-keys\") pod \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") "
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.494547 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-combined-ca-bundle\") pod \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") "
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.494635 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-config-data\") pod \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") "
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.494718 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-scripts\") pod \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\" (UID: \"0a64aff9-f7c3-48cb-8c68-3d0b2208a53e\") "
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.500770 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-kube-api-access-mdnsw" (OuterVolumeSpecName: "kube-api-access-mdnsw") pod "0a64aff9-f7c3-48cb-8c68-3d0b2208a53e" (UID: "0a64aff9-f7c3-48cb-8c68-3d0b2208a53e"). InnerVolumeSpecName "kube-api-access-mdnsw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.504441 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0a64aff9-f7c3-48cb-8c68-3d0b2208a53e" (UID: "0a64aff9-f7c3-48cb-8c68-3d0b2208a53e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.509662 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0a64aff9-f7c3-48cb-8c68-3d0b2208a53e" (UID: "0a64aff9-f7c3-48cb-8c68-3d0b2208a53e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.527427 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-scripts" (OuterVolumeSpecName: "scripts") pod "0a64aff9-f7c3-48cb-8c68-3d0b2208a53e" (UID: "0a64aff9-f7c3-48cb-8c68-3d0b2208a53e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.564831 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-config-data" (OuterVolumeSpecName: "config-data") pod "0a64aff9-f7c3-48cb-8c68-3d0b2208a53e" (UID: "0a64aff9-f7c3-48cb-8c68-3d0b2208a53e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.596216 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.596244 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.596255 4841 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.596324 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdnsw\" (UniqueName: \"kubernetes.io/projected/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-kube-api-access-mdnsw\") on node \"crc\" DevicePath \"\""
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.596337 4841 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.597004 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a64aff9-f7c3-48cb-8c68-3d0b2208a53e" (UID: "0a64aff9-f7c3-48cb-8c68-3d0b2208a53e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.634639 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.697102 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.760917 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556572-j79gt"]
Mar 13 09:32:03 crc kubenswrapper[4841]: W0313 09:32:03.763160 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb49b5f8c_aca3_4a6f_9edc_cb8485f40c9b.slice/crio-5041e715ccc59149a93bc1699071f4e6754425ac69347e57701c6362f0a7a120 WatchSource:0}: Error finding container 5041e715ccc59149a93bc1699071f4e6754425ac69347e57701c6362f0a7a120: Status 404 returned error can't find the container with id 5041e715ccc59149a93bc1699071f4e6754425ac69347e57701c6362f0a7a120
Mar 13 09:32:03 crc kubenswrapper[4841]: W0313 09:32:03.824606 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb74819f_9ae9_498b_88b5_f0fcaf598409.slice/crio-4d30927176cd1ac75a6dcb29675247f84ec9e1c6bcd222054d8bcce8eef7482e WatchSource:0}: Error finding container 4d30927176cd1ac75a6dcb29675247f84ec9e1c6bcd222054d8bcce8eef7482e: Status 404 returned error can't find the container with id 4d30927176cd1ac75a6dcb29675247f84ec9e1c6bcd222054d8bcce8eef7482e
Mar 13 09:32:03 crc kubenswrapper[4841]: I0313 09:32:03.827341 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6969d7c4d8-xrfbc"]
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.323799 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556572-j79gt" event={"ID":"b49b5f8c-aca3-4a6f-9edc-cb8485f40c9b","Type":"ContainerStarted","Data":"5041e715ccc59149a93bc1699071f4e6754425ac69347e57701c6362f0a7a120"}
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.333363 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gg7dx" event={"ID":"9132ca8c-f2de-4025-8462-4899276a8678","Type":"ContainerStarted","Data":"a17e09c453292655b494b3d66f0eac640aa78ef8872bc81089c5f45e55af8957"}
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.339427 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72841340-b4e1-4283-8eb0-10641fb61f62","Type":"ContainerStarted","Data":"76d49f0bc8f91afba0952b95da61a6d529c69e7e2cf226ea47c6cfd3b2ff22fc"}
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.341590 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ppjwg"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.342745 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6969d7c4d8-xrfbc" event={"ID":"eb74819f-9ae9-498b-88b5-f0fcaf598409","Type":"ContainerStarted","Data":"a6090fccb088678bc444a2049865222652ab3da7d036cb10796ce5e518630144"}
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.342813 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6969d7c4d8-xrfbc" event={"ID":"eb74819f-9ae9-498b-88b5-f0fcaf598409","Type":"ContainerStarted","Data":"4d30927176cd1ac75a6dcb29675247f84ec9e1c6bcd222054d8bcce8eef7482e"}
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.350822 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-gg7dx" podStartSLOduration=2.408615661 podStartE2EDuration="36.350807351s" podCreationTimestamp="2026-03-13 09:31:28 +0000 UTC" firstStartedPulling="2026-03-13 09:31:29.339782981 +0000 UTC m=+1172.069683172" lastFinishedPulling="2026-03-13 09:32:03.281974661 +0000 UTC m=+1206.011874862" observedRunningTime="2026-03-13 09:32:04.349599335 +0000 UTC m=+1207.079499576" watchObservedRunningTime="2026-03-13 09:32:04.350807351 +0000 UTC m=+1207.080707542"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.406089 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.406938 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.407007 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.627933 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7b87d5dbb8-7ppv5"]
Mar 13 09:32:04 crc kubenswrapper[4841]: E0313 09:32:04.628632 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a64aff9-f7c3-48cb-8c68-3d0b2208a53e" containerName="keystone-bootstrap"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.628653 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a64aff9-f7c3-48cb-8c68-3d0b2208a53e" containerName="keystone-bootstrap"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.628861 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a64aff9-f7c3-48cb-8c68-3d0b2208a53e" containerName="keystone-bootstrap"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.629484 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.631364 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.631612 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.631655 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-88dmb"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.634014 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.634177 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.634605 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.645946 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7b87d5dbb8-7ppv5"]
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.776206 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.823431 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scl9v\" (UniqueName: \"kubernetes.io/projected/9ef13028-1aeb-4a08-b241-fa033413b353-kube-api-access-scl9v\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.823508 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9ef13028-1aeb-4a08-b241-fa033413b353-credential-keys\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.823544 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef13028-1aeb-4a08-b241-fa033413b353-combined-ca-bundle\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.823572 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9ef13028-1aeb-4a08-b241-fa033413b353-fernet-keys\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.823592 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ef13028-1aeb-4a08-b241-fa033413b353-public-tls-certs\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.823609 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ef13028-1aeb-4a08-b241-fa033413b353-internal-tls-certs\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.823635 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ef13028-1aeb-4a08-b241-fa033413b353-config-data\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.823672 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ef13028-1aeb-4a08-b241-fa033413b353-scripts\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.924898 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scl9v\" (UniqueName: \"kubernetes.io/projected/9ef13028-1aeb-4a08-b241-fa033413b353-kube-api-access-scl9v\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.925472 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9ef13028-1aeb-4a08-b241-fa033413b353-credential-keys\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.925519 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef13028-1aeb-4a08-b241-fa033413b353-combined-ca-bundle\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.925547 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9ef13028-1aeb-4a08-b241-fa033413b353-fernet-keys\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.925567 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ef13028-1aeb-4a08-b241-fa033413b353-public-tls-certs\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.925586 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ef13028-1aeb-4a08-b241-fa033413b353-internal-tls-certs\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.925614 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ef13028-1aeb-4a08-b241-fa033413b353-config-data\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.925654 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ef13028-1aeb-4a08-b241-fa033413b353-scripts\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.932924 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ef13028-1aeb-4a08-b241-fa033413b353-public-tls-certs\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.936960 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9ef13028-1aeb-4a08-b241-fa033413b353-credential-keys\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.937877 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ef13028-1aeb-4a08-b241-fa033413b353-config-data\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.938100 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ef13028-1aeb-4a08-b241-fa033413b353-scripts\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.939967 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef13028-1aeb-4a08-b241-fa033413b353-combined-ca-bundle\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.941539 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9ef13028-1aeb-4a08-b241-fa033413b353-fernet-keys\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.942581 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ef13028-1aeb-4a08-b241-fa033413b353-internal-tls-certs\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.945835 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scl9v\" (UniqueName: \"kubernetes.io/projected/9ef13028-1aeb-4a08-b241-fa033413b353-kube-api-access-scl9v\") pod \"keystone-7b87d5dbb8-7ppv5\" (UID: \"9ef13028-1aeb-4a08-b241-fa033413b353\") " pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:04 crc kubenswrapper[4841]: I0313 09:32:04.949439 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:05 crc kubenswrapper[4841]: I0313 09:32:05.359526 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6969d7c4d8-xrfbc" event={"ID":"eb74819f-9ae9-498b-88b5-f0fcaf598409","Type":"ContainerStarted","Data":"7b7d1f6bad0b269af33da1c7ffa7a0ede51c42245e8ecce135496e72107d0e3e"}
Mar 13 09:32:05 crc kubenswrapper[4841]: I0313 09:32:05.387452 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6969d7c4d8-xrfbc" podStartSLOduration=4.387433227 podStartE2EDuration="4.387433227s" podCreationTimestamp="2026-03-13 09:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:32:05.381203967 +0000 UTC m=+1208.111104178" watchObservedRunningTime="2026-03-13 09:32:05.387433227 +0000 UTC m=+1208.117333418"
Mar 13 09:32:05 crc kubenswrapper[4841]: I0313 09:32:05.496089 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7b87d5dbb8-7ppv5"]
Mar 13 09:32:05 crc kubenswrapper[4841]: I0313 09:32:05.802331 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-9svmt"
Mar 13 09:32:05 crc kubenswrapper[4841]: I0313 09:32:05.857872 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pzf48"]
Mar 13 09:32:05 crc kubenswrapper[4841]: I0313 09:32:05.858102 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" podUID="489aeb87-1810-4eab-adbb-a0047e598344" containerName="dnsmasq-dns" containerID="cri-o://a2361b6a34a9cfff36a07843d7a39d655ab2abe7d7c2668e0a687c1216842c60" gracePeriod=10
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.412968 4841 generic.go:334] "Generic (PLEG): container finished" podID="489aeb87-1810-4eab-adbb-a0047e598344" containerID="a2361b6a34a9cfff36a07843d7a39d655ab2abe7d7c2668e0a687c1216842c60" exitCode=0
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.413033 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" event={"ID":"489aeb87-1810-4eab-adbb-a0047e598344","Type":"ContainerDied","Data":"a2361b6a34a9cfff36a07843d7a39d655ab2abe7d7c2668e0a687c1216842c60"}
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.423676 4841 generic.go:334] "Generic (PLEG): container finished" podID="b49b5f8c-aca3-4a6f-9edc-cb8485f40c9b" containerID="66bcc31aa9c1d0a7b9b151675fa339171f26d1c7fb743676512bbe50a5373c3f" exitCode=0
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.424011 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556572-j79gt" event={"ID":"b49b5f8c-aca3-4a6f-9edc-cb8485f40c9b","Type":"ContainerDied","Data":"66bcc31aa9c1d0a7b9b151675fa339171f26d1c7fb743676512bbe50a5373c3f"}
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.427759 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7b87d5dbb8-7ppv5" event={"ID":"9ef13028-1aeb-4a08-b241-fa033413b353","Type":"ContainerStarted","Data":"76bd68abf14d0a1e094a8a0644b6574475a108cfa9b0bf7895d6e89771367ef6"}
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.427790 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7b87d5dbb8-7ppv5" event={"ID":"9ef13028-1aeb-4a08-b241-fa033413b353","Type":"ContainerStarted","Data":"3001a71946af09ebcaa2447f8650cab617481fe8ad042610923d89017c65b616"}
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.428034 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6969d7c4d8-xrfbc"
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.428077 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6969d7c4d8-xrfbc"
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.466132 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7b87d5dbb8-7ppv5" podStartSLOduration=2.4661155790000002 podStartE2EDuration="2.466115579s" podCreationTimestamp="2026-03-13 09:32:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:32:06.455624147 +0000 UTC m=+1209.185524338" watchObservedRunningTime="2026-03-13 09:32:06.466115579 +0000 UTC m=+1209.196015770"
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.559993 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48"
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.667477 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-dns-swift-storage-0\") pod \"489aeb87-1810-4eab-adbb-a0047e598344\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") "
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.667624 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv56c\" (UniqueName: \"kubernetes.io/projected/489aeb87-1810-4eab-adbb-a0047e598344-kube-api-access-fv56c\") pod \"489aeb87-1810-4eab-adbb-a0047e598344\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") "
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.668606 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-ovsdbserver-sb\") pod \"489aeb87-1810-4eab-adbb-a0047e598344\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") "
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.668656 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-config\") pod \"489aeb87-1810-4eab-adbb-a0047e598344\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") "
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.668674 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-dns-svc\") pod \"489aeb87-1810-4eab-adbb-a0047e598344\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") "
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.668694 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-ovsdbserver-nb\") pod \"489aeb87-1810-4eab-adbb-a0047e598344\" (UID: \"489aeb87-1810-4eab-adbb-a0047e598344\") "
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.673440 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/489aeb87-1810-4eab-adbb-a0047e598344-kube-api-access-fv56c" (OuterVolumeSpecName: "kube-api-access-fv56c") pod "489aeb87-1810-4eab-adbb-a0047e598344" (UID: "489aeb87-1810-4eab-adbb-a0047e598344"). InnerVolumeSpecName "kube-api-access-fv56c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.709613 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "489aeb87-1810-4eab-adbb-a0047e598344" (UID: "489aeb87-1810-4eab-adbb-a0047e598344"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.721632 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "489aeb87-1810-4eab-adbb-a0047e598344" (UID: "489aeb87-1810-4eab-adbb-a0047e598344"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.732769 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "489aeb87-1810-4eab-adbb-a0047e598344" (UID: "489aeb87-1810-4eab-adbb-a0047e598344"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.739891 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-config" (OuterVolumeSpecName: "config") pod "489aeb87-1810-4eab-adbb-a0047e598344" (UID: "489aeb87-1810-4eab-adbb-a0047e598344"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.753819 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "489aeb87-1810-4eab-adbb-a0047e598344" (UID: "489aeb87-1810-4eab-adbb-a0047e598344"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.770929 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.770973 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv56c\" (UniqueName: \"kubernetes.io/projected/489aeb87-1810-4eab-adbb-a0047e598344-kube-api-access-fv56c\") on node \"crc\" DevicePath \"\""
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.770986 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.770996 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-config\") on node \"crc\" DevicePath \"\""
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.771007 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 13 09:32:06 crc kubenswrapper[4841]: I0313 09:32:06.771018 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/489aeb87-1810-4eab-adbb-a0047e598344-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 13 09:32:07 crc kubenswrapper[4841]: I0313 09:32:07.438222 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48"
Mar 13 09:32:07 crc kubenswrapper[4841]: I0313 09:32:07.442322 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pzf48" event={"ID":"489aeb87-1810-4eab-adbb-a0047e598344","Type":"ContainerDied","Data":"ff0c58ea729ca5a5c91beaafe247c89a850dbe352ef191bd40bc8f7aa3facd20"}
Mar 13 09:32:07 crc kubenswrapper[4841]: I0313 09:32:07.442433 4841 scope.go:117] "RemoveContainer" containerID="a2361b6a34a9cfff36a07843d7a39d655ab2abe7d7c2668e0a687c1216842c60"
Mar 13 09:32:07 crc kubenswrapper[4841]: I0313 09:32:07.445000 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7b87d5dbb8-7ppv5"
Mar 13 09:32:07 crc kubenswrapper[4841]: I0313 09:32:07.473208 4841 scope.go:117] "RemoveContainer" containerID="4a7f6c9f38ac2e98f09d87fb8346377e5bc3202134f1fdbdc915cbd10a5b114e"
Mar 13 09:32:07 crc kubenswrapper[4841]: I0313 09:32:07.478282 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pzf48"]
Mar 13 09:32:07 crc kubenswrapper[4841]: I0313 09:32:07.486720 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pzf48"]
Mar 13 09:32:07 crc kubenswrapper[4841]: I0313 09:32:07.760243 4841 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556572-j79gt" Mar 13 09:32:07 crc kubenswrapper[4841]: I0313 09:32:07.891150 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz4sh\" (UniqueName: \"kubernetes.io/projected/b49b5f8c-aca3-4a6f-9edc-cb8485f40c9b-kube-api-access-pz4sh\") pod \"b49b5f8c-aca3-4a6f-9edc-cb8485f40c9b\" (UID: \"b49b5f8c-aca3-4a6f-9edc-cb8485f40c9b\") " Mar 13 09:32:07 crc kubenswrapper[4841]: I0313 09:32:07.896429 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b49b5f8c-aca3-4a6f-9edc-cb8485f40c9b-kube-api-access-pz4sh" (OuterVolumeSpecName: "kube-api-access-pz4sh") pod "b49b5f8c-aca3-4a6f-9edc-cb8485f40c9b" (UID: "b49b5f8c-aca3-4a6f-9edc-cb8485f40c9b"). InnerVolumeSpecName "kube-api-access-pz4sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:32:07 crc kubenswrapper[4841]: I0313 09:32:07.993389 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz4sh\" (UniqueName: \"kubernetes.io/projected/b49b5f8c-aca3-4a6f-9edc-cb8485f40c9b-kube-api-access-pz4sh\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:08 crc kubenswrapper[4841]: I0313 09:32:08.006802 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="489aeb87-1810-4eab-adbb-a0047e598344" path="/var/lib/kubelet/pods/489aeb87-1810-4eab-adbb-a0047e598344/volumes" Mar 13 09:32:08 crc kubenswrapper[4841]: I0313 09:32:08.449083 4841 generic.go:334] "Generic (PLEG): container finished" podID="9132ca8c-f2de-4025-8462-4899276a8678" containerID="a17e09c453292655b494b3d66f0eac640aa78ef8872bc81089c5f45e55af8957" exitCode=0 Mar 13 09:32:08 crc kubenswrapper[4841]: I0313 09:32:08.449182 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gg7dx" 
event={"ID":"9132ca8c-f2de-4025-8462-4899276a8678","Type":"ContainerDied","Data":"a17e09c453292655b494b3d66f0eac640aa78ef8872bc81089c5f45e55af8957"} Mar 13 09:32:08 crc kubenswrapper[4841]: I0313 09:32:08.454021 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556572-j79gt" Mar 13 09:32:08 crc kubenswrapper[4841]: I0313 09:32:08.454066 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556572-j79gt" event={"ID":"b49b5f8c-aca3-4a6f-9edc-cb8485f40c9b","Type":"ContainerDied","Data":"5041e715ccc59149a93bc1699071f4e6754425ac69347e57701c6362f0a7a120"} Mar 13 09:32:08 crc kubenswrapper[4841]: I0313 09:32:08.454087 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5041e715ccc59149a93bc1699071f4e6754425ac69347e57701c6362f0a7a120" Mar 13 09:32:08 crc kubenswrapper[4841]: I0313 09:32:08.822700 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556566-vw6lq"] Mar 13 09:32:08 crc kubenswrapper[4841]: I0313 09:32:08.829315 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556566-vw6lq"] Mar 13 09:32:10 crc kubenswrapper[4841]: I0313 09:32:10.006151 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ce7ba8a-b324-419e-92bb-f5a845a15025" path="/var/lib/kubelet/pods/0ce7ba8a-b324-419e-92bb-f5a845a15025/volumes" Mar 13 09:32:10 crc kubenswrapper[4841]: I0313 09:32:10.111702 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 09:32:10 crc kubenswrapper[4841]: I0313 09:32:10.122052 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 09:32:12 crc kubenswrapper[4841]: I0313 09:32:12.335577 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-gg7dx" Mar 13 09:32:12 crc kubenswrapper[4841]: I0313 09:32:12.377568 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9132ca8c-f2de-4025-8462-4899276a8678-combined-ca-bundle\") pod \"9132ca8c-f2de-4025-8462-4899276a8678\" (UID: \"9132ca8c-f2de-4025-8462-4899276a8678\") " Mar 13 09:32:12 crc kubenswrapper[4841]: I0313 09:32:12.378008 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9132ca8c-f2de-4025-8462-4899276a8678-db-sync-config-data\") pod \"9132ca8c-f2de-4025-8462-4899276a8678\" (UID: \"9132ca8c-f2de-4025-8462-4899276a8678\") " Mar 13 09:32:12 crc kubenswrapper[4841]: I0313 09:32:12.378512 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plcp4\" (UniqueName: \"kubernetes.io/projected/9132ca8c-f2de-4025-8462-4899276a8678-kube-api-access-plcp4\") pod \"9132ca8c-f2de-4025-8462-4899276a8678\" (UID: \"9132ca8c-f2de-4025-8462-4899276a8678\") " Mar 13 09:32:12 crc kubenswrapper[4841]: I0313 09:32:12.385764 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9132ca8c-f2de-4025-8462-4899276a8678-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9132ca8c-f2de-4025-8462-4899276a8678" (UID: "9132ca8c-f2de-4025-8462-4899276a8678"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:12 crc kubenswrapper[4841]: I0313 09:32:12.385830 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9132ca8c-f2de-4025-8462-4899276a8678-kube-api-access-plcp4" (OuterVolumeSpecName: "kube-api-access-plcp4") pod "9132ca8c-f2de-4025-8462-4899276a8678" (UID: "9132ca8c-f2de-4025-8462-4899276a8678"). 
InnerVolumeSpecName "kube-api-access-plcp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:32:12 crc kubenswrapper[4841]: I0313 09:32:12.419416 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9132ca8c-f2de-4025-8462-4899276a8678-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9132ca8c-f2de-4025-8462-4899276a8678" (UID: "9132ca8c-f2de-4025-8462-4899276a8678"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:12 crc kubenswrapper[4841]: I0313 09:32:12.482815 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plcp4\" (UniqueName: \"kubernetes.io/projected/9132ca8c-f2de-4025-8462-4899276a8678-kube-api-access-plcp4\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:12 crc kubenswrapper[4841]: I0313 09:32:12.483491 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9132ca8c-f2de-4025-8462-4899276a8678-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:12 crc kubenswrapper[4841]: I0313 09:32:12.483506 4841 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9132ca8c-f2de-4025-8462-4899276a8678-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:12 crc kubenswrapper[4841]: I0313 09:32:12.492342 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gg7dx" event={"ID":"9132ca8c-f2de-4025-8462-4899276a8678","Type":"ContainerDied","Data":"849b29aa05af1d8195aad8aa53c63df15de5a1814e486acab5f9d23939c040d0"} Mar 13 09:32:12 crc kubenswrapper[4841]: I0313 09:32:12.492379 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="849b29aa05af1d8195aad8aa53c63df15de5a1814e486acab5f9d23939c040d0" Mar 13 09:32:12 crc kubenswrapper[4841]: I0313 09:32:12.492378 4841 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gg7dx" Mar 13 09:32:13 crc kubenswrapper[4841]: E0313 09:32:13.068459 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="72841340-b4e1-4283-8eb0-10641fb61f62" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.510945 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-q6zfh" event={"ID":"2038e7ba-1de4-49b4-95dd-b2f3cde7be45","Type":"ContainerStarted","Data":"58fdcfcf849ffb97d7153cf927d93c32acae4959d41ed0b6e7563f6a919ce267"} Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.515852 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72841340-b4e1-4283-8eb0-10641fb61f62","Type":"ContainerStarted","Data":"eaacabceb3226d5435ee602e30f7679746b9e92b6cd645f9cd0bb7596f361ed7"} Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.516013 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.516018 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72841340-b4e1-4283-8eb0-10641fb61f62" containerName="proxy-httpd" containerID="cri-o://eaacabceb3226d5435ee602e30f7679746b9e92b6cd645f9cd0bb7596f361ed7" gracePeriod=30 Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.515996 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72841340-b4e1-4283-8eb0-10641fb61f62" containerName="ceilometer-notification-agent" containerID="cri-o://985af6398752095c1c7e6a1c79b4cc708bcc3b0865cdd0fa2b70a0fc724088d2" gracePeriod=30 Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.516092 4841 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72841340-b4e1-4283-8eb0-10641fb61f62" containerName="sg-core" containerID="cri-o://76d49f0bc8f91afba0952b95da61a6d529c69e7e2cf226ea47c6cfd3b2ff22fc" gracePeriod=30 Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.536452 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-q6zfh" podStartSLOduration=2.51875054 podStartE2EDuration="46.536431152s" podCreationTimestamp="2026-03-13 09:31:27 +0000 UTC" firstStartedPulling="2026-03-13 09:31:28.851601128 +0000 UTC m=+1171.581501319" lastFinishedPulling="2026-03-13 09:32:12.86928173 +0000 UTC m=+1215.599181931" observedRunningTime="2026-03-13 09:32:13.530051667 +0000 UTC m=+1216.259951858" watchObservedRunningTime="2026-03-13 09:32:13.536431152 +0000 UTC m=+1216.266331343" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.639871 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-d94989454-4npv2"] Mar 13 09:32:13 crc kubenswrapper[4841]: E0313 09:32:13.640243 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9132ca8c-f2de-4025-8462-4899276a8678" containerName="barbican-db-sync" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.640272 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9132ca8c-f2de-4025-8462-4899276a8678" containerName="barbican-db-sync" Mar 13 09:32:13 crc kubenswrapper[4841]: E0313 09:32:13.640287 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b49b5f8c-aca3-4a6f-9edc-cb8485f40c9b" containerName="oc" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.640293 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49b5f8c-aca3-4a6f-9edc-cb8485f40c9b" containerName="oc" Mar 13 09:32:13 crc kubenswrapper[4841]: E0313 09:32:13.640306 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="489aeb87-1810-4eab-adbb-a0047e598344" containerName="dnsmasq-dns" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.640312 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="489aeb87-1810-4eab-adbb-a0047e598344" containerName="dnsmasq-dns" Mar 13 09:32:13 crc kubenswrapper[4841]: E0313 09:32:13.640338 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="489aeb87-1810-4eab-adbb-a0047e598344" containerName="init" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.640344 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="489aeb87-1810-4eab-adbb-a0047e598344" containerName="init" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.640501 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="9132ca8c-f2de-4025-8462-4899276a8678" containerName="barbican-db-sync" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.640556 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b49b5f8c-aca3-4a6f-9edc-cb8485f40c9b" containerName="oc" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.640575 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="489aeb87-1810-4eab-adbb-a0047e598344" containerName="dnsmasq-dns" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.641478 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-d94989454-4npv2" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.648424 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.648601 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.648692 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-tg6zz" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.681392 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-d94989454-4npv2"] Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.694351 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-fdcb67bff-tvnvp"] Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.695998 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-fdcb67bff-tvnvp" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.711192 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.715280 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a624af3-f727-4d7e-8b59-6c45863bfcea-config-data-custom\") pod \"barbican-keystone-listener-d94989454-4npv2\" (UID: \"0a624af3-f727-4d7e-8b59-6c45863bfcea\") " pod="openstack/barbican-keystone-listener-d94989454-4npv2" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.715324 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kddqp\" (UniqueName: \"kubernetes.io/projected/0a624af3-f727-4d7e-8b59-6c45863bfcea-kube-api-access-kddqp\") pod \"barbican-keystone-listener-d94989454-4npv2\" (UID: \"0a624af3-f727-4d7e-8b59-6c45863bfcea\") " pod="openstack/barbican-keystone-listener-d94989454-4npv2" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.715360 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a624af3-f727-4d7e-8b59-6c45863bfcea-combined-ca-bundle\") pod \"barbican-keystone-listener-d94989454-4npv2\" (UID: \"0a624af3-f727-4d7e-8b59-6c45863bfcea\") " pod="openstack/barbican-keystone-listener-d94989454-4npv2" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.715384 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a624af3-f727-4d7e-8b59-6c45863bfcea-config-data\") pod \"barbican-keystone-listener-d94989454-4npv2\" (UID: \"0a624af3-f727-4d7e-8b59-6c45863bfcea\") " 
pod="openstack/barbican-keystone-listener-d94989454-4npv2" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.715445 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a624af3-f727-4d7e-8b59-6c45863bfcea-logs\") pod \"barbican-keystone-listener-d94989454-4npv2\" (UID: \"0a624af3-f727-4d7e-8b59-6c45863bfcea\") " pod="openstack/barbican-keystone-listener-d94989454-4npv2" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.728648 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-vwdxx"] Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.730143 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.803583 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-fdcb67bff-tvnvp"] Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.817790 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-vwdxx\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.817847 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-config\") pod \"dnsmasq-dns-85ff748b95-vwdxx\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.817884 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pdvd\" (UniqueName: 
\"kubernetes.io/projected/29b494b4-557b-4469-b18f-bef8d24e73b7-kube-api-access-8pdvd\") pod \"dnsmasq-dns-85ff748b95-vwdxx\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.817903 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd-config-data\") pod \"barbican-worker-fdcb67bff-tvnvp\" (UID: \"6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd\") " pod="openstack/barbican-worker-fdcb67bff-tvnvp" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.817928 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd-logs\") pod \"barbican-worker-fdcb67bff-tvnvp\" (UID: \"6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd\") " pod="openstack/barbican-worker-fdcb67bff-tvnvp" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.817950 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a624af3-f727-4d7e-8b59-6c45863bfcea-logs\") pod \"barbican-keystone-listener-d94989454-4npv2\" (UID: \"0a624af3-f727-4d7e-8b59-6c45863bfcea\") " pod="openstack/barbican-keystone-listener-d94989454-4npv2" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.817972 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd-config-data-custom\") pod \"barbican-worker-fdcb67bff-tvnvp\" (UID: \"6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd\") " pod="openstack/barbican-worker-fdcb67bff-tvnvp" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.818015 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-vwdxx\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.818043 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a624af3-f727-4d7e-8b59-6c45863bfcea-config-data-custom\") pod \"barbican-keystone-listener-d94989454-4npv2\" (UID: \"0a624af3-f727-4d7e-8b59-6c45863bfcea\") " pod="openstack/barbican-keystone-listener-d94989454-4npv2" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.818062 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-vwdxx\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.818086 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-dns-svc\") pod \"dnsmasq-dns-85ff748b95-vwdxx\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.818103 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqrp9\" (UniqueName: \"kubernetes.io/projected/6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd-kube-api-access-pqrp9\") pod \"barbican-worker-fdcb67bff-tvnvp\" (UID: \"6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd\") " pod="openstack/barbican-worker-fdcb67bff-tvnvp" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.818119 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kddqp\" (UniqueName: \"kubernetes.io/projected/0a624af3-f727-4d7e-8b59-6c45863bfcea-kube-api-access-kddqp\") pod \"barbican-keystone-listener-d94989454-4npv2\" (UID: \"0a624af3-f727-4d7e-8b59-6c45863bfcea\") " pod="openstack/barbican-keystone-listener-d94989454-4npv2" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.818143 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd-combined-ca-bundle\") pod \"barbican-worker-fdcb67bff-tvnvp\" (UID: \"6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd\") " pod="openstack/barbican-worker-fdcb67bff-tvnvp" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.818166 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a624af3-f727-4d7e-8b59-6c45863bfcea-combined-ca-bundle\") pod \"barbican-keystone-listener-d94989454-4npv2\" (UID: \"0a624af3-f727-4d7e-8b59-6c45863bfcea\") " pod="openstack/barbican-keystone-listener-d94989454-4npv2" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.818193 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a624af3-f727-4d7e-8b59-6c45863bfcea-config-data\") pod \"barbican-keystone-listener-d94989454-4npv2\" (UID: \"0a624af3-f727-4d7e-8b59-6c45863bfcea\") " pod="openstack/barbican-keystone-listener-d94989454-4npv2" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.819845 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a624af3-f727-4d7e-8b59-6c45863bfcea-logs\") pod \"barbican-keystone-listener-d94989454-4npv2\" (UID: \"0a624af3-f727-4d7e-8b59-6c45863bfcea\") " pod="openstack/barbican-keystone-listener-d94989454-4npv2" Mar 13 09:32:13 crc 
kubenswrapper[4841]: I0313 09:32:13.820166 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-vwdxx"] Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.825034 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a624af3-f727-4d7e-8b59-6c45863bfcea-config-data-custom\") pod \"barbican-keystone-listener-d94989454-4npv2\" (UID: \"0a624af3-f727-4d7e-8b59-6c45863bfcea\") " pod="openstack/barbican-keystone-listener-d94989454-4npv2" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.826623 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a624af3-f727-4d7e-8b59-6c45863bfcea-config-data\") pod \"barbican-keystone-listener-d94989454-4npv2\" (UID: \"0a624af3-f727-4d7e-8b59-6c45863bfcea\") " pod="openstack/barbican-keystone-listener-d94989454-4npv2" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.836662 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a624af3-f727-4d7e-8b59-6c45863bfcea-combined-ca-bundle\") pod \"barbican-keystone-listener-d94989454-4npv2\" (UID: \"0a624af3-f727-4d7e-8b59-6c45863bfcea\") " pod="openstack/barbican-keystone-listener-d94989454-4npv2" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.847615 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kddqp\" (UniqueName: \"kubernetes.io/projected/0a624af3-f727-4d7e-8b59-6c45863bfcea-kube-api-access-kddqp\") pod \"barbican-keystone-listener-d94989454-4npv2\" (UID: \"0a624af3-f727-4d7e-8b59-6c45863bfcea\") " pod="openstack/barbican-keystone-listener-d94989454-4npv2" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.855695 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6f69977cdb-7mdds"] Mar 13 09:32:13 crc kubenswrapper[4841]: 
I0313 09:32:13.857618 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.864456 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f69977cdb-7mdds"] Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.892312 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.919519 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd-config-data-custom\") pod \"barbican-worker-fdcb67bff-tvnvp\" (UID: \"6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd\") " pod="openstack/barbican-worker-fdcb67bff-tvnvp" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.919591 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba25d9f9-4136-49e7-9016-a77627808014-config-data-custom\") pod \"barbican-api-6f69977cdb-7mdds\" (UID: \"ba25d9f9-4136-49e7-9016-a77627808014\") " pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.919619 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-vwdxx\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.919648 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba25d9f9-4136-49e7-9016-a77627808014-logs\") pod \"barbican-api-6f69977cdb-7mdds\" (UID: 
\"ba25d9f9-4136-49e7-9016-a77627808014\") " pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.919665 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba25d9f9-4136-49e7-9016-a77627808014-config-data\") pod \"barbican-api-6f69977cdb-7mdds\" (UID: \"ba25d9f9-4136-49e7-9016-a77627808014\") " pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.919687 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-vwdxx\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.919711 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-dns-svc\") pod \"dnsmasq-dns-85ff748b95-vwdxx\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.919732 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqrp9\" (UniqueName: \"kubernetes.io/projected/6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd-kube-api-access-pqrp9\") pod \"barbican-worker-fdcb67bff-tvnvp\" (UID: \"6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd\") " pod="openstack/barbican-worker-fdcb67bff-tvnvp" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.919756 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9g84\" (UniqueName: \"kubernetes.io/projected/ba25d9f9-4136-49e7-9016-a77627808014-kube-api-access-m9g84\") pod \"barbican-api-6f69977cdb-7mdds\" (UID: 
\"ba25d9f9-4136-49e7-9016-a77627808014\") " pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.919773 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd-combined-ca-bundle\") pod \"barbican-worker-fdcb67bff-tvnvp\" (UID: \"6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd\") " pod="openstack/barbican-worker-fdcb67bff-tvnvp" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.919807 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba25d9f9-4136-49e7-9016-a77627808014-combined-ca-bundle\") pod \"barbican-api-6f69977cdb-7mdds\" (UID: \"ba25d9f9-4136-49e7-9016-a77627808014\") " pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.919857 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-vwdxx\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.919889 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-config\") pod \"dnsmasq-dns-85ff748b95-vwdxx\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.919921 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pdvd\" (UniqueName: \"kubernetes.io/projected/29b494b4-557b-4469-b18f-bef8d24e73b7-kube-api-access-8pdvd\") pod \"dnsmasq-dns-85ff748b95-vwdxx\" (UID: 
\"29b494b4-557b-4469-b18f-bef8d24e73b7\") " pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.919940 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd-config-data\") pod \"barbican-worker-fdcb67bff-tvnvp\" (UID: \"6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd\") " pod="openstack/barbican-worker-fdcb67bff-tvnvp" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.919962 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd-logs\") pod \"barbican-worker-fdcb67bff-tvnvp\" (UID: \"6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd\") " pod="openstack/barbican-worker-fdcb67bff-tvnvp" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.920317 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd-logs\") pod \"barbican-worker-fdcb67bff-tvnvp\" (UID: \"6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd\") " pod="openstack/barbican-worker-fdcb67bff-tvnvp" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.923585 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-vwdxx\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.923634 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-vwdxx\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 
09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.924414 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-vwdxx\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.925513 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-config\") pod \"dnsmasq-dns-85ff748b95-vwdxx\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.925788 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-dns-svc\") pod \"dnsmasq-dns-85ff748b95-vwdxx\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.931285 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd-combined-ca-bundle\") pod \"barbican-worker-fdcb67bff-tvnvp\" (UID: \"6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd\") " pod="openstack/barbican-worker-fdcb67bff-tvnvp" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.936940 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd-config-data\") pod \"barbican-worker-fdcb67bff-tvnvp\" (UID: \"6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd\") " pod="openstack/barbican-worker-fdcb67bff-tvnvp" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.946882 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd-config-data-custom\") pod \"barbican-worker-fdcb67bff-tvnvp\" (UID: \"6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd\") " pod="openstack/barbican-worker-fdcb67bff-tvnvp" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.950551 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqrp9\" (UniqueName: \"kubernetes.io/projected/6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd-kube-api-access-pqrp9\") pod \"barbican-worker-fdcb67bff-tvnvp\" (UID: \"6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd\") " pod="openstack/barbican-worker-fdcb67bff-tvnvp" Mar 13 09:32:13 crc kubenswrapper[4841]: I0313 09:32:13.954455 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pdvd\" (UniqueName: \"kubernetes.io/projected/29b494b4-557b-4469-b18f-bef8d24e73b7-kube-api-access-8pdvd\") pod \"dnsmasq-dns-85ff748b95-vwdxx\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:14 crc kubenswrapper[4841]: E0313 09:32:14.000521 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72841340_b4e1_4283_8eb0_10641fb61f62.slice/crio-eaacabceb3226d5435ee602e30f7679746b9e92b6cd645f9cd0bb7596f361ed7.scope\": RecentStats: unable to find data in memory cache]" Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.025051 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba25d9f9-4136-49e7-9016-a77627808014-config-data-custom\") pod \"barbican-api-6f69977cdb-7mdds\" (UID: \"ba25d9f9-4136-49e7-9016-a77627808014\") " pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.025165 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba25d9f9-4136-49e7-9016-a77627808014-logs\") pod \"barbican-api-6f69977cdb-7mdds\" (UID: \"ba25d9f9-4136-49e7-9016-a77627808014\") " pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.025185 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba25d9f9-4136-49e7-9016-a77627808014-config-data\") pod \"barbican-api-6f69977cdb-7mdds\" (UID: \"ba25d9f9-4136-49e7-9016-a77627808014\") " pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.025221 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9g84\" (UniqueName: \"kubernetes.io/projected/ba25d9f9-4136-49e7-9016-a77627808014-kube-api-access-m9g84\") pod \"barbican-api-6f69977cdb-7mdds\" (UID: \"ba25d9f9-4136-49e7-9016-a77627808014\") " pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.025254 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba25d9f9-4136-49e7-9016-a77627808014-combined-ca-bundle\") pod \"barbican-api-6f69977cdb-7mdds\" (UID: \"ba25d9f9-4136-49e7-9016-a77627808014\") " pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.026004 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba25d9f9-4136-49e7-9016-a77627808014-logs\") pod \"barbican-api-6f69977cdb-7mdds\" (UID: \"ba25d9f9-4136-49e7-9016-a77627808014\") " pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.030109 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ba25d9f9-4136-49e7-9016-a77627808014-combined-ca-bundle\") pod \"barbican-api-6f69977cdb-7mdds\" (UID: \"ba25d9f9-4136-49e7-9016-a77627808014\") " pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.033337 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba25d9f9-4136-49e7-9016-a77627808014-config-data-custom\") pod \"barbican-api-6f69977cdb-7mdds\" (UID: \"ba25d9f9-4136-49e7-9016-a77627808014\") " pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.034934 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba25d9f9-4136-49e7-9016-a77627808014-config-data\") pod \"barbican-api-6f69977cdb-7mdds\" (UID: \"ba25d9f9-4136-49e7-9016-a77627808014\") " pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.035475 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-d94989454-4npv2" Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.048054 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9g84\" (UniqueName: \"kubernetes.io/projected/ba25d9f9-4136-49e7-9016-a77627808014-kube-api-access-m9g84\") pod \"barbican-api-6f69977cdb-7mdds\" (UID: \"ba25d9f9-4136-49e7-9016-a77627808014\") " pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.048459 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-fdcb67bff-tvnvp" Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.072406 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.291647 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.529177 4841 generic.go:334] "Generic (PLEG): container finished" podID="72841340-b4e1-4283-8eb0-10641fb61f62" containerID="eaacabceb3226d5435ee602e30f7679746b9e92b6cd645f9cd0bb7596f361ed7" exitCode=0 Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.529569 4841 generic.go:334] "Generic (PLEG): container finished" podID="72841340-b4e1-4283-8eb0-10641fb61f62" containerID="76d49f0bc8f91afba0952b95da61a6d529c69e7e2cf226ea47c6cfd3b2ff22fc" exitCode=2 Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.529400 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72841340-b4e1-4283-8eb0-10641fb61f62","Type":"ContainerDied","Data":"eaacabceb3226d5435ee602e30f7679746b9e92b6cd645f9cd0bb7596f361ed7"} Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.529670 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72841340-b4e1-4283-8eb0-10641fb61f62","Type":"ContainerDied","Data":"76d49f0bc8f91afba0952b95da61a6d529c69e7e2cf226ea47c6cfd3b2ff22fc"} Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.531743 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9nnr7" event={"ID":"5d602e6f-77e5-4496-b426-2c003dad63e4","Type":"ContainerStarted","Data":"02e39b1af8c6d548176c72640c9cbce8658bd7f178e42c1632b24d2512c3ae3d"} Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.549902 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-9nnr7" podStartSLOduration=3.899461855 podStartE2EDuration="47.54988223s" podCreationTimestamp="2026-03-13 09:31:27 +0000 UTC" firstStartedPulling="2026-03-13 
09:31:29.218915856 +0000 UTC m=+1171.948816047" lastFinishedPulling="2026-03-13 09:32:12.869336231 +0000 UTC m=+1215.599236422" observedRunningTime="2026-03-13 09:32:14.548690454 +0000 UTC m=+1217.278590645" watchObservedRunningTime="2026-03-13 09:32:14.54988223 +0000 UTC m=+1217.279782421" Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.601933 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-d94989454-4npv2"] Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.652460 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-fdcb67bff-tvnvp"] Mar 13 09:32:14 crc kubenswrapper[4841]: W0313 09:32:14.762188 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29b494b4_557b_4469_b18f_bef8d24e73b7.slice/crio-1975cf69aa73660ac988bb26ace66723b9ae10c107c60125c16883024e372562 WatchSource:0}: Error finding container 1975cf69aa73660ac988bb26ace66723b9ae10c107c60125c16883024e372562: Status 404 returned error can't find the container with id 1975cf69aa73660ac988bb26ace66723b9ae10c107c60125c16883024e372562 Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.766163 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-vwdxx"] Mar 13 09:32:14 crc kubenswrapper[4841]: W0313 09:32:14.872019 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba25d9f9_4136_49e7_9016_a77627808014.slice/crio-54edfd00249a2d6d550c8cf3999058733f0c01a10af509b856b2573c319cefd7 WatchSource:0}: Error finding container 54edfd00249a2d6d550c8cf3999058733f0c01a10af509b856b2573c319cefd7: Status 404 returned error can't find the container with id 54edfd00249a2d6d550c8cf3999058733f0c01a10af509b856b2573c319cefd7 Mar 13 09:32:14 crc kubenswrapper[4841]: I0313 09:32:14.872722 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-api-6f69977cdb-7mdds"] Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.173952 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.250555 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-sg-core-conf-yaml\") pod \"72841340-b4e1-4283-8eb0-10641fb61f62\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.250824 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-scripts\") pod \"72841340-b4e1-4283-8eb0-10641fb61f62\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.250950 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72841340-b4e1-4283-8eb0-10641fb61f62-run-httpd\") pod \"72841340-b4e1-4283-8eb0-10641fb61f62\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.250997 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72841340-b4e1-4283-8eb0-10641fb61f62-log-httpd\") pod \"72841340-b4e1-4283-8eb0-10641fb61f62\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.251118 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvqt9\" (UniqueName: \"kubernetes.io/projected/72841340-b4e1-4283-8eb0-10641fb61f62-kube-api-access-vvqt9\") pod \"72841340-b4e1-4283-8eb0-10641fb61f62\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " Mar 13 09:32:15 crc 
kubenswrapper[4841]: I0313 09:32:15.251151 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-config-data\") pod \"72841340-b4e1-4283-8eb0-10641fb61f62\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.251173 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-combined-ca-bundle\") pod \"72841340-b4e1-4283-8eb0-10641fb61f62\" (UID: \"72841340-b4e1-4283-8eb0-10641fb61f62\") " Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.251630 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72841340-b4e1-4283-8eb0-10641fb61f62-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "72841340-b4e1-4283-8eb0-10641fb61f62" (UID: "72841340-b4e1-4283-8eb0-10641fb61f62"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.251662 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72841340-b4e1-4283-8eb0-10641fb61f62-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "72841340-b4e1-4283-8eb0-10641fb61f62" (UID: "72841340-b4e1-4283-8eb0-10641fb61f62"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.255967 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-scripts" (OuterVolumeSpecName: "scripts") pod "72841340-b4e1-4283-8eb0-10641fb61f62" (UID: "72841340-b4e1-4283-8eb0-10641fb61f62"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.256683 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72841340-b4e1-4283-8eb0-10641fb61f62-kube-api-access-vvqt9" (OuterVolumeSpecName: "kube-api-access-vvqt9") pod "72841340-b4e1-4283-8eb0-10641fb61f62" (UID: "72841340-b4e1-4283-8eb0-10641fb61f62"). InnerVolumeSpecName "kube-api-access-vvqt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.280477 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "72841340-b4e1-4283-8eb0-10641fb61f62" (UID: "72841340-b4e1-4283-8eb0-10641fb61f62"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.304289 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72841340-b4e1-4283-8eb0-10641fb61f62" (UID: "72841340-b4e1-4283-8eb0-10641fb61f62"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.332405 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-config-data" (OuterVolumeSpecName: "config-data") pod "72841340-b4e1-4283-8eb0-10641fb61f62" (UID: "72841340-b4e1-4283-8eb0-10641fb61f62"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.353346 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.353395 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72841340-b4e1-4283-8eb0-10641fb61f62-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.353404 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72841340-b4e1-4283-8eb0-10641fb61f62-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.353415 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvqt9\" (UniqueName: \"kubernetes.io/projected/72841340-b4e1-4283-8eb0-10641fb61f62-kube-api-access-vvqt9\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.353428 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.353437 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.353445 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72841340-b4e1-4283-8eb0-10641fb61f62-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.541968 4841 generic.go:334] "Generic 
(PLEG): container finished" podID="72841340-b4e1-4283-8eb0-10641fb61f62" containerID="985af6398752095c1c7e6a1c79b4cc708bcc3b0865cdd0fa2b70a0fc724088d2" exitCode=0 Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.542031 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.542046 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72841340-b4e1-4283-8eb0-10641fb61f62","Type":"ContainerDied","Data":"985af6398752095c1c7e6a1c79b4cc708bcc3b0865cdd0fa2b70a0fc724088d2"} Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.542094 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72841340-b4e1-4283-8eb0-10641fb61f62","Type":"ContainerDied","Data":"314e66748aa4a3184d585c74f92f37c74e4269b18adc3facf3453d706f8f1d9f"} Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.542119 4841 scope.go:117] "RemoveContainer" containerID="eaacabceb3226d5435ee602e30f7679746b9e92b6cd645f9cd0bb7596f361ed7" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.546475 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f69977cdb-7mdds" event={"ID":"ba25d9f9-4136-49e7-9016-a77627808014","Type":"ContainerStarted","Data":"4d5adf299cfa5d67f73de4fe6cbdc637423680eeefaa47d2efde069d260da422"} Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.546514 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f69977cdb-7mdds" event={"ID":"ba25d9f9-4136-49e7-9016-a77627808014","Type":"ContainerStarted","Data":"3063274d726b3acaccdf0b261116fbc7042e9b251eed4b00e63d2cf2ceb19f9f"} Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.546524 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f69977cdb-7mdds" 
event={"ID":"ba25d9f9-4136-49e7-9016-a77627808014","Type":"ContainerStarted","Data":"54edfd00249a2d6d550c8cf3999058733f0c01a10af509b856b2573c319cefd7"} Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.547558 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.547583 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.549582 4841 generic.go:334] "Generic (PLEG): container finished" podID="29b494b4-557b-4469-b18f-bef8d24e73b7" containerID="bb5867b2bc0b4568bf07e4b1600ffb787dd00385d4e1363265a65efaa7d27ed1" exitCode=0 Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.549624 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" event={"ID":"29b494b4-557b-4469-b18f-bef8d24e73b7","Type":"ContainerDied","Data":"bb5867b2bc0b4568bf07e4b1600ffb787dd00385d4e1363265a65efaa7d27ed1"} Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.549639 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" event={"ID":"29b494b4-557b-4469-b18f-bef8d24e73b7","Type":"ContainerStarted","Data":"1975cf69aa73660ac988bb26ace66723b9ae10c107c60125c16883024e372562"} Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.551731 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-d94989454-4npv2" event={"ID":"0a624af3-f727-4d7e-8b59-6c45863bfcea","Type":"ContainerStarted","Data":"943796769b48a2504c29da6ef5ed4027af3b16194b1ceb54f4b7600538890570"} Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.555459 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-fdcb67bff-tvnvp" 
event={"ID":"6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd","Type":"ContainerStarted","Data":"07299f76ea9fbb9c17588fc08f55619b16e7b6c1ee7be020f2579bf95bb19487"} Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.568199 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6f69977cdb-7mdds" podStartSLOduration=2.568181195 podStartE2EDuration="2.568181195s" podCreationTimestamp="2026-03-13 09:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:32:15.563389599 +0000 UTC m=+1218.293289790" watchObservedRunningTime="2026-03-13 09:32:15.568181195 +0000 UTC m=+1218.298081386" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.598115 4841 scope.go:117] "RemoveContainer" containerID="76d49f0bc8f91afba0952b95da61a6d529c69e7e2cf226ea47c6cfd3b2ff22fc" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.686112 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.699350 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.722322 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:32:15 crc kubenswrapper[4841]: E0313 09:32:15.722790 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72841340-b4e1-4283-8eb0-10641fb61f62" containerName="ceilometer-notification-agent" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.722814 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="72841340-b4e1-4283-8eb0-10641fb61f62" containerName="ceilometer-notification-agent" Mar 13 09:32:15 crc kubenswrapper[4841]: E0313 09:32:15.722827 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72841340-b4e1-4283-8eb0-10641fb61f62" containerName="sg-core" Mar 13 09:32:15 crc 
kubenswrapper[4841]: I0313 09:32:15.722835 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="72841340-b4e1-4283-8eb0-10641fb61f62" containerName="sg-core" Mar 13 09:32:15 crc kubenswrapper[4841]: E0313 09:32:15.722852 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72841340-b4e1-4283-8eb0-10641fb61f62" containerName="proxy-httpd" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.722861 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="72841340-b4e1-4283-8eb0-10641fb61f62" containerName="proxy-httpd" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.723028 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="72841340-b4e1-4283-8eb0-10641fb61f62" containerName="sg-core" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.723052 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="72841340-b4e1-4283-8eb0-10641fb61f62" containerName="ceilometer-notification-agent" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.723069 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="72841340-b4e1-4283-8eb0-10641fb61f62" containerName="proxy-httpd" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.725057 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.730551 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.732587 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.763595 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/212126dc-6eaf-498b-b5db-2f24ed74e6a0-log-httpd\") pod \"ceilometer-0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " pod="openstack/ceilometer-0" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.763728 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " pod="openstack/ceilometer-0" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.763758 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-scripts\") pod \"ceilometer-0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " pod="openstack/ceilometer-0" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.763783 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " pod="openstack/ceilometer-0" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.763819 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj9p9\" (UniqueName: \"kubernetes.io/projected/212126dc-6eaf-498b-b5db-2f24ed74e6a0-kube-api-access-cj9p9\") pod \"ceilometer-0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " pod="openstack/ceilometer-0" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.763940 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-config-data\") pod \"ceilometer-0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " pod="openstack/ceilometer-0" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.763972 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/212126dc-6eaf-498b-b5db-2f24ed74e6a0-run-httpd\") pod \"ceilometer-0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " pod="openstack/ceilometer-0" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.772580 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.865166 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-config-data\") pod \"ceilometer-0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " pod="openstack/ceilometer-0" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.865289 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/212126dc-6eaf-498b-b5db-2f24ed74e6a0-run-httpd\") pod \"ceilometer-0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " pod="openstack/ceilometer-0" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.865311 4841 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/212126dc-6eaf-498b-b5db-2f24ed74e6a0-log-httpd\") pod \"ceilometer-0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " pod="openstack/ceilometer-0" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.865987 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/212126dc-6eaf-498b-b5db-2f24ed74e6a0-log-httpd\") pod \"ceilometer-0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " pod="openstack/ceilometer-0" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.866408 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/212126dc-6eaf-498b-b5db-2f24ed74e6a0-run-httpd\") pod \"ceilometer-0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " pod="openstack/ceilometer-0" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.866494 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " pod="openstack/ceilometer-0" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.866521 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-scripts\") pod \"ceilometer-0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " pod="openstack/ceilometer-0" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.866542 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " pod="openstack/ceilometer-0" Mar 13 09:32:15 crc 
kubenswrapper[4841]: I0313 09:32:15.866571 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj9p9\" (UniqueName: \"kubernetes.io/projected/212126dc-6eaf-498b-b5db-2f24ed74e6a0-kube-api-access-cj9p9\") pod \"ceilometer-0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " pod="openstack/ceilometer-0" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.870337 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-scripts\") pod \"ceilometer-0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " pod="openstack/ceilometer-0" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.871177 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-config-data\") pod \"ceilometer-0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " pod="openstack/ceilometer-0" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.872255 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " pod="openstack/ceilometer-0" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.872500 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " pod="openstack/ceilometer-0" Mar 13 09:32:15 crc kubenswrapper[4841]: I0313 09:32:15.885795 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj9p9\" (UniqueName: \"kubernetes.io/projected/212126dc-6eaf-498b-b5db-2f24ed74e6a0-kube-api-access-cj9p9\") pod 
\"ceilometer-0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " pod="openstack/ceilometer-0" Mar 13 09:32:16 crc kubenswrapper[4841]: I0313 09:32:16.010370 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72841340-b4e1-4283-8eb0-10641fb61f62" path="/var/lib/kubelet/pods/72841340-b4e1-4283-8eb0-10641fb61f62/volumes" Mar 13 09:32:16 crc kubenswrapper[4841]: I0313 09:32:16.051938 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:32:16 crc kubenswrapper[4841]: I0313 09:32:16.504686 4841 scope.go:117] "RemoveContainer" containerID="985af6398752095c1c7e6a1c79b4cc708bcc3b0865cdd0fa2b70a0fc724088d2" Mar 13 09:32:16 crc kubenswrapper[4841]: I0313 09:32:16.609043 4841 scope.go:117] "RemoveContainer" containerID="eaacabceb3226d5435ee602e30f7679746b9e92b6cd645f9cd0bb7596f361ed7" Mar 13 09:32:16 crc kubenswrapper[4841]: E0313 09:32:16.609983 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaacabceb3226d5435ee602e30f7679746b9e92b6cd645f9cd0bb7596f361ed7\": container with ID starting with eaacabceb3226d5435ee602e30f7679746b9e92b6cd645f9cd0bb7596f361ed7 not found: ID does not exist" containerID="eaacabceb3226d5435ee602e30f7679746b9e92b6cd645f9cd0bb7596f361ed7" Mar 13 09:32:16 crc kubenswrapper[4841]: I0313 09:32:16.610037 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaacabceb3226d5435ee602e30f7679746b9e92b6cd645f9cd0bb7596f361ed7"} err="failed to get container status \"eaacabceb3226d5435ee602e30f7679746b9e92b6cd645f9cd0bb7596f361ed7\": rpc error: code = NotFound desc = could not find container \"eaacabceb3226d5435ee602e30f7679746b9e92b6cd645f9cd0bb7596f361ed7\": container with ID starting with eaacabceb3226d5435ee602e30f7679746b9e92b6cd645f9cd0bb7596f361ed7 not found: ID does not exist" Mar 13 09:32:16 crc kubenswrapper[4841]: I0313 09:32:16.610096 4841 
scope.go:117] "RemoveContainer" containerID="76d49f0bc8f91afba0952b95da61a6d529c69e7e2cf226ea47c6cfd3b2ff22fc" Mar 13 09:32:16 crc kubenswrapper[4841]: E0313 09:32:16.611850 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76d49f0bc8f91afba0952b95da61a6d529c69e7e2cf226ea47c6cfd3b2ff22fc\": container with ID starting with 76d49f0bc8f91afba0952b95da61a6d529c69e7e2cf226ea47c6cfd3b2ff22fc not found: ID does not exist" containerID="76d49f0bc8f91afba0952b95da61a6d529c69e7e2cf226ea47c6cfd3b2ff22fc" Mar 13 09:32:16 crc kubenswrapper[4841]: I0313 09:32:16.611886 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76d49f0bc8f91afba0952b95da61a6d529c69e7e2cf226ea47c6cfd3b2ff22fc"} err="failed to get container status \"76d49f0bc8f91afba0952b95da61a6d529c69e7e2cf226ea47c6cfd3b2ff22fc\": rpc error: code = NotFound desc = could not find container \"76d49f0bc8f91afba0952b95da61a6d529c69e7e2cf226ea47c6cfd3b2ff22fc\": container with ID starting with 76d49f0bc8f91afba0952b95da61a6d529c69e7e2cf226ea47c6cfd3b2ff22fc not found: ID does not exist" Mar 13 09:32:16 crc kubenswrapper[4841]: I0313 09:32:16.611933 4841 scope.go:117] "RemoveContainer" containerID="985af6398752095c1c7e6a1c79b4cc708bcc3b0865cdd0fa2b70a0fc724088d2" Mar 13 09:32:16 crc kubenswrapper[4841]: E0313 09:32:16.612416 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"985af6398752095c1c7e6a1c79b4cc708bcc3b0865cdd0fa2b70a0fc724088d2\": container with ID starting with 985af6398752095c1c7e6a1c79b4cc708bcc3b0865cdd0fa2b70a0fc724088d2 not found: ID does not exist" containerID="985af6398752095c1c7e6a1c79b4cc708bcc3b0865cdd0fa2b70a0fc724088d2" Mar 13 09:32:16 crc kubenswrapper[4841]: I0313 09:32:16.612457 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"985af6398752095c1c7e6a1c79b4cc708bcc3b0865cdd0fa2b70a0fc724088d2"} err="failed to get container status \"985af6398752095c1c7e6a1c79b4cc708bcc3b0865cdd0fa2b70a0fc724088d2\": rpc error: code = NotFound desc = could not find container \"985af6398752095c1c7e6a1c79b4cc708bcc3b0865cdd0fa2b70a0fc724088d2\": container with ID starting with 985af6398752095c1c7e6a1c79b4cc708bcc3b0865cdd0fa2b70a0fc724088d2 not found: ID does not exist" Mar 13 09:32:16 crc kubenswrapper[4841]: I0313 09:32:16.872523 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5dbfbd46f8-tjjrf"] Mar 13 09:32:16 crc kubenswrapper[4841]: I0313 09:32:16.876073 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:16 crc kubenswrapper[4841]: I0313 09:32:16.883036 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5dbfbd46f8-tjjrf"] Mar 13 09:32:16 crc kubenswrapper[4841]: I0313 09:32:16.884167 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 13 09:32:16 crc kubenswrapper[4841]: I0313 09:32:16.885842 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 13 09:32:16 crc kubenswrapper[4841]: I0313 09:32:16.994232 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/728289d9-1ed1-449a-99e7-85da0a025366-logs\") pod \"barbican-api-5dbfbd46f8-tjjrf\" (UID: \"728289d9-1ed1-449a-99e7-85da0a025366\") " pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:16 crc kubenswrapper[4841]: I0313 09:32:16.994476 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/728289d9-1ed1-449a-99e7-85da0a025366-public-tls-certs\") pod 
\"barbican-api-5dbfbd46f8-tjjrf\" (UID: \"728289d9-1ed1-449a-99e7-85da0a025366\") " pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:16 crc kubenswrapper[4841]: I0313 09:32:16.994521 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728289d9-1ed1-449a-99e7-85da0a025366-config-data\") pod \"barbican-api-5dbfbd46f8-tjjrf\" (UID: \"728289d9-1ed1-449a-99e7-85da0a025366\") " pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:16 crc kubenswrapper[4841]: I0313 09:32:16.994568 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728289d9-1ed1-449a-99e7-85da0a025366-combined-ca-bundle\") pod \"barbican-api-5dbfbd46f8-tjjrf\" (UID: \"728289d9-1ed1-449a-99e7-85da0a025366\") " pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:16 crc kubenswrapper[4841]: I0313 09:32:16.994852 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/728289d9-1ed1-449a-99e7-85da0a025366-internal-tls-certs\") pod \"barbican-api-5dbfbd46f8-tjjrf\" (UID: \"728289d9-1ed1-449a-99e7-85da0a025366\") " pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:16 crc kubenswrapper[4841]: I0313 09:32:16.995802 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/728289d9-1ed1-449a-99e7-85da0a025366-config-data-custom\") pod \"barbican-api-5dbfbd46f8-tjjrf\" (UID: \"728289d9-1ed1-449a-99e7-85da0a025366\") " pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:16 crc kubenswrapper[4841]: I0313 09:32:16.996195 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kws7g\" (UniqueName: 
\"kubernetes.io/projected/728289d9-1ed1-449a-99e7-85da0a025366-kube-api-access-kws7g\") pod \"barbican-api-5dbfbd46f8-tjjrf\" (UID: \"728289d9-1ed1-449a-99e7-85da0a025366\") " pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.097749 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728289d9-1ed1-449a-99e7-85da0a025366-config-data\") pod \"barbican-api-5dbfbd46f8-tjjrf\" (UID: \"728289d9-1ed1-449a-99e7-85da0a025366\") " pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.097876 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728289d9-1ed1-449a-99e7-85da0a025366-combined-ca-bundle\") pod \"barbican-api-5dbfbd46f8-tjjrf\" (UID: \"728289d9-1ed1-449a-99e7-85da0a025366\") " pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.097909 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/728289d9-1ed1-449a-99e7-85da0a025366-internal-tls-certs\") pod \"barbican-api-5dbfbd46f8-tjjrf\" (UID: \"728289d9-1ed1-449a-99e7-85da0a025366\") " pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.097944 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/728289d9-1ed1-449a-99e7-85da0a025366-config-data-custom\") pod \"barbican-api-5dbfbd46f8-tjjrf\" (UID: \"728289d9-1ed1-449a-99e7-85da0a025366\") " pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.098029 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kws7g\" (UniqueName: 
\"kubernetes.io/projected/728289d9-1ed1-449a-99e7-85da0a025366-kube-api-access-kws7g\") pod \"barbican-api-5dbfbd46f8-tjjrf\" (UID: \"728289d9-1ed1-449a-99e7-85da0a025366\") " pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.098078 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/728289d9-1ed1-449a-99e7-85da0a025366-logs\") pod \"barbican-api-5dbfbd46f8-tjjrf\" (UID: \"728289d9-1ed1-449a-99e7-85da0a025366\") " pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.098192 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/728289d9-1ed1-449a-99e7-85da0a025366-public-tls-certs\") pod \"barbican-api-5dbfbd46f8-tjjrf\" (UID: \"728289d9-1ed1-449a-99e7-85da0a025366\") " pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.099783 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/728289d9-1ed1-449a-99e7-85da0a025366-logs\") pod \"barbican-api-5dbfbd46f8-tjjrf\" (UID: \"728289d9-1ed1-449a-99e7-85da0a025366\") " pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.102196 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/728289d9-1ed1-449a-99e7-85da0a025366-config-data-custom\") pod \"barbican-api-5dbfbd46f8-tjjrf\" (UID: \"728289d9-1ed1-449a-99e7-85da0a025366\") " pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.102625 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/728289d9-1ed1-449a-99e7-85da0a025366-public-tls-certs\") pod 
\"barbican-api-5dbfbd46f8-tjjrf\" (UID: \"728289d9-1ed1-449a-99e7-85da0a025366\") " pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.122974 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728289d9-1ed1-449a-99e7-85da0a025366-combined-ca-bundle\") pod \"barbican-api-5dbfbd46f8-tjjrf\" (UID: \"728289d9-1ed1-449a-99e7-85da0a025366\") " pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.126039 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/728289d9-1ed1-449a-99e7-85da0a025366-internal-tls-certs\") pod \"barbican-api-5dbfbd46f8-tjjrf\" (UID: \"728289d9-1ed1-449a-99e7-85da0a025366\") " pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.131434 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728289d9-1ed1-449a-99e7-85da0a025366-config-data\") pod \"barbican-api-5dbfbd46f8-tjjrf\" (UID: \"728289d9-1ed1-449a-99e7-85da0a025366\") " pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.134210 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:32:17 crc kubenswrapper[4841]: W0313 09:32:17.135090 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod212126dc_6eaf_498b_b5db_2f24ed74e6a0.slice/crio-97804d476c73068cedf59c0658a157b87f3cac66a66b2a66707de4f82f027b92 WatchSource:0}: Error finding container 97804d476c73068cedf59c0658a157b87f3cac66a66b2a66707de4f82f027b92: Status 404 returned error can't find the container with id 97804d476c73068cedf59c0658a157b87f3cac66a66b2a66707de4f82f027b92 Mar 13 09:32:17 crc 
kubenswrapper[4841]: I0313 09:32:17.135511 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kws7g\" (UniqueName: \"kubernetes.io/projected/728289d9-1ed1-449a-99e7-85da0a025366-kube-api-access-kws7g\") pod \"barbican-api-5dbfbd46f8-tjjrf\" (UID: \"728289d9-1ed1-449a-99e7-85da0a025366\") " pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.194379 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.581472 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"212126dc-6eaf-498b-b5db-2f24ed74e6a0","Type":"ContainerStarted","Data":"97804d476c73068cedf59c0658a157b87f3cac66a66b2a66707de4f82f027b92"} Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.583242 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-d94989454-4npv2" event={"ID":"0a624af3-f727-4d7e-8b59-6c45863bfcea","Type":"ContainerStarted","Data":"da620b233fcb7c8280f05cb9136c8e45f4ff33661d985232a2bd492472cc7657"} Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.583320 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-d94989454-4npv2" event={"ID":"0a624af3-f727-4d7e-8b59-6c45863bfcea","Type":"ContainerStarted","Data":"4685ead9fd3a15f0a0e2166215715d4ff7a016499a578a9085a64b3565ced577"} Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.585088 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-fdcb67bff-tvnvp" event={"ID":"6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd","Type":"ContainerStarted","Data":"6d035703e30944ad655f5f4d0132c3919653e1b9d0403cc4840eac1f64310de0"} Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.585138 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-fdcb67bff-tvnvp" event={"ID":"6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd","Type":"ContainerStarted","Data":"dcb38c5325a9d4569660ed7dfac21121cf25e3bc590631da36e9556dbaf90aff"} Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.589488 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" event={"ID":"29b494b4-557b-4469-b18f-bef8d24e73b7","Type":"ContainerStarted","Data":"daf3d003cbea5e5f4bbf0deb461882e6ffca7a97a200eda30b0aa2fa7486a580"} Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.606590 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-d94989454-4npv2" podStartSLOduration=2.606028992 podStartE2EDuration="4.606569091s" podCreationTimestamp="2026-03-13 09:32:13 +0000 UTC" firstStartedPulling="2026-03-13 09:32:14.64278728 +0000 UTC m=+1217.372687461" lastFinishedPulling="2026-03-13 09:32:16.643327329 +0000 UTC m=+1219.373227560" observedRunningTime="2026-03-13 09:32:17.604217279 +0000 UTC m=+1220.334117480" watchObservedRunningTime="2026-03-13 09:32:17.606569091 +0000 UTC m=+1220.336469302" Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.632338 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" podStartSLOduration=4.632314768 podStartE2EDuration="4.632314768s" podCreationTimestamp="2026-03-13 09:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:32:17.628218704 +0000 UTC m=+1220.358118905" watchObservedRunningTime="2026-03-13 09:32:17.632314768 +0000 UTC m=+1220.362214979" Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.667013 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-fdcb67bff-tvnvp" podStartSLOduration=2.667231183 podStartE2EDuration="4.666987008s" 
podCreationTimestamp="2026-03-13 09:32:13 +0000 UTC" firstStartedPulling="2026-03-13 09:32:14.642340866 +0000 UTC m=+1217.372241057" lastFinishedPulling="2026-03-13 09:32:16.642096681 +0000 UTC m=+1219.371996882" observedRunningTime="2026-03-13 09:32:17.649860325 +0000 UTC m=+1220.379760506" watchObservedRunningTime="2026-03-13 09:32:17.666987008 +0000 UTC m=+1220.396887199" Mar 13 09:32:17 crc kubenswrapper[4841]: I0313 09:32:17.705793 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5dbfbd46f8-tjjrf"] Mar 13 09:32:18 crc kubenswrapper[4841]: I0313 09:32:18.601796 4841 generic.go:334] "Generic (PLEG): container finished" podID="2038e7ba-1de4-49b4-95dd-b2f3cde7be45" containerID="58fdcfcf849ffb97d7153cf927d93c32acae4959d41ed0b6e7563f6a919ce267" exitCode=0 Mar 13 09:32:18 crc kubenswrapper[4841]: I0313 09:32:18.601863 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-q6zfh" event={"ID":"2038e7ba-1de4-49b4-95dd-b2f3cde7be45","Type":"ContainerDied","Data":"58fdcfcf849ffb97d7153cf927d93c32acae4959d41ed0b6e7563f6a919ce267"} Mar 13 09:32:18 crc kubenswrapper[4841]: I0313 09:32:18.608007 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"212126dc-6eaf-498b-b5db-2f24ed74e6a0","Type":"ContainerStarted","Data":"4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3"} Mar 13 09:32:18 crc kubenswrapper[4841]: I0313 09:32:18.612834 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dbfbd46f8-tjjrf" event={"ID":"728289d9-1ed1-449a-99e7-85da0a025366","Type":"ContainerStarted","Data":"6f133c87bf390b1875ad933aa2ad85f08dd0d0bdff33bc14a3d5a8109dff06df"} Mar 13 09:32:18 crc kubenswrapper[4841]: I0313 09:32:18.612873 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dbfbd46f8-tjjrf" 
event={"ID":"728289d9-1ed1-449a-99e7-85da0a025366","Type":"ContainerStarted","Data":"bfb6d3424552a74b7ed2b973f5696be6bbff6bce94ce8f551e6ca2e7dd67f55f"} Mar 13 09:32:18 crc kubenswrapper[4841]: I0313 09:32:18.612892 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dbfbd46f8-tjjrf" event={"ID":"728289d9-1ed1-449a-99e7-85da0a025366","Type":"ContainerStarted","Data":"5c9b7f0a4f0426a5e3288fcc30478f49994a1d8992ebb306d02c1430bcf8852c"} Mar 13 09:32:18 crc kubenswrapper[4841]: I0313 09:32:18.613737 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:18 crc kubenswrapper[4841]: I0313 09:32:18.663670 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5dbfbd46f8-tjjrf" podStartSLOduration=2.663651833 podStartE2EDuration="2.663651833s" podCreationTimestamp="2026-03-13 09:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:32:18.645009152 +0000 UTC m=+1221.374909363" watchObservedRunningTime="2026-03-13 09:32:18.663651833 +0000 UTC m=+1221.393552024" Mar 13 09:32:19 crc kubenswrapper[4841]: I0313 09:32:19.621859 4841 generic.go:334] "Generic (PLEG): container finished" podID="5d602e6f-77e5-4496-b426-2c003dad63e4" containerID="02e39b1af8c6d548176c72640c9cbce8658bd7f178e42c1632b24d2512c3ae3d" exitCode=0 Mar 13 09:32:19 crc kubenswrapper[4841]: I0313 09:32:19.621948 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9nnr7" event={"ID":"5d602e6f-77e5-4496-b426-2c003dad63e4","Type":"ContainerDied","Data":"02e39b1af8c6d548176c72640c9cbce8658bd7f178e42c1632b24d2512c3ae3d"} Mar 13 09:32:19 crc kubenswrapper[4841]: I0313 09:32:19.625679 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"212126dc-6eaf-498b-b5db-2f24ed74e6a0","Type":"ContainerStarted","Data":"9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e"} Mar 13 09:32:19 crc kubenswrapper[4841]: I0313 09:32:19.626184 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:19 crc kubenswrapper[4841]: I0313 09:32:19.626249 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"212126dc-6eaf-498b-b5db-2f24ed74e6a0","Type":"ContainerStarted","Data":"11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7"} Mar 13 09:32:19 crc kubenswrapper[4841]: I0313 09:32:19.626332 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:20 crc kubenswrapper[4841]: I0313 09:32:20.050743 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-q6zfh" Mar 13 09:32:20 crc kubenswrapper[4841]: I0313 09:32:20.171995 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2038e7ba-1de4-49b4-95dd-b2f3cde7be45-config-data\") pod \"2038e7ba-1de4-49b4-95dd-b2f3cde7be45\" (UID: \"2038e7ba-1de4-49b4-95dd-b2f3cde7be45\") " Mar 13 09:32:20 crc kubenswrapper[4841]: I0313 09:32:20.172068 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2038e7ba-1de4-49b4-95dd-b2f3cde7be45-combined-ca-bundle\") pod \"2038e7ba-1de4-49b4-95dd-b2f3cde7be45\" (UID: \"2038e7ba-1de4-49b4-95dd-b2f3cde7be45\") " Mar 13 09:32:20 crc kubenswrapper[4841]: I0313 09:32:20.172204 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gdks\" (UniqueName: \"kubernetes.io/projected/2038e7ba-1de4-49b4-95dd-b2f3cde7be45-kube-api-access-6gdks\") pod 
\"2038e7ba-1de4-49b4-95dd-b2f3cde7be45\" (UID: \"2038e7ba-1de4-49b4-95dd-b2f3cde7be45\") " Mar 13 09:32:20 crc kubenswrapper[4841]: I0313 09:32:20.176945 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2038e7ba-1de4-49b4-95dd-b2f3cde7be45-kube-api-access-6gdks" (OuterVolumeSpecName: "kube-api-access-6gdks") pod "2038e7ba-1de4-49b4-95dd-b2f3cde7be45" (UID: "2038e7ba-1de4-49b4-95dd-b2f3cde7be45"). InnerVolumeSpecName "kube-api-access-6gdks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:32:20 crc kubenswrapper[4841]: I0313 09:32:20.195250 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2038e7ba-1de4-49b4-95dd-b2f3cde7be45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2038e7ba-1de4-49b4-95dd-b2f3cde7be45" (UID: "2038e7ba-1de4-49b4-95dd-b2f3cde7be45"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:20 crc kubenswrapper[4841]: I0313 09:32:20.265545 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2038e7ba-1de4-49b4-95dd-b2f3cde7be45-config-data" (OuterVolumeSpecName: "config-data") pod "2038e7ba-1de4-49b4-95dd-b2f3cde7be45" (UID: "2038e7ba-1de4-49b4-95dd-b2f3cde7be45"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:20 crc kubenswrapper[4841]: I0313 09:32:20.274737 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2038e7ba-1de4-49b4-95dd-b2f3cde7be45-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:20 crc kubenswrapper[4841]: I0313 09:32:20.274778 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2038e7ba-1de4-49b4-95dd-b2f3cde7be45-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:20 crc kubenswrapper[4841]: I0313 09:32:20.274792 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gdks\" (UniqueName: \"kubernetes.io/projected/2038e7ba-1de4-49b4-95dd-b2f3cde7be45-kube-api-access-6gdks\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:20 crc kubenswrapper[4841]: I0313 09:32:20.655449 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-q6zfh" event={"ID":"2038e7ba-1de4-49b4-95dd-b2f3cde7be45","Type":"ContainerDied","Data":"91bc4d009ae0adb93dc9095a1922dc6b2e0981ac31351d5e1d236cf9279cb77e"} Mar 13 09:32:20 crc kubenswrapper[4841]: I0313 09:32:20.655493 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91bc4d009ae0adb93dc9095a1922dc6b2e0981ac31351d5e1d236cf9279cb77e" Mar 13 09:32:20 crc kubenswrapper[4841]: I0313 09:32:20.655824 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-q6zfh" Mar 13 09:32:20 crc kubenswrapper[4841]: I0313 09:32:20.981720 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-9nnr7" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.097917 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d602e6f-77e5-4496-b426-2c003dad63e4-etc-machine-id\") pod \"5d602e6f-77e5-4496-b426-2c003dad63e4\" (UID: \"5d602e6f-77e5-4496-b426-2c003dad63e4\") " Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.098018 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-combined-ca-bundle\") pod \"5d602e6f-77e5-4496-b426-2c003dad63e4\" (UID: \"5d602e6f-77e5-4496-b426-2c003dad63e4\") " Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.098103 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-db-sync-config-data\") pod \"5d602e6f-77e5-4496-b426-2c003dad63e4\" (UID: \"5d602e6f-77e5-4496-b426-2c003dad63e4\") " Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.098154 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhm7f\" (UniqueName: \"kubernetes.io/projected/5d602e6f-77e5-4496-b426-2c003dad63e4-kube-api-access-qhm7f\") pod \"5d602e6f-77e5-4496-b426-2c003dad63e4\" (UID: \"5d602e6f-77e5-4496-b426-2c003dad63e4\") " Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.098210 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-config-data\") pod \"5d602e6f-77e5-4496-b426-2c003dad63e4\" (UID: \"5d602e6f-77e5-4496-b426-2c003dad63e4\") " Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.098256 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-scripts\") pod \"5d602e6f-77e5-4496-b426-2c003dad63e4\" (UID: \"5d602e6f-77e5-4496-b426-2c003dad63e4\") " Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.103306 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d602e6f-77e5-4496-b426-2c003dad63e4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5d602e6f-77e5-4496-b426-2c003dad63e4" (UID: "5d602e6f-77e5-4496-b426-2c003dad63e4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.103690 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-scripts" (OuterVolumeSpecName: "scripts") pod "5d602e6f-77e5-4496-b426-2c003dad63e4" (UID: "5d602e6f-77e5-4496-b426-2c003dad63e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.105585 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d602e6f-77e5-4496-b426-2c003dad63e4-kube-api-access-qhm7f" (OuterVolumeSpecName: "kube-api-access-qhm7f") pod "5d602e6f-77e5-4496-b426-2c003dad63e4" (UID: "5d602e6f-77e5-4496-b426-2c003dad63e4"). InnerVolumeSpecName "kube-api-access-qhm7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.106481 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5d602e6f-77e5-4496-b426-2c003dad63e4" (UID: "5d602e6f-77e5-4496-b426-2c003dad63e4"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.138353 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d602e6f-77e5-4496-b426-2c003dad63e4" (UID: "5d602e6f-77e5-4496-b426-2c003dad63e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.156299 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-config-data" (OuterVolumeSpecName: "config-data") pod "5d602e6f-77e5-4496-b426-2c003dad63e4" (UID: "5d602e6f-77e5-4496-b426-2c003dad63e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.200442 4841 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d602e6f-77e5-4496-b426-2c003dad63e4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.200754 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.200875 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhm7f\" (UniqueName: \"kubernetes.io/projected/5d602e6f-77e5-4496-b426-2c003dad63e4-kube-api-access-qhm7f\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.201024 4841 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-db-sync-config-data\") on 
node \"crc\" DevicePath \"\"" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.201139 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.201246 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d602e6f-77e5-4496-b426-2c003dad63e4-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.667545 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9nnr7" event={"ID":"5d602e6f-77e5-4496-b426-2c003dad63e4","Type":"ContainerDied","Data":"067cc3a2a3c41d4b61f67df2132488adc85c1d0a138838543c38e05474cf96c0"} Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.667871 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="067cc3a2a3c41d4b61f67df2132488adc85c1d0a138838543c38e05474cf96c0" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.667897 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-9nnr7" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.670473 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"212126dc-6eaf-498b-b5db-2f24ed74e6a0","Type":"ContainerStarted","Data":"4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d"} Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.670632 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.958041 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.5490961519999997 podStartE2EDuration="6.95802208s" podCreationTimestamp="2026-03-13 09:32:15 +0000 UTC" firstStartedPulling="2026-03-13 09:32:17.13736943 +0000 UTC m=+1219.867269621" lastFinishedPulling="2026-03-13 09:32:20.546295328 +0000 UTC m=+1223.276195549" observedRunningTime="2026-03-13 09:32:21.722174121 +0000 UTC m=+1224.452074312" watchObservedRunningTime="2026-03-13 09:32:21.95802208 +0000 UTC m=+1224.687922271" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.963700 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-vwdxx"] Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.963941 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" podUID="29b494b4-557b-4469-b18f-bef8d24e73b7" containerName="dnsmasq-dns" containerID="cri-o://daf3d003cbea5e5f4bbf0deb461882e6ffca7a97a200eda30b0aa2fa7486a580" gracePeriod=10 Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.965460 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.976191 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 
09:32:21 crc kubenswrapper[4841]: E0313 09:32:21.976562 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2038e7ba-1de4-49b4-95dd-b2f3cde7be45" containerName="heat-db-sync" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.976577 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="2038e7ba-1de4-49b4-95dd-b2f3cde7be45" containerName="heat-db-sync" Mar 13 09:32:21 crc kubenswrapper[4841]: E0313 09:32:21.976606 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d602e6f-77e5-4496-b426-2c003dad63e4" containerName="cinder-db-sync" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.976611 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d602e6f-77e5-4496-b426-2c003dad63e4" containerName="cinder-db-sync" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.976776 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d602e6f-77e5-4496-b426-2c003dad63e4" containerName="cinder-db-sync" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.976800 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="2038e7ba-1de4-49b4-95dd-b2f3cde7be45" containerName="heat-db-sync" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.977830 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.987126 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.987334 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.987508 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 13 09:32:21 crc kubenswrapper[4841]: I0313 09:32:21.987620 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-52rnh" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.016446 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.017164 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-scripts\") pod \"cinder-scheduler-0\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.017208 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkw5l\" (UniqueName: \"kubernetes.io/projected/268afc1a-8785-48f7-9299-4d47d14f6ad2-kube-api-access-vkw5l\") pod \"cinder-scheduler-0\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.017232 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/268afc1a-8785-48f7-9299-4d47d14f6ad2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.017322 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-config-data\") pod \"cinder-scheduler-0\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.017342 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.017404 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.032909 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-trm7n"] Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.034459 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.074904 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-trm7n"] Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.116353 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.119418 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.120579 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-trm7n\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.120635 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-trm7n\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.120685 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-scripts\") pod \"cinder-scheduler-0\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.120709 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkw5l\" (UniqueName: \"kubernetes.io/projected/268afc1a-8785-48f7-9299-4d47d14f6ad2-kube-api-access-vkw5l\") pod 
\"cinder-scheduler-0\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.120734 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/268afc1a-8785-48f7-9299-4d47d14f6ad2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.120753 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-trm7n\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.120832 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-trm7n\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.120857 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-config-data\") pod \"cinder-scheduler-0\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.120877 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " 
pod="openstack/cinder-scheduler-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.120907 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.120930 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-config\") pod \"dnsmasq-dns-5c9776ccc5-trm7n\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.120945 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbxtj\" (UniqueName: \"kubernetes.io/projected/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-kube-api-access-nbxtj\") pod \"dnsmasq-dns-5c9776ccc5-trm7n\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.122103 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.123088 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/268afc1a-8785-48f7-9299-4d47d14f6ad2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.144093 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-config-data-custom\") pod 
\"cinder-scheduler-0\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.144651 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.145676 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-config-data\") pod \"cinder-scheduler-0\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.145997 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.156202 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkw5l\" (UniqueName: \"kubernetes.io/projected/268afc1a-8785-48f7-9299-4d47d14f6ad2-kube-api-access-vkw5l\") pod \"cinder-scheduler-0\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.168730 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-scripts\") pod \"cinder-scheduler-0\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.223998 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/132b8971-5eed-4201-87b2-0a235592346e-etc-machine-id\") pod 
\"cinder-api-0\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.224904 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-trm7n\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.224962 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-config\") pod \"dnsmasq-dns-5c9776ccc5-trm7n\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.224991 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbxtj\" (UniqueName: \"kubernetes.io/projected/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-kube-api-access-nbxtj\") pod \"dnsmasq-dns-5c9776ccc5-trm7n\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.225033 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-config-data-custom\") pod \"cinder-api-0\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.225081 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-config-data\") pod \"cinder-api-0\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " 
pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.225110 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.225141 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-trm7n\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.225197 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-trm7n\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.225256 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-scripts\") pod \"cinder-api-0\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.225303 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-trm7n\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.225348 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74krb\" (UniqueName: \"kubernetes.io/projected/132b8971-5eed-4201-87b2-0a235592346e-kube-api-access-74krb\") pod \"cinder-api-0\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.225367 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/132b8971-5eed-4201-87b2-0a235592346e-logs\") pod \"cinder-api-0\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.226003 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-trm7n\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.227085 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-trm7n\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.227695 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-config\") pod \"dnsmasq-dns-5c9776ccc5-trm7n\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.228333 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-trm7n\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.228721 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-trm7n\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.262302 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbxtj\" (UniqueName: \"kubernetes.io/projected/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-kube-api-access-nbxtj\") pod \"dnsmasq-dns-5c9776ccc5-trm7n\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.298247 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.327608 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-config-data-custom\") pod \"cinder-api-0\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.327698 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-config-data\") pod \"cinder-api-0\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.327728 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.327812 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-scripts\") pod \"cinder-api-0\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.327879 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/132b8971-5eed-4201-87b2-0a235592346e-logs\") pod \"cinder-api-0\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.327902 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74krb\" (UniqueName: 
\"kubernetes.io/projected/132b8971-5eed-4201-87b2-0a235592346e-kube-api-access-74krb\") pod \"cinder-api-0\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.328006 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/132b8971-5eed-4201-87b2-0a235592346e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.328135 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/132b8971-5eed-4201-87b2-0a235592346e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.341198 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-config-data-custom\") pod \"cinder-api-0\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.349222 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/132b8971-5eed-4201-87b2-0a235592346e-logs\") pod \"cinder-api-0\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.349935 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.356295 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-config-data\") pod \"cinder-api-0\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.361292 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-scripts\") pod \"cinder-api-0\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.364717 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.374007 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74krb\" (UniqueName: \"kubernetes.io/projected/132b8971-5eed-4201-87b2-0a235592346e-kube-api-access-74krb\") pod \"cinder-api-0\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.492451 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.535786 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-ovsdbserver-nb\") pod \"29b494b4-557b-4469-b18f-bef8d24e73b7\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.535851 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-config\") pod \"29b494b4-557b-4469-b18f-bef8d24e73b7\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.535916 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-ovsdbserver-sb\") pod \"29b494b4-557b-4469-b18f-bef8d24e73b7\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.535947 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-dns-svc\") pod \"29b494b4-557b-4469-b18f-bef8d24e73b7\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.535975 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pdvd\" (UniqueName: \"kubernetes.io/projected/29b494b4-557b-4469-b18f-bef8d24e73b7-kube-api-access-8pdvd\") pod \"29b494b4-557b-4469-b18f-bef8d24e73b7\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.536082 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-dns-swift-storage-0\") pod \"29b494b4-557b-4469-b18f-bef8d24e73b7\" (UID: \"29b494b4-557b-4469-b18f-bef8d24e73b7\") " Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.546451 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29b494b4-557b-4469-b18f-bef8d24e73b7-kube-api-access-8pdvd" (OuterVolumeSpecName: "kube-api-access-8pdvd") pod "29b494b4-557b-4469-b18f-bef8d24e73b7" (UID: "29b494b4-557b-4469-b18f-bef8d24e73b7"). InnerVolumeSpecName "kube-api-access-8pdvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.587972 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "29b494b4-557b-4469-b18f-bef8d24e73b7" (UID: "29b494b4-557b-4469-b18f-bef8d24e73b7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.589315 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "29b494b4-557b-4469-b18f-bef8d24e73b7" (UID: "29b494b4-557b-4469-b18f-bef8d24e73b7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.606733 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.612835 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "29b494b4-557b-4469-b18f-bef8d24e73b7" (UID: "29b494b4-557b-4469-b18f-bef8d24e73b7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.621693 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-config" (OuterVolumeSpecName: "config") pod "29b494b4-557b-4469-b18f-bef8d24e73b7" (UID: "29b494b4-557b-4469-b18f-bef8d24e73b7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.637675 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "29b494b4-557b-4469-b18f-bef8d24e73b7" (UID: "29b494b4-557b-4469-b18f-bef8d24e73b7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.644786 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.644815 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.644825 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.644833 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.644841 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b494b4-557b-4469-b18f-bef8d24e73b7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.644849 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pdvd\" (UniqueName: \"kubernetes.io/projected/29b494b4-557b-4469-b18f-bef8d24e73b7-kube-api-access-8pdvd\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.690564 4841 generic.go:334] "Generic (PLEG): container finished" podID="29b494b4-557b-4469-b18f-bef8d24e73b7" containerID="daf3d003cbea5e5f4bbf0deb461882e6ffca7a97a200eda30b0aa2fa7486a580" exitCode=0 Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.691430 4841 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.692350 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" event={"ID":"29b494b4-557b-4469-b18f-bef8d24e73b7","Type":"ContainerDied","Data":"daf3d003cbea5e5f4bbf0deb461882e6ffca7a97a200eda30b0aa2fa7486a580"} Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.692408 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-vwdxx" event={"ID":"29b494b4-557b-4469-b18f-bef8d24e73b7","Type":"ContainerDied","Data":"1975cf69aa73660ac988bb26ace66723b9ae10c107c60125c16883024e372562"} Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.692431 4841 scope.go:117] "RemoveContainer" containerID="daf3d003cbea5e5f4bbf0deb461882e6ffca7a97a200eda30b0aa2fa7486a580" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.736626 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-vwdxx"] Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.746645 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-vwdxx"] Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.754745 4841 scope.go:117] "RemoveContainer" containerID="bb5867b2bc0b4568bf07e4b1600ffb787dd00385d4e1363265a65efaa7d27ed1" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.784832 4841 scope.go:117] "RemoveContainer" containerID="daf3d003cbea5e5f4bbf0deb461882e6ffca7a97a200eda30b0aa2fa7486a580" Mar 13 09:32:22 crc kubenswrapper[4841]: E0313 09:32:22.785703 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daf3d003cbea5e5f4bbf0deb461882e6ffca7a97a200eda30b0aa2fa7486a580\": container with ID starting with daf3d003cbea5e5f4bbf0deb461882e6ffca7a97a200eda30b0aa2fa7486a580 not found: ID does not exist" 
containerID="daf3d003cbea5e5f4bbf0deb461882e6ffca7a97a200eda30b0aa2fa7486a580" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.785736 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daf3d003cbea5e5f4bbf0deb461882e6ffca7a97a200eda30b0aa2fa7486a580"} err="failed to get container status \"daf3d003cbea5e5f4bbf0deb461882e6ffca7a97a200eda30b0aa2fa7486a580\": rpc error: code = NotFound desc = could not find container \"daf3d003cbea5e5f4bbf0deb461882e6ffca7a97a200eda30b0aa2fa7486a580\": container with ID starting with daf3d003cbea5e5f4bbf0deb461882e6ffca7a97a200eda30b0aa2fa7486a580 not found: ID does not exist" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.785776 4841 scope.go:117] "RemoveContainer" containerID="bb5867b2bc0b4568bf07e4b1600ffb787dd00385d4e1363265a65efaa7d27ed1" Mar 13 09:32:22 crc kubenswrapper[4841]: E0313 09:32:22.786545 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb5867b2bc0b4568bf07e4b1600ffb787dd00385d4e1363265a65efaa7d27ed1\": container with ID starting with bb5867b2bc0b4568bf07e4b1600ffb787dd00385d4e1363265a65efaa7d27ed1 not found: ID does not exist" containerID="bb5867b2bc0b4568bf07e4b1600ffb787dd00385d4e1363265a65efaa7d27ed1" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.786571 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb5867b2bc0b4568bf07e4b1600ffb787dd00385d4e1363265a65efaa7d27ed1"} err="failed to get container status \"bb5867b2bc0b4568bf07e4b1600ffb787dd00385d4e1363265a65efaa7d27ed1\": rpc error: code = NotFound desc = could not find container \"bb5867b2bc0b4568bf07e4b1600ffb787dd00385d4e1363265a65efaa7d27ed1\": container with ID starting with bb5867b2bc0b4568bf07e4b1600ffb787dd00385d4e1363265a65efaa7d27ed1 not found: ID does not exist" Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.807381 4841 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-trm7n"] Mar 13 09:32:22 crc kubenswrapper[4841]: I0313 09:32:22.954441 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 09:32:23 crc kubenswrapper[4841]: I0313 09:32:23.047442 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 09:32:23 crc kubenswrapper[4841]: W0313 09:32:23.093058 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod132b8971_5eed_4201_87b2_0a235592346e.slice/crio-679a261fa0b1a461d821c16d49ac3c585743b163795ef42b8e790cca1a391663 WatchSource:0}: Error finding container 679a261fa0b1a461d821c16d49ac3c585743b163795ef42b8e790cca1a391663: Status 404 returned error can't find the container with id 679a261fa0b1a461d821c16d49ac3c585743b163795ef42b8e790cca1a391663 Mar 13 09:32:23 crc kubenswrapper[4841]: I0313 09:32:23.718052 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"268afc1a-8785-48f7-9299-4d47d14f6ad2","Type":"ContainerStarted","Data":"85e3ab1a2cf9bda2c482e4e0e40c2cbcb50c2b2ea68eb49b9c07f529bdd337d8"} Mar 13 09:32:23 crc kubenswrapper[4841]: I0313 09:32:23.721098 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"132b8971-5eed-4201-87b2-0a235592346e","Type":"ContainerStarted","Data":"679a261fa0b1a461d821c16d49ac3c585743b163795ef42b8e790cca1a391663"} Mar 13 09:32:23 crc kubenswrapper[4841]: I0313 09:32:23.724621 4841 generic.go:334] "Generic (PLEG): container finished" podID="f59c46bc-54c2-4d96-b12e-fb39ff4a7456" containerID="9598417d7c71caf6981889518592c0779ad80e9b50513981a594c0d81f777983" exitCode=0 Mar 13 09:32:23 crc kubenswrapper[4841]: I0313 09:32:23.725175 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" 
event={"ID":"f59c46bc-54c2-4d96-b12e-fb39ff4a7456","Type":"ContainerDied","Data":"9598417d7c71caf6981889518592c0779ad80e9b50513981a594c0d81f777983"} Mar 13 09:32:23 crc kubenswrapper[4841]: I0313 09:32:23.725219 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" event={"ID":"f59c46bc-54c2-4d96-b12e-fb39ff4a7456","Type":"ContainerStarted","Data":"3dcaec3acaf47a2dcd79b757115fba319ee79a407965c906018caf475b37707c"} Mar 13 09:32:24 crc kubenswrapper[4841]: I0313 09:32:24.010225 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29b494b4-557b-4469-b18f-bef8d24e73b7" path="/var/lib/kubelet/pods/29b494b4-557b-4469-b18f-bef8d24e73b7/volumes" Mar 13 09:32:24 crc kubenswrapper[4841]: I0313 09:32:24.152350 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:24 crc kubenswrapper[4841]: I0313 09:32:24.836495 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"268afc1a-8785-48f7-9299-4d47d14f6ad2","Type":"ContainerStarted","Data":"da2f96cba392b550a0f2b4b3b556d5b9d3cdbed09e4963a7a6daf1732346d106"} Mar 13 09:32:24 crc kubenswrapper[4841]: I0313 09:32:24.877524 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"132b8971-5eed-4201-87b2-0a235592346e","Type":"ContainerStarted","Data":"f998a6e1c5548d2edd021a0366e26a564ba155a63dbe02780135d8de17d732b2"} Mar 13 09:32:24 crc kubenswrapper[4841]: I0313 09:32:24.877570 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"132b8971-5eed-4201-87b2-0a235592346e","Type":"ContainerStarted","Data":"b1a17a47814e62941e6104a2dcd64c39b95f498c5e2a7b1b89cb6f69452b7070"} Mar 13 09:32:24 crc kubenswrapper[4841]: I0313 09:32:24.878660 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 13 09:32:24 crc 
kubenswrapper[4841]: I0313 09:32:24.916620 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" event={"ID":"f59c46bc-54c2-4d96-b12e-fb39ff4a7456","Type":"ContainerStarted","Data":"5663b847f2f4a88c65178ec28d4db45c210f4741b15739977833145ce605fe7d"} Mar 13 09:32:24 crc kubenswrapper[4841]: I0313 09:32:24.917745 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:24 crc kubenswrapper[4841]: I0313 09:32:24.923366 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.923346839 podStartE2EDuration="2.923346839s" podCreationTimestamp="2026-03-13 09:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:32:24.912271571 +0000 UTC m=+1227.642171762" watchObservedRunningTime="2026-03-13 09:32:24.923346839 +0000 UTC m=+1227.653247030" Mar 13 09:32:24 crc kubenswrapper[4841]: I0313 09:32:24.958634 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" podStartSLOduration=3.958614867 podStartE2EDuration="3.958614867s" podCreationTimestamp="2026-03-13 09:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:32:24.957600226 +0000 UTC m=+1227.687500417" watchObservedRunningTime="2026-03-13 09:32:24.958614867 +0000 UTC m=+1227.688515058" Mar 13 09:32:24 crc kubenswrapper[4841]: I0313 09:32:24.993465 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 09:32:25 crc kubenswrapper[4841]: I0313 09:32:25.910417 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5864fb6bd-dr6nw" Mar 13 09:32:25 crc kubenswrapper[4841]: I0313 09:32:25.925390 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"268afc1a-8785-48f7-9299-4d47d14f6ad2","Type":"ContainerStarted","Data":"88a82266bb23a00632842b9d58af52def4c796060d9b3bbfd9aebd9d9de63d03"} Mar 13 09:32:25 crc kubenswrapper[4841]: I0313 09:32:25.975574 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.184533703 podStartE2EDuration="4.975553071s" podCreationTimestamp="2026-03-13 09:32:21 +0000 UTC" firstStartedPulling="2026-03-13 09:32:22.966519316 +0000 UTC m=+1225.696419507" lastFinishedPulling="2026-03-13 09:32:23.757538684 +0000 UTC m=+1226.487438875" observedRunningTime="2026-03-13 09:32:25.971220418 +0000 UTC m=+1228.701120629" watchObservedRunningTime="2026-03-13 09:32:25.975553071 +0000 UTC m=+1228.705453262" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.320662 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-798495c9df-7c5cf"] Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.321287 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-798495c9df-7c5cf" podUID="2a0f5823-2dff-4614-974e-7ebdc083a570" containerName="neutron-api" containerID="cri-o://6b7b3f5d819e23d5151c75c043b419ac10c089047b3e246534c2b6c68881ef5a" gracePeriod=30 Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.321678 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-798495c9df-7c5cf" podUID="2a0f5823-2dff-4614-974e-7ebdc083a570" containerName="neutron-httpd" containerID="cri-o://d99831d4df233f3a2e7ac3ef1daebd6d22819dd557782d994774e5f87b3469ff" gracePeriod=30 Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.327590 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.335201 4841 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/neutron-798495c9df-7c5cf" podUID="2a0f5823-2dff-4614-974e-7ebdc083a570" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9696/\": EOF" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.355802 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6d768747c7-2ssnn"] Mar 13 09:32:26 crc kubenswrapper[4841]: E0313 09:32:26.356156 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29b494b4-557b-4469-b18f-bef8d24e73b7" containerName="dnsmasq-dns" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.356171 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="29b494b4-557b-4469-b18f-bef8d24e73b7" containerName="dnsmasq-dns" Mar 13 09:32:26 crc kubenswrapper[4841]: E0313 09:32:26.356196 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29b494b4-557b-4469-b18f-bef8d24e73b7" containerName="init" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.356203 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="29b494b4-557b-4469-b18f-bef8d24e73b7" containerName="init" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.356562 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="29b494b4-557b-4469-b18f-bef8d24e73b7" containerName="dnsmasq-dns" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.357449 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.411452 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.414045 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d768747c7-2ssnn"] Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.444137 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d209e4b8-27eb-4fea-ad65-807001e8638c-public-tls-certs\") pod \"neutron-6d768747c7-2ssnn\" (UID: \"d209e4b8-27eb-4fea-ad65-807001e8638c\") " pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.444193 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d209e4b8-27eb-4fea-ad65-807001e8638c-config\") pod \"neutron-6d768747c7-2ssnn\" (UID: \"d209e4b8-27eb-4fea-ad65-807001e8638c\") " pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.444233 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clnsd\" (UniqueName: \"kubernetes.io/projected/d209e4b8-27eb-4fea-ad65-807001e8638c-kube-api-access-clnsd\") pod \"neutron-6d768747c7-2ssnn\" (UID: \"d209e4b8-27eb-4fea-ad65-807001e8638c\") " pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.444292 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d209e4b8-27eb-4fea-ad65-807001e8638c-combined-ca-bundle\") pod \"neutron-6d768747c7-2ssnn\" (UID: \"d209e4b8-27eb-4fea-ad65-807001e8638c\") " 
pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.444329 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d209e4b8-27eb-4fea-ad65-807001e8638c-internal-tls-certs\") pod \"neutron-6d768747c7-2ssnn\" (UID: \"d209e4b8-27eb-4fea-ad65-807001e8638c\") " pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.444369 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d209e4b8-27eb-4fea-ad65-807001e8638c-ovndb-tls-certs\") pod \"neutron-6d768747c7-2ssnn\" (UID: \"d209e4b8-27eb-4fea-ad65-807001e8638c\") " pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.444542 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d209e4b8-27eb-4fea-ad65-807001e8638c-httpd-config\") pod \"neutron-6d768747c7-2ssnn\" (UID: \"d209e4b8-27eb-4fea-ad65-807001e8638c\") " pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.457949 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5dbfbd46f8-tjjrf" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.529960 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6f69977cdb-7mdds"] Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.546283 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clnsd\" (UniqueName: \"kubernetes.io/projected/d209e4b8-27eb-4fea-ad65-807001e8638c-kube-api-access-clnsd\") pod \"neutron-6d768747c7-2ssnn\" (UID: \"d209e4b8-27eb-4fea-ad65-807001e8638c\") " pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 
crc kubenswrapper[4841]: I0313 09:32:26.546376 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d209e4b8-27eb-4fea-ad65-807001e8638c-combined-ca-bundle\") pod \"neutron-6d768747c7-2ssnn\" (UID: \"d209e4b8-27eb-4fea-ad65-807001e8638c\") " pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.546450 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d209e4b8-27eb-4fea-ad65-807001e8638c-internal-tls-certs\") pod \"neutron-6d768747c7-2ssnn\" (UID: \"d209e4b8-27eb-4fea-ad65-807001e8638c\") " pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.546522 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d209e4b8-27eb-4fea-ad65-807001e8638c-ovndb-tls-certs\") pod \"neutron-6d768747c7-2ssnn\" (UID: \"d209e4b8-27eb-4fea-ad65-807001e8638c\") " pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.546559 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d209e4b8-27eb-4fea-ad65-807001e8638c-httpd-config\") pod \"neutron-6d768747c7-2ssnn\" (UID: \"d209e4b8-27eb-4fea-ad65-807001e8638c\") " pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.546644 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d209e4b8-27eb-4fea-ad65-807001e8638c-public-tls-certs\") pod \"neutron-6d768747c7-2ssnn\" (UID: \"d209e4b8-27eb-4fea-ad65-807001e8638c\") " pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.546705 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d209e4b8-27eb-4fea-ad65-807001e8638c-config\") pod \"neutron-6d768747c7-2ssnn\" (UID: \"d209e4b8-27eb-4fea-ad65-807001e8638c\") " pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.555193 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d209e4b8-27eb-4fea-ad65-807001e8638c-combined-ca-bundle\") pod \"neutron-6d768747c7-2ssnn\" (UID: \"d209e4b8-27eb-4fea-ad65-807001e8638c\") " pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.556015 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d209e4b8-27eb-4fea-ad65-807001e8638c-public-tls-certs\") pod \"neutron-6d768747c7-2ssnn\" (UID: \"d209e4b8-27eb-4fea-ad65-807001e8638c\") " pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.556339 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d209e4b8-27eb-4fea-ad65-807001e8638c-config\") pod \"neutron-6d768747c7-2ssnn\" (UID: \"d209e4b8-27eb-4fea-ad65-807001e8638c\") " pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.560562 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d209e4b8-27eb-4fea-ad65-807001e8638c-internal-tls-certs\") pod \"neutron-6d768747c7-2ssnn\" (UID: \"d209e4b8-27eb-4fea-ad65-807001e8638c\") " pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.566983 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d209e4b8-27eb-4fea-ad65-807001e8638c-ovndb-tls-certs\") 
pod \"neutron-6d768747c7-2ssnn\" (UID: \"d209e4b8-27eb-4fea-ad65-807001e8638c\") " pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.572736 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d209e4b8-27eb-4fea-ad65-807001e8638c-httpd-config\") pod \"neutron-6d768747c7-2ssnn\" (UID: \"d209e4b8-27eb-4fea-ad65-807001e8638c\") " pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.573325 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clnsd\" (UniqueName: \"kubernetes.io/projected/d209e4b8-27eb-4fea-ad65-807001e8638c-kube-api-access-clnsd\") pod \"neutron-6d768747c7-2ssnn\" (UID: \"d209e4b8-27eb-4fea-ad65-807001e8638c\") " pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.696483 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.934553 4841 generic.go:334] "Generic (PLEG): container finished" podID="2a0f5823-2dff-4614-974e-7ebdc083a570" containerID="d99831d4df233f3a2e7ac3ef1daebd6d22819dd557782d994774e5f87b3469ff" exitCode=0 Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.935686 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="132b8971-5eed-4201-87b2-0a235592346e" containerName="cinder-api-log" containerID="cri-o://b1a17a47814e62941e6104a2dcd64c39b95f498c5e2a7b1b89cb6f69452b7070" gracePeriod=30 Mar 13 09:32:26 crc kubenswrapper[4841]: I0313 09:32:26.934636 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-798495c9df-7c5cf" event={"ID":"2a0f5823-2dff-4614-974e-7ebdc083a570","Type":"ContainerDied","Data":"d99831d4df233f3a2e7ac3ef1daebd6d22819dd557782d994774e5f87b3469ff"} Mar 13 09:32:26 crc 
kubenswrapper[4841]: I0313 09:32:26.936067 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="132b8971-5eed-4201-87b2-0a235592346e" containerName="cinder-api" containerID="cri-o://f998a6e1c5548d2edd021a0366e26a564ba155a63dbe02780135d8de17d732b2" gracePeriod=30 Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.298822 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.357302 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d768747c7-2ssnn"] Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.546240 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.572378 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-scripts\") pod \"132b8971-5eed-4201-87b2-0a235592346e\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.572471 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/132b8971-5eed-4201-87b2-0a235592346e-etc-machine-id\") pod \"132b8971-5eed-4201-87b2-0a235592346e\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.572510 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-config-data-custom\") pod \"132b8971-5eed-4201-87b2-0a235592346e\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.572566 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-combined-ca-bundle\") pod \"132b8971-5eed-4201-87b2-0a235592346e\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.572700 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74krb\" (UniqueName: \"kubernetes.io/projected/132b8971-5eed-4201-87b2-0a235592346e-kube-api-access-74krb\") pod \"132b8971-5eed-4201-87b2-0a235592346e\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.572767 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/132b8971-5eed-4201-87b2-0a235592346e-logs\") pod \"132b8971-5eed-4201-87b2-0a235592346e\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.572855 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-config-data\") pod \"132b8971-5eed-4201-87b2-0a235592346e\" (UID: \"132b8971-5eed-4201-87b2-0a235592346e\") " Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.574670 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/132b8971-5eed-4201-87b2-0a235592346e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "132b8971-5eed-4201-87b2-0a235592346e" (UID: "132b8971-5eed-4201-87b2-0a235592346e"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.575053 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/132b8971-5eed-4201-87b2-0a235592346e-logs" (OuterVolumeSpecName: "logs") pod "132b8971-5eed-4201-87b2-0a235592346e" (UID: "132b8971-5eed-4201-87b2-0a235592346e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.581735 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "132b8971-5eed-4201-87b2-0a235592346e" (UID: "132b8971-5eed-4201-87b2-0a235592346e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.584863 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-scripts" (OuterVolumeSpecName: "scripts") pod "132b8971-5eed-4201-87b2-0a235592346e" (UID: "132b8971-5eed-4201-87b2-0a235592346e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.586472 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/132b8971-5eed-4201-87b2-0a235592346e-kube-api-access-74krb" (OuterVolumeSpecName: "kube-api-access-74krb") pod "132b8971-5eed-4201-87b2-0a235592346e" (UID: "132b8971-5eed-4201-87b2-0a235592346e"). InnerVolumeSpecName "kube-api-access-74krb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.642413 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "132b8971-5eed-4201-87b2-0a235592346e" (UID: "132b8971-5eed-4201-87b2-0a235592346e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.675491 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.675533 4841 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/132b8971-5eed-4201-87b2-0a235592346e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.675549 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.675561 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.675600 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74krb\" (UniqueName: \"kubernetes.io/projected/132b8971-5eed-4201-87b2-0a235592346e-kube-api-access-74krb\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.675613 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/132b8971-5eed-4201-87b2-0a235592346e-logs\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.677385 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-config-data" (OuterVolumeSpecName: "config-data") pod "132b8971-5eed-4201-87b2-0a235592346e" (UID: "132b8971-5eed-4201-87b2-0a235592346e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.778050 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/132b8971-5eed-4201-87b2-0a235592346e-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.947404 4841 generic.go:334] "Generic (PLEG): container finished" podID="132b8971-5eed-4201-87b2-0a235592346e" containerID="f998a6e1c5548d2edd021a0366e26a564ba155a63dbe02780135d8de17d732b2" exitCode=0 Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.947453 4841 generic.go:334] "Generic (PLEG): container finished" podID="132b8971-5eed-4201-87b2-0a235592346e" containerID="b1a17a47814e62941e6104a2dcd64c39b95f498c5e2a7b1b89cb6f69452b7070" exitCode=143 Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.947500 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"132b8971-5eed-4201-87b2-0a235592346e","Type":"ContainerDied","Data":"f998a6e1c5548d2edd021a0366e26a564ba155a63dbe02780135d8de17d732b2"} Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.947545 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"132b8971-5eed-4201-87b2-0a235592346e","Type":"ContainerDied","Data":"b1a17a47814e62941e6104a2dcd64c39b95f498c5e2a7b1b89cb6f69452b7070"} Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.947560 4841 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-api-0" event={"ID":"132b8971-5eed-4201-87b2-0a235592346e","Type":"ContainerDied","Data":"679a261fa0b1a461d821c16d49ac3c585743b163795ef42b8e790cca1a391663"} Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.947579 4841 scope.go:117] "RemoveContainer" containerID="f998a6e1c5548d2edd021a0366e26a564ba155a63dbe02780135d8de17d732b2" Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.947735 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.970589 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d768747c7-2ssnn" event={"ID":"d209e4b8-27eb-4fea-ad65-807001e8638c","Type":"ContainerStarted","Data":"f16984f55005765d5c3399aba108a18fc0a7dbeba3bcc6188b8f1ef5cf0d9f8e"} Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.970625 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d768747c7-2ssnn" event={"ID":"d209e4b8-27eb-4fea-ad65-807001e8638c","Type":"ContainerStarted","Data":"9925734c2ad44b4ff59fcbb154f64a664ab8819ad9843f66f47f3debd1184bc4"} Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.970634 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d768747c7-2ssnn" event={"ID":"d209e4b8-27eb-4fea-ad65-807001e8638c","Type":"ContainerStarted","Data":"fcd2c07afa46601bc14f8d2b9bac7dc6901006316e26d3d0b2a7e3ad0e94ab4f"} Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.970758 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6f69977cdb-7mdds" podUID="ba25d9f9-4136-49e7-9016-a77627808014" containerName="barbican-api-log" containerID="cri-o://3063274d726b3acaccdf0b261116fbc7042e9b251eed4b00e63d2cf2ceb19f9f" gracePeriod=30 Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.970866 4841 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-api-6f69977cdb-7mdds" podUID="ba25d9f9-4136-49e7-9016-a77627808014" containerName="barbican-api" containerID="cri-o://4d5adf299cfa5d67f73de4fe6cbdc637423680eeefaa47d2efde069d260da422" gracePeriod=30 Mar 13 09:32:27 crc kubenswrapper[4841]: I0313 09:32:27.971011 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.000587 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6f69977cdb-7mdds" podUID="ba25d9f9-4136-49e7-9016-a77627808014" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": EOF" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.000994 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6f69977cdb-7mdds" podUID="ba25d9f9-4136-49e7-9016-a77627808014" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": EOF" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.019876 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6d768747c7-2ssnn" podStartSLOduration=2.019852578 podStartE2EDuration="2.019852578s" podCreationTimestamp="2026-03-13 09:32:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:32:28.007762998 +0000 UTC m=+1230.737663209" watchObservedRunningTime="2026-03-13 09:32:28.019852578 +0000 UTC m=+1230.749752769" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.031419 4841 scope.go:117] "RemoveContainer" containerID="b1a17a47814e62941e6104a2dcd64c39b95f498c5e2a7b1b89cb6f69452b7070" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.052377 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.066599 
4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.073146 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 13 09:32:28 crc kubenswrapper[4841]: E0313 09:32:28.073615 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="132b8971-5eed-4201-87b2-0a235592346e" containerName="cinder-api-log" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.073635 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="132b8971-5eed-4201-87b2-0a235592346e" containerName="cinder-api-log" Mar 13 09:32:28 crc kubenswrapper[4841]: E0313 09:32:28.073648 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="132b8971-5eed-4201-87b2-0a235592346e" containerName="cinder-api" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.073655 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="132b8971-5eed-4201-87b2-0a235592346e" containerName="cinder-api" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.073822 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="132b8971-5eed-4201-87b2-0a235592346e" containerName="cinder-api" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.073849 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="132b8971-5eed-4201-87b2-0a235592346e" containerName="cinder-api-log" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.074706 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.077839 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.077998 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.078114 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.088298 4841 scope.go:117] "RemoveContainer" containerID="f998a6e1c5548d2edd021a0366e26a564ba155a63dbe02780135d8de17d732b2" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.089348 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88dbe267-3d86-4bcd-8654-79392e0c502d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.089377 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88dbe267-3d86-4bcd-8654-79392e0c502d-scripts\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.089400 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88dbe267-3d86-4bcd-8654-79392e0c502d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.089422 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88dbe267-3d86-4bcd-8654-79392e0c502d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.089446 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88dbe267-3d86-4bcd-8654-79392e0c502d-config-data\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.089483 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvzgk\" (UniqueName: \"kubernetes.io/projected/88dbe267-3d86-4bcd-8654-79392e0c502d-kube-api-access-mvzgk\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.089528 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88dbe267-3d86-4bcd-8654-79392e0c502d-config-data-custom\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.089553 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88dbe267-3d86-4bcd-8654-79392e0c502d-logs\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.089570 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/88dbe267-3d86-4bcd-8654-79392e0c502d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: E0313 09:32:28.091874 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f998a6e1c5548d2edd021a0366e26a564ba155a63dbe02780135d8de17d732b2\": container with ID starting with f998a6e1c5548d2edd021a0366e26a564ba155a63dbe02780135d8de17d732b2 not found: ID does not exist" containerID="f998a6e1c5548d2edd021a0366e26a564ba155a63dbe02780135d8de17d732b2" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.091919 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f998a6e1c5548d2edd021a0366e26a564ba155a63dbe02780135d8de17d732b2"} err="failed to get container status \"f998a6e1c5548d2edd021a0366e26a564ba155a63dbe02780135d8de17d732b2\": rpc error: code = NotFound desc = could not find container \"f998a6e1c5548d2edd021a0366e26a564ba155a63dbe02780135d8de17d732b2\": container with ID starting with f998a6e1c5548d2edd021a0366e26a564ba155a63dbe02780135d8de17d732b2 not found: ID does not exist" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.091951 4841 scope.go:117] "RemoveContainer" containerID="b1a17a47814e62941e6104a2dcd64c39b95f498c5e2a7b1b89cb6f69452b7070" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.092768 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 09:32:28 crc kubenswrapper[4841]: E0313 09:32:28.095539 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1a17a47814e62941e6104a2dcd64c39b95f498c5e2a7b1b89cb6f69452b7070\": container with ID starting with b1a17a47814e62941e6104a2dcd64c39b95f498c5e2a7b1b89cb6f69452b7070 not found: ID does not exist" 
containerID="b1a17a47814e62941e6104a2dcd64c39b95f498c5e2a7b1b89cb6f69452b7070" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.095587 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1a17a47814e62941e6104a2dcd64c39b95f498c5e2a7b1b89cb6f69452b7070"} err="failed to get container status \"b1a17a47814e62941e6104a2dcd64c39b95f498c5e2a7b1b89cb6f69452b7070\": rpc error: code = NotFound desc = could not find container \"b1a17a47814e62941e6104a2dcd64c39b95f498c5e2a7b1b89cb6f69452b7070\": container with ID starting with b1a17a47814e62941e6104a2dcd64c39b95f498c5e2a7b1b89cb6f69452b7070 not found: ID does not exist" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.095616 4841 scope.go:117] "RemoveContainer" containerID="f998a6e1c5548d2edd021a0366e26a564ba155a63dbe02780135d8de17d732b2" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.096009 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f998a6e1c5548d2edd021a0366e26a564ba155a63dbe02780135d8de17d732b2"} err="failed to get container status \"f998a6e1c5548d2edd021a0366e26a564ba155a63dbe02780135d8de17d732b2\": rpc error: code = NotFound desc = could not find container \"f998a6e1c5548d2edd021a0366e26a564ba155a63dbe02780135d8de17d732b2\": container with ID starting with f998a6e1c5548d2edd021a0366e26a564ba155a63dbe02780135d8de17d732b2 not found: ID does not exist" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.096047 4841 scope.go:117] "RemoveContainer" containerID="b1a17a47814e62941e6104a2dcd64c39b95f498c5e2a7b1b89cb6f69452b7070" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.096249 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1a17a47814e62941e6104a2dcd64c39b95f498c5e2a7b1b89cb6f69452b7070"} err="failed to get container status \"b1a17a47814e62941e6104a2dcd64c39b95f498c5e2a7b1b89cb6f69452b7070\": rpc error: code = NotFound desc = could 
not find container \"b1a17a47814e62941e6104a2dcd64c39b95f498c5e2a7b1b89cb6f69452b7070\": container with ID starting with b1a17a47814e62941e6104a2dcd64c39b95f498c5e2a7b1b89cb6f69452b7070 not found: ID does not exist" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.114549 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-798495c9df-7c5cf" podUID="2a0f5823-2dff-4614-974e-7ebdc083a570" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9696/\": dial tcp 10.217.0.158:9696: connect: connection refused" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.191394 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88dbe267-3d86-4bcd-8654-79392e0c502d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.191453 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88dbe267-3d86-4bcd-8654-79392e0c502d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.191484 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88dbe267-3d86-4bcd-8654-79392e0c502d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.191500 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88dbe267-3d86-4bcd-8654-79392e0c502d-config-data\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 
09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.191662 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvzgk\" (UniqueName: \"kubernetes.io/projected/88dbe267-3d86-4bcd-8654-79392e0c502d-kube-api-access-mvzgk\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.191810 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88dbe267-3d86-4bcd-8654-79392e0c502d-config-data-custom\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.191869 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88dbe267-3d86-4bcd-8654-79392e0c502d-logs\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.191913 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88dbe267-3d86-4bcd-8654-79392e0c502d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.192092 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88dbe267-3d86-4bcd-8654-79392e0c502d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.192135 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/88dbe267-3d86-4bcd-8654-79392e0c502d-scripts\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.192573 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88dbe267-3d86-4bcd-8654-79392e0c502d-logs\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.200398 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88dbe267-3d86-4bcd-8654-79392e0c502d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.200748 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88dbe267-3d86-4bcd-8654-79392e0c502d-scripts\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.202428 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88dbe267-3d86-4bcd-8654-79392e0c502d-config-data\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.202947 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88dbe267-3d86-4bcd-8654-79392e0c502d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.207917 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88dbe267-3d86-4bcd-8654-79392e0c502d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.208364 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88dbe267-3d86-4bcd-8654-79392e0c502d-config-data-custom\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.216704 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvzgk\" (UniqueName: \"kubernetes.io/projected/88dbe267-3d86-4bcd-8654-79392e0c502d-kube-api-access-mvzgk\") pod \"cinder-api-0\" (UID: \"88dbe267-3d86-4bcd-8654-79392e0c502d\") " pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.433698 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.924367 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 09:32:28 crc kubenswrapper[4841]: W0313 09:32:28.930227 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88dbe267_3d86_4bcd_8654_79392e0c502d.slice/crio-83723df90cc8c96fb7d99c109bfa4e036b007596f8559c9ed992216f32d28635 WatchSource:0}: Error finding container 83723df90cc8c96fb7d99c109bfa4e036b007596f8559c9ed992216f32d28635: Status 404 returned error can't find the container with id 83723df90cc8c96fb7d99c109bfa4e036b007596f8559c9ed992216f32d28635 Mar 13 09:32:28 crc kubenswrapper[4841]: I0313 09:32:28.999715 4841 generic.go:334] "Generic (PLEG): container finished" podID="ba25d9f9-4136-49e7-9016-a77627808014" containerID="3063274d726b3acaccdf0b261116fbc7042e9b251eed4b00e63d2cf2ceb19f9f" exitCode=143 Mar 13 09:32:29 crc kubenswrapper[4841]: I0313 09:32:28.999793 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f69977cdb-7mdds" event={"ID":"ba25d9f9-4136-49e7-9016-a77627808014","Type":"ContainerDied","Data":"3063274d726b3acaccdf0b261116fbc7042e9b251eed4b00e63d2cf2ceb19f9f"} Mar 13 09:32:29 crc kubenswrapper[4841]: I0313 09:32:29.002096 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"88dbe267-3d86-4bcd-8654-79392e0c502d","Type":"ContainerStarted","Data":"83723df90cc8c96fb7d99c109bfa4e036b007596f8559c9ed992216f32d28635"} Mar 13 09:32:29 crc kubenswrapper[4841]: I0313 09:32:29.911419 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:32:29 crc kubenswrapper[4841]: I0313 09:32:29.917004 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-internal-tls-certs\") pod \"2a0f5823-2dff-4614-974e-7ebdc083a570\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " Mar 13 09:32:29 crc kubenswrapper[4841]: I0313 09:32:29.917070 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-combined-ca-bundle\") pod \"2a0f5823-2dff-4614-974e-7ebdc083a570\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " Mar 13 09:32:29 crc kubenswrapper[4841]: I0313 09:32:29.917114 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-config\") pod \"2a0f5823-2dff-4614-974e-7ebdc083a570\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " Mar 13 09:32:29 crc kubenswrapper[4841]: I0313 09:32:29.917153 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-httpd-config\") pod \"2a0f5823-2dff-4614-974e-7ebdc083a570\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " Mar 13 09:32:29 crc kubenswrapper[4841]: I0313 09:32:29.917196 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-ovndb-tls-certs\") pod \"2a0f5823-2dff-4614-974e-7ebdc083a570\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " Mar 13 09:32:29 crc kubenswrapper[4841]: I0313 09:32:29.917247 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sbg2\" 
(UniqueName: \"kubernetes.io/projected/2a0f5823-2dff-4614-974e-7ebdc083a570-kube-api-access-9sbg2\") pod \"2a0f5823-2dff-4614-974e-7ebdc083a570\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " Mar 13 09:32:29 crc kubenswrapper[4841]: I0313 09:32:29.917292 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-public-tls-certs\") pod \"2a0f5823-2dff-4614-974e-7ebdc083a570\" (UID: \"2a0f5823-2dff-4614-974e-7ebdc083a570\") " Mar 13 09:32:29 crc kubenswrapper[4841]: I0313 09:32:29.922221 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2a0f5823-2dff-4614-974e-7ebdc083a570" (UID: "2a0f5823-2dff-4614-974e-7ebdc083a570"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:29 crc kubenswrapper[4841]: I0313 09:32:29.926817 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a0f5823-2dff-4614-974e-7ebdc083a570-kube-api-access-9sbg2" (OuterVolumeSpecName: "kube-api-access-9sbg2") pod "2a0f5823-2dff-4614-974e-7ebdc083a570" (UID: "2a0f5823-2dff-4614-974e-7ebdc083a570"). InnerVolumeSpecName "kube-api-access-9sbg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.000665 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2a0f5823-2dff-4614-974e-7ebdc083a570" (UID: "2a0f5823-2dff-4614-974e-7ebdc083a570"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.003347 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-config" (OuterVolumeSpecName: "config") pod "2a0f5823-2dff-4614-974e-7ebdc083a570" (UID: "2a0f5823-2dff-4614-974e-7ebdc083a570"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.003515 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2a0f5823-2dff-4614-974e-7ebdc083a570" (UID: "2a0f5823-2dff-4614-974e-7ebdc083a570"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.010935 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="132b8971-5eed-4201-87b2-0a235592346e" path="/var/lib/kubelet/pods/132b8971-5eed-4201-87b2-0a235592346e/volumes" Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.014857 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"88dbe267-3d86-4bcd-8654-79392e0c502d","Type":"ContainerStarted","Data":"35ad9f0cc2813c2ae56a0be692f3aafa64e8273fc5c423a38f0fac8b103022f0"} Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.018557 4841 generic.go:334] "Generic (PLEG): container finished" podID="2a0f5823-2dff-4614-974e-7ebdc083a570" containerID="6b7b3f5d819e23d5151c75c043b419ac10c089047b3e246534c2b6c68881ef5a" exitCode=0 Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.018694 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 
09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.018715 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.018727 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.018736 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sbg2\" (UniqueName: \"kubernetes.io/projected/2a0f5823-2dff-4614-974e-7ebdc083a570-kube-api-access-9sbg2\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.018745 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.019642 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-798495c9df-7c5cf" Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.019759 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a0f5823-2dff-4614-974e-7ebdc083a570" (UID: "2a0f5823-2dff-4614-974e-7ebdc083a570"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.019824 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-798495c9df-7c5cf" event={"ID":"2a0f5823-2dff-4614-974e-7ebdc083a570","Type":"ContainerDied","Data":"6b7b3f5d819e23d5151c75c043b419ac10c089047b3e246534c2b6c68881ef5a"} Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.019849 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-798495c9df-7c5cf" event={"ID":"2a0f5823-2dff-4614-974e-7ebdc083a570","Type":"ContainerDied","Data":"52441eb6173774442e65a08ceeeb90ba11936b4e9ddda6a2fb0eb4b9f97a4ff7"} Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.019865 4841 scope.go:117] "RemoveContainer" containerID="d99831d4df233f3a2e7ac3ef1daebd6d22819dd557782d994774e5f87b3469ff" Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.020947 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "2a0f5823-2dff-4614-974e-7ebdc083a570" (UID: "2a0f5823-2dff-4614-974e-7ebdc083a570"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.052024 4841 scope.go:117] "RemoveContainer" containerID="6b7b3f5d819e23d5151c75c043b419ac10c089047b3e246534c2b6c68881ef5a" Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.070610 4841 scope.go:117] "RemoveContainer" containerID="d99831d4df233f3a2e7ac3ef1daebd6d22819dd557782d994774e5f87b3469ff" Mar 13 09:32:30 crc kubenswrapper[4841]: E0313 09:32:30.071001 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d99831d4df233f3a2e7ac3ef1daebd6d22819dd557782d994774e5f87b3469ff\": container with ID starting with d99831d4df233f3a2e7ac3ef1daebd6d22819dd557782d994774e5f87b3469ff not found: ID does not exist" containerID="d99831d4df233f3a2e7ac3ef1daebd6d22819dd557782d994774e5f87b3469ff" Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.071034 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d99831d4df233f3a2e7ac3ef1daebd6d22819dd557782d994774e5f87b3469ff"} err="failed to get container status \"d99831d4df233f3a2e7ac3ef1daebd6d22819dd557782d994774e5f87b3469ff\": rpc error: code = NotFound desc = could not find container \"d99831d4df233f3a2e7ac3ef1daebd6d22819dd557782d994774e5f87b3469ff\": container with ID starting with d99831d4df233f3a2e7ac3ef1daebd6d22819dd557782d994774e5f87b3469ff not found: ID does not exist" Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.071058 4841 scope.go:117] "RemoveContainer" containerID="6b7b3f5d819e23d5151c75c043b419ac10c089047b3e246534c2b6c68881ef5a" Mar 13 09:32:30 crc kubenswrapper[4841]: E0313 09:32:30.071472 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b7b3f5d819e23d5151c75c043b419ac10c089047b3e246534c2b6c68881ef5a\": container with ID starting with 
6b7b3f5d819e23d5151c75c043b419ac10c089047b3e246534c2b6c68881ef5a not found: ID does not exist" containerID="6b7b3f5d819e23d5151c75c043b419ac10c089047b3e246534c2b6c68881ef5a" Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.071504 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b7b3f5d819e23d5151c75c043b419ac10c089047b3e246534c2b6c68881ef5a"} err="failed to get container status \"6b7b3f5d819e23d5151c75c043b419ac10c089047b3e246534c2b6c68881ef5a\": rpc error: code = NotFound desc = could not find container \"6b7b3f5d819e23d5151c75c043b419ac10c089047b3e246534c2b6c68881ef5a\": container with ID starting with 6b7b3f5d819e23d5151c75c043b419ac10c089047b3e246534c2b6c68881ef5a not found: ID does not exist" Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.120030 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.120280 4841 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a0f5823-2dff-4614-974e-7ebdc083a570-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.377113 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-798495c9df-7c5cf"] Mar 13 09:32:30 crc kubenswrapper[4841]: I0313 09:32:30.388851 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-798495c9df-7c5cf"] Mar 13 09:32:31 crc kubenswrapper[4841]: I0313 09:32:31.042587 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"88dbe267-3d86-4bcd-8654-79392e0c502d","Type":"ContainerStarted","Data":"73b41cef81437cfca28c969e5026323428f536aab9bdd79ebd3984f8ebf4fe15"} Mar 13 09:32:31 crc kubenswrapper[4841]: I0313 09:32:31.043100 4841 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 13 09:32:31 crc kubenswrapper[4841]: I0313 09:32:31.088346 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.08831838 podStartE2EDuration="3.08831838s" podCreationTimestamp="2026-03-13 09:32:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:32:31.071414493 +0000 UTC m=+1233.801314724" watchObservedRunningTime="2026-03-13 09:32:31.08831838 +0000 UTC m=+1233.818218611" Mar 13 09:32:32 crc kubenswrapper[4841]: I0313 09:32:32.010675 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a0f5823-2dff-4614-974e-7ebdc083a570" path="/var/lib/kubelet/pods/2a0f5823-2dff-4614-974e-7ebdc083a570/volumes" Mar 13 09:32:32 crc kubenswrapper[4841]: I0313 09:32:32.366888 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:32 crc kubenswrapper[4841]: I0313 09:32:32.419259 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-9svmt"] Mar 13 09:32:32 crc kubenswrapper[4841]: I0313 09:32:32.419550 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-9svmt" podUID="f24c7226-6207-4858-8369-e1496280d721" containerName="dnsmasq-dns" containerID="cri-o://a568b41d415d7d2d78a4d3d15e9ece9c4cbf418823c37217498923afa94cb0d3" gracePeriod=10 Mar 13 09:32:32 crc kubenswrapper[4841]: I0313 09:32:32.459675 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f69977cdb-7mdds" podUID="ba25d9f9-4136-49e7-9016-a77627808014" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:34244->10.217.0.165:9311: read: connection reset by peer" 
Mar 13 09:32:32 crc kubenswrapper[4841]: I0313 09:32:32.459728 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f69977cdb-7mdds" podUID="ba25d9f9-4136-49e7-9016-a77627808014" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:34252->10.217.0.165:9311: read: connection reset by peer" Mar 13 09:32:32 crc kubenswrapper[4841]: I0313 09:32:32.629562 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 13 09:32:32 crc kubenswrapper[4841]: I0313 09:32:32.725120 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.010103 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6969d7c4d8-xrfbc" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.061230 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6969d7c4d8-xrfbc" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.065752 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.067009 4841 generic.go:334] "Generic (PLEG): container finished" podID="f24c7226-6207-4858-8369-e1496280d721" containerID="a568b41d415d7d2d78a4d3d15e9ece9c4cbf418823c37217498923afa94cb0d3" exitCode=0 Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.067087 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-9svmt" event={"ID":"f24c7226-6207-4858-8369-e1496280d721","Type":"ContainerDied","Data":"a568b41d415d7d2d78a4d3d15e9ece9c4cbf418823c37217498923afa94cb0d3"} Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.067130 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-9svmt" event={"ID":"f24c7226-6207-4858-8369-e1496280d721","Type":"ContainerDied","Data":"5ed1e5018d8c41ac41624d4bd87f984ce55976c1f718baea73d84c1c6b2ae316"} Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.067151 4841 scope.go:117] "RemoveContainer" containerID="a568b41d415d7d2d78a4d3d15e9ece9c4cbf418823c37217498923afa94cb0d3" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.070795 4841 generic.go:334] "Generic (PLEG): container finished" podID="ba25d9f9-4136-49e7-9016-a77627808014" containerID="4d5adf299cfa5d67f73de4fe6cbdc637423680eeefaa47d2efde069d260da422" exitCode=0 Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.071085 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f69977cdb-7mdds" event={"ID":"ba25d9f9-4136-49e7-9016-a77627808014","Type":"ContainerDied","Data":"4d5adf299cfa5d67f73de4fe6cbdc637423680eeefaa47d2efde069d260da422"} Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.071167 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f69977cdb-7mdds" 
event={"ID":"ba25d9f9-4136-49e7-9016-a77627808014","Type":"ContainerDied","Data":"54edfd00249a2d6d550c8cf3999058733f0c01a10af509b856b2573c319cefd7"} Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.071185 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54edfd00249a2d6d550c8cf3999058733f0c01a10af509b856b2573c319cefd7" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.071189 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="268afc1a-8785-48f7-9299-4d47d14f6ad2" containerName="probe" containerID="cri-o://88a82266bb23a00632842b9d58af52def4c796060d9b3bbfd9aebd9d9de63d03" gracePeriod=30 Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.071146 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="268afc1a-8785-48f7-9299-4d47d14f6ad2" containerName="cinder-scheduler" containerID="cri-o://da2f96cba392b550a0f2b4b3b556d5b9d3cdbed09e4963a7a6daf1732346d106" gracePeriod=30 Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.071535 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.138841 4841 scope.go:117] "RemoveContainer" containerID="9eb34aa0106d0f8b6c5ef4b6d9becaaaf41a863a9f963b82bde7b49296235814" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.178046 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba25d9f9-4136-49e7-9016-a77627808014-config-data-custom\") pod \"ba25d9f9-4136-49e7-9016-a77627808014\" (UID: \"ba25d9f9-4136-49e7-9016-a77627808014\") " Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.179039 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba25d9f9-4136-49e7-9016-a77627808014-logs\") pod \"ba25d9f9-4136-49e7-9016-a77627808014\" (UID: \"ba25d9f9-4136-49e7-9016-a77627808014\") " Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.178370 4841 scope.go:117] "RemoveContainer" containerID="a568b41d415d7d2d78a4d3d15e9ece9c4cbf418823c37217498923afa94cb0d3" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.179070 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9g84\" (UniqueName: \"kubernetes.io/projected/ba25d9f9-4136-49e7-9016-a77627808014-kube-api-access-m9g84\") pod \"ba25d9f9-4136-49e7-9016-a77627808014\" (UID: \"ba25d9f9-4136-49e7-9016-a77627808014\") " Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.179178 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msmf4\" (UniqueName: \"kubernetes.io/projected/f24c7226-6207-4858-8369-e1496280d721-kube-api-access-msmf4\") pod \"f24c7226-6207-4858-8369-e1496280d721\" (UID: \"f24c7226-6207-4858-8369-e1496280d721\") " Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.179252 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-dns-svc\") pod \"f24c7226-6207-4858-8369-e1496280d721\" (UID: \"f24c7226-6207-4858-8369-e1496280d721\") " Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.179409 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-ovsdbserver-sb\") pod \"f24c7226-6207-4858-8369-e1496280d721\" (UID: \"f24c7226-6207-4858-8369-e1496280d721\") " Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.179501 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba25d9f9-4136-49e7-9016-a77627808014-combined-ca-bundle\") pod \"ba25d9f9-4136-49e7-9016-a77627808014\" (UID: \"ba25d9f9-4136-49e7-9016-a77627808014\") " Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.179512 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba25d9f9-4136-49e7-9016-a77627808014-logs" (OuterVolumeSpecName: "logs") pod "ba25d9f9-4136-49e7-9016-a77627808014" (UID: "ba25d9f9-4136-49e7-9016-a77627808014"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.179524 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-dns-swift-storage-0\") pod \"f24c7226-6207-4858-8369-e1496280d721\" (UID: \"f24c7226-6207-4858-8369-e1496280d721\") " Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.179594 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba25d9f9-4136-49e7-9016-a77627808014-config-data\") pod \"ba25d9f9-4136-49e7-9016-a77627808014\" (UID: \"ba25d9f9-4136-49e7-9016-a77627808014\") " Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.179624 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-ovsdbserver-nb\") pod \"f24c7226-6207-4858-8369-e1496280d721\" (UID: \"f24c7226-6207-4858-8369-e1496280d721\") " Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.179691 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-config\") pod \"f24c7226-6207-4858-8369-e1496280d721\" (UID: \"f24c7226-6207-4858-8369-e1496280d721\") " Mar 13 09:32:33 crc kubenswrapper[4841]: E0313 09:32:33.179765 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a568b41d415d7d2d78a4d3d15e9ece9c4cbf418823c37217498923afa94cb0d3\": container with ID starting with a568b41d415d7d2d78a4d3d15e9ece9c4cbf418823c37217498923afa94cb0d3 not found: ID does not exist" containerID="a568b41d415d7d2d78a4d3d15e9ece9c4cbf418823c37217498923afa94cb0d3" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.179824 4841 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a568b41d415d7d2d78a4d3d15e9ece9c4cbf418823c37217498923afa94cb0d3"} err="failed to get container status \"a568b41d415d7d2d78a4d3d15e9ece9c4cbf418823c37217498923afa94cb0d3\": rpc error: code = NotFound desc = could not find container \"a568b41d415d7d2d78a4d3d15e9ece9c4cbf418823c37217498923afa94cb0d3\": container with ID starting with a568b41d415d7d2d78a4d3d15e9ece9c4cbf418823c37217498923afa94cb0d3 not found: ID does not exist" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.179854 4841 scope.go:117] "RemoveContainer" containerID="9eb34aa0106d0f8b6c5ef4b6d9becaaaf41a863a9f963b82bde7b49296235814" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.180598 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba25d9f9-4136-49e7-9016-a77627808014-logs\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:33 crc kubenswrapper[4841]: E0313 09:32:33.182091 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9eb34aa0106d0f8b6c5ef4b6d9becaaaf41a863a9f963b82bde7b49296235814\": container with ID starting with 9eb34aa0106d0f8b6c5ef4b6d9becaaaf41a863a9f963b82bde7b49296235814 not found: ID does not exist" containerID="9eb34aa0106d0f8b6c5ef4b6d9becaaaf41a863a9f963b82bde7b49296235814" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.182128 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eb34aa0106d0f8b6c5ef4b6d9becaaaf41a863a9f963b82bde7b49296235814"} err="failed to get container status \"9eb34aa0106d0f8b6c5ef4b6d9becaaaf41a863a9f963b82bde7b49296235814\": rpc error: code = NotFound desc = could not find container \"9eb34aa0106d0f8b6c5ef4b6d9becaaaf41a863a9f963b82bde7b49296235814\": container with ID starting with 9eb34aa0106d0f8b6c5ef4b6d9becaaaf41a863a9f963b82bde7b49296235814 not found: ID does 
not exist" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.184373 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba25d9f9-4136-49e7-9016-a77627808014-kube-api-access-m9g84" (OuterVolumeSpecName: "kube-api-access-m9g84") pod "ba25d9f9-4136-49e7-9016-a77627808014" (UID: "ba25d9f9-4136-49e7-9016-a77627808014"). InnerVolumeSpecName "kube-api-access-m9g84". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.187440 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f24c7226-6207-4858-8369-e1496280d721-kube-api-access-msmf4" (OuterVolumeSpecName: "kube-api-access-msmf4") pod "f24c7226-6207-4858-8369-e1496280d721" (UID: "f24c7226-6207-4858-8369-e1496280d721"). InnerVolumeSpecName "kube-api-access-msmf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.189134 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba25d9f9-4136-49e7-9016-a77627808014-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ba25d9f9-4136-49e7-9016-a77627808014" (UID: "ba25d9f9-4136-49e7-9016-a77627808014"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.260960 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba25d9f9-4136-49e7-9016-a77627808014-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba25d9f9-4136-49e7-9016-a77627808014" (UID: "ba25d9f9-4136-49e7-9016-a77627808014"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.264422 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba25d9f9-4136-49e7-9016-a77627808014-config-data" (OuterVolumeSpecName: "config-data") pod "ba25d9f9-4136-49e7-9016-a77627808014" (UID: "ba25d9f9-4136-49e7-9016-a77627808014"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.265983 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f24c7226-6207-4858-8369-e1496280d721" (UID: "f24c7226-6207-4858-8369-e1496280d721"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.268835 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f24c7226-6207-4858-8369-e1496280d721" (UID: "f24c7226-6207-4858-8369-e1496280d721"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.273236 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-794cb978db-w646s"] Mar 13 09:32:33 crc kubenswrapper[4841]: E0313 09:32:33.273625 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba25d9f9-4136-49e7-9016-a77627808014" containerName="barbican-api" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.273641 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba25d9f9-4136-49e7-9016-a77627808014" containerName="barbican-api" Mar 13 09:32:33 crc kubenswrapper[4841]: E0313 09:32:33.273653 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0f5823-2dff-4614-974e-7ebdc083a570" containerName="neutron-httpd" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.273659 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0f5823-2dff-4614-974e-7ebdc083a570" containerName="neutron-httpd" Mar 13 09:32:33 crc kubenswrapper[4841]: E0313 09:32:33.273666 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f24c7226-6207-4858-8369-e1496280d721" containerName="dnsmasq-dns" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.273672 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f24c7226-6207-4858-8369-e1496280d721" containerName="dnsmasq-dns" Mar 13 09:32:33 crc kubenswrapper[4841]: E0313 09:32:33.273685 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0f5823-2dff-4614-974e-7ebdc083a570" containerName="neutron-api" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.273691 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0f5823-2dff-4614-974e-7ebdc083a570" containerName="neutron-api" Mar 13 09:32:33 crc kubenswrapper[4841]: E0313 09:32:33.273704 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f24c7226-6207-4858-8369-e1496280d721" containerName="init" Mar 13 09:32:33 crc 
kubenswrapper[4841]: I0313 09:32:33.273710 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f24c7226-6207-4858-8369-e1496280d721" containerName="init" Mar 13 09:32:33 crc kubenswrapper[4841]: E0313 09:32:33.273722 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba25d9f9-4136-49e7-9016-a77627808014" containerName="barbican-api-log" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.273728 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba25d9f9-4136-49e7-9016-a77627808014" containerName="barbican-api-log" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.273889 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba25d9f9-4136-49e7-9016-a77627808014" containerName="barbican-api-log" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.273904 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba25d9f9-4136-49e7-9016-a77627808014" containerName="barbican-api" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.273919 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f24c7226-6207-4858-8369-e1496280d721" containerName="dnsmasq-dns" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.273936 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a0f5823-2dff-4614-974e-7ebdc083a570" containerName="neutron-api" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.273948 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a0f5823-2dff-4614-974e-7ebdc083a570" containerName="neutron-httpd" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.278323 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.280924 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-config" (OuterVolumeSpecName: "config") pod "f24c7226-6207-4858-8369-e1496280d721" (UID: "f24c7226-6207-4858-8369-e1496280d721"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.282011 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msmf4\" (UniqueName: \"kubernetes.io/projected/f24c7226-6207-4858-8369-e1496280d721-kube-api-access-msmf4\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.282028 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.282039 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.282048 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba25d9f9-4136-49e7-9016-a77627808014-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.282056 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba25d9f9-4136-49e7-9016-a77627808014-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.282064 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.282072 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba25d9f9-4136-49e7-9016-a77627808014-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.282080 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9g84\" (UniqueName: \"kubernetes.io/projected/ba25d9f9-4136-49e7-9016-a77627808014-kube-api-access-m9g84\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.283095 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f24c7226-6207-4858-8369-e1496280d721" (UID: "f24c7226-6207-4858-8369-e1496280d721"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.290518 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f24c7226-6207-4858-8369-e1496280d721" (UID: "f24c7226-6207-4858-8369-e1496280d721"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.305757 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-794cb978db-w646s"] Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.383375 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e4f623-6788-4651-95dd-d4fdab2d2b37-combined-ca-bundle\") pod \"placement-794cb978db-w646s\" (UID: \"b4e4f623-6788-4651-95dd-d4fdab2d2b37\") " pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.383544 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e4f623-6788-4651-95dd-d4fdab2d2b37-public-tls-certs\") pod \"placement-794cb978db-w646s\" (UID: \"b4e4f623-6788-4651-95dd-d4fdab2d2b37\") " pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.383582 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e4f623-6788-4651-95dd-d4fdab2d2b37-internal-tls-certs\") pod \"placement-794cb978db-w646s\" (UID: \"b4e4f623-6788-4651-95dd-d4fdab2d2b37\") " pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.383644 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkjsc\" (UniqueName: \"kubernetes.io/projected/b4e4f623-6788-4651-95dd-d4fdab2d2b37-kube-api-access-gkjsc\") pod \"placement-794cb978db-w646s\" (UID: \"b4e4f623-6788-4651-95dd-d4fdab2d2b37\") " pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.383698 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4e4f623-6788-4651-95dd-d4fdab2d2b37-logs\") pod \"placement-794cb978db-w646s\" (UID: \"b4e4f623-6788-4651-95dd-d4fdab2d2b37\") " pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.383716 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4e4f623-6788-4651-95dd-d4fdab2d2b37-config-data\") pod \"placement-794cb978db-w646s\" (UID: \"b4e4f623-6788-4651-95dd-d4fdab2d2b37\") " pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.383766 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4e4f623-6788-4651-95dd-d4fdab2d2b37-scripts\") pod \"placement-794cb978db-w646s\" (UID: \"b4e4f623-6788-4651-95dd-d4fdab2d2b37\") " pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.383838 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.383887 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f24c7226-6207-4858-8369-e1496280d721-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.485932 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e4f623-6788-4651-95dd-d4fdab2d2b37-public-tls-certs\") pod \"placement-794cb978db-w646s\" (UID: \"b4e4f623-6788-4651-95dd-d4fdab2d2b37\") " 
pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.486997 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e4f623-6788-4651-95dd-d4fdab2d2b37-internal-tls-certs\") pod \"placement-794cb978db-w646s\" (UID: \"b4e4f623-6788-4651-95dd-d4fdab2d2b37\") " pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.487680 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkjsc\" (UniqueName: \"kubernetes.io/projected/b4e4f623-6788-4651-95dd-d4fdab2d2b37-kube-api-access-gkjsc\") pod \"placement-794cb978db-w646s\" (UID: \"b4e4f623-6788-4651-95dd-d4fdab2d2b37\") " pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.488519 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4e4f623-6788-4651-95dd-d4fdab2d2b37-logs\") pod \"placement-794cb978db-w646s\" (UID: \"b4e4f623-6788-4651-95dd-d4fdab2d2b37\") " pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.489131 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4e4f623-6788-4651-95dd-d4fdab2d2b37-logs\") pod \"placement-794cb978db-w646s\" (UID: \"b4e4f623-6788-4651-95dd-d4fdab2d2b37\") " pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.489863 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4e4f623-6788-4651-95dd-d4fdab2d2b37-config-data\") pod \"placement-794cb978db-w646s\" (UID: \"b4e4f623-6788-4651-95dd-d4fdab2d2b37\") " pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.490600 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4e4f623-6788-4651-95dd-d4fdab2d2b37-scripts\") pod \"placement-794cb978db-w646s\" (UID: \"b4e4f623-6788-4651-95dd-d4fdab2d2b37\") " pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.490625 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e4f623-6788-4651-95dd-d4fdab2d2b37-internal-tls-certs\") pod \"placement-794cb978db-w646s\" (UID: \"b4e4f623-6788-4651-95dd-d4fdab2d2b37\") " pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.490730 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e4f623-6788-4651-95dd-d4fdab2d2b37-combined-ca-bundle\") pod \"placement-794cb978db-w646s\" (UID: \"b4e4f623-6788-4651-95dd-d4fdab2d2b37\") " pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.494988 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4e4f623-6788-4651-95dd-d4fdab2d2b37-scripts\") pod \"placement-794cb978db-w646s\" (UID: \"b4e4f623-6788-4651-95dd-d4fdab2d2b37\") " pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.495124 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4e4f623-6788-4651-95dd-d4fdab2d2b37-config-data\") pod \"placement-794cb978db-w646s\" (UID: \"b4e4f623-6788-4651-95dd-d4fdab2d2b37\") " pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.496663 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b4e4f623-6788-4651-95dd-d4fdab2d2b37-combined-ca-bundle\") pod \"placement-794cb978db-w646s\" (UID: \"b4e4f623-6788-4651-95dd-d4fdab2d2b37\") " pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.497084 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e4f623-6788-4651-95dd-d4fdab2d2b37-public-tls-certs\") pod \"placement-794cb978db-w646s\" (UID: \"b4e4f623-6788-4651-95dd-d4fdab2d2b37\") " pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.512311 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkjsc\" (UniqueName: \"kubernetes.io/projected/b4e4f623-6788-4651-95dd-d4fdab2d2b37-kube-api-access-gkjsc\") pod \"placement-794cb978db-w646s\" (UID: \"b4e4f623-6788-4651-95dd-d4fdab2d2b37\") " pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:33 crc kubenswrapper[4841]: I0313 09:32:33.603204 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:34 crc kubenswrapper[4841]: I0313 09:32:34.081569 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-9svmt" Mar 13 09:32:34 crc kubenswrapper[4841]: I0313 09:32:34.083890 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-794cb978db-w646s"] Mar 13 09:32:34 crc kubenswrapper[4841]: I0313 09:32:34.085620 4841 generic.go:334] "Generic (PLEG): container finished" podID="268afc1a-8785-48f7-9299-4d47d14f6ad2" containerID="88a82266bb23a00632842b9d58af52def4c796060d9b3bbfd9aebd9d9de63d03" exitCode=0 Mar 13 09:32:34 crc kubenswrapper[4841]: I0313 09:32:34.085648 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"268afc1a-8785-48f7-9299-4d47d14f6ad2","Type":"ContainerDied","Data":"88a82266bb23a00632842b9d58af52def4c796060d9b3bbfd9aebd9d9de63d03"} Mar 13 09:32:34 crc kubenswrapper[4841]: I0313 09:32:34.085748 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f69977cdb-7mdds" Mar 13 09:32:34 crc kubenswrapper[4841]: W0313 09:32:34.088623 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4e4f623_6788_4651_95dd_d4fdab2d2b37.slice/crio-ffc7601ce33dab31e56ef1bca3be131f8b240d1f529ded14e0aa8b0bae28e486 WatchSource:0}: Error finding container ffc7601ce33dab31e56ef1bca3be131f8b240d1f529ded14e0aa8b0bae28e486: Status 404 returned error can't find the container with id ffc7601ce33dab31e56ef1bca3be131f8b240d1f529ded14e0aa8b0bae28e486 Mar 13 09:32:34 crc kubenswrapper[4841]: I0313 09:32:34.112126 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-9svmt"] Mar 13 09:32:34 crc kubenswrapper[4841]: I0313 09:32:34.119999 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-9svmt"] Mar 13 09:32:34 crc kubenswrapper[4841]: I0313 09:32:34.185824 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-api-6f69977cdb-7mdds"] Mar 13 09:32:34 crc kubenswrapper[4841]: I0313 09:32:34.194692 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6f69977cdb-7mdds"] Mar 13 09:32:34 crc kubenswrapper[4841]: I0313 09:32:34.407716 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:32:34 crc kubenswrapper[4841]: I0313 09:32:34.407780 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:32:34 crc kubenswrapper[4841]: I0313 09:32:34.407824 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:32:34 crc kubenswrapper[4841]: I0313 09:32:34.408567 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8dc018ca0f90a95ed2ddf32ba76ace2f8d1b621b17b9ee14fcc045e6a5af19f7"} pod="openshift-machine-config-operator/machine-config-daemon-h227v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 09:32:34 crc kubenswrapper[4841]: I0313 09:32:34.408623 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" containerID="cri-o://8dc018ca0f90a95ed2ddf32ba76ace2f8d1b621b17b9ee14fcc045e6a5af19f7" gracePeriod=600 Mar 
13 09:32:34 crc kubenswrapper[4841]: E0313 09:32:34.523293 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode49b836b_f6cf_4cee_b1be_6bd7864fb7f2.slice/crio-conmon-8dc018ca0f90a95ed2ddf32ba76ace2f8d1b621b17b9ee14fcc045e6a5af19f7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode49b836b_f6cf_4cee_b1be_6bd7864fb7f2.slice/crio-8dc018ca0f90a95ed2ddf32ba76ace2f8d1b621b17b9ee14fcc045e6a5af19f7.scope\": RecentStats: unable to find data in memory cache]" Mar 13 09:32:35 crc kubenswrapper[4841]: I0313 09:32:35.097097 4841 generic.go:334] "Generic (PLEG): container finished" podID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerID="8dc018ca0f90a95ed2ddf32ba76ace2f8d1b621b17b9ee14fcc045e6a5af19f7" exitCode=0 Mar 13 09:32:35 crc kubenswrapper[4841]: I0313 09:32:35.097186 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerDied","Data":"8dc018ca0f90a95ed2ddf32ba76ace2f8d1b621b17b9ee14fcc045e6a5af19f7"} Mar 13 09:32:35 crc kubenswrapper[4841]: I0313 09:32:35.097452 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerStarted","Data":"775ebab3cf7b982d36c777cc0cdaea2069ca71dd3ee3f41b99a1b2505417aae0"} Mar 13 09:32:35 crc kubenswrapper[4841]: I0313 09:32:35.097478 4841 scope.go:117] "RemoveContainer" containerID="6491cd972e8f18473231b5b2215720345c96ab2a0337886960a5d983df3b0e59" Mar 13 09:32:35 crc kubenswrapper[4841]: I0313 09:32:35.101209 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-794cb978db-w646s" 
event={"ID":"b4e4f623-6788-4651-95dd-d4fdab2d2b37","Type":"ContainerStarted","Data":"f426f9fbd6653f81a7fd2a4c3831ae2c3050ba2a0422cb7394515684dec21b13"} Mar 13 09:32:35 crc kubenswrapper[4841]: I0313 09:32:35.101244 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-794cb978db-w646s" event={"ID":"b4e4f623-6788-4651-95dd-d4fdab2d2b37","Type":"ContainerStarted","Data":"92660fe2fc89eec691dfbc4c71534f8163caa85fc4622d90f2a0c01dc6ba0659"} Mar 13 09:32:35 crc kubenswrapper[4841]: I0313 09:32:35.101256 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-794cb978db-w646s" event={"ID":"b4e4f623-6788-4651-95dd-d4fdab2d2b37","Type":"ContainerStarted","Data":"ffc7601ce33dab31e56ef1bca3be131f8b240d1f529ded14e0aa8b0bae28e486"} Mar 13 09:32:35 crc kubenswrapper[4841]: I0313 09:32:35.101396 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:35 crc kubenswrapper[4841]: I0313 09:32:35.148208 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-794cb978db-w646s" podStartSLOduration=2.148186684 podStartE2EDuration="2.148186684s" podCreationTimestamp="2026-03-13 09:32:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:32:35.143207609 +0000 UTC m=+1237.873107800" watchObservedRunningTime="2026-03-13 09:32:35.148186684 +0000 UTC m=+1237.878086885" Mar 13 09:32:36 crc kubenswrapper[4841]: I0313 09:32:36.009024 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba25d9f9-4136-49e7-9016-a77627808014" path="/var/lib/kubelet/pods/ba25d9f9-4136-49e7-9016-a77627808014/volumes" Mar 13 09:32:36 crc kubenswrapper[4841]: I0313 09:32:36.010594 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f24c7226-6207-4858-8369-e1496280d721" 
path="/var/lib/kubelet/pods/f24c7226-6207-4858-8369-e1496280d721/volumes" Mar 13 09:32:36 crc kubenswrapper[4841]: I0313 09:32:36.113568 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-794cb978db-w646s" Mar 13 09:32:36 crc kubenswrapper[4841]: I0313 09:32:36.602682 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7b87d5dbb8-7ppv5" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.046910 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.129114 4841 generic.go:334] "Generic (PLEG): container finished" podID="268afc1a-8785-48f7-9299-4d47d14f6ad2" containerID="da2f96cba392b550a0f2b4b3b556d5b9d3cdbed09e4963a7a6daf1732346d106" exitCode=0 Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.129199 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.129211 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"268afc1a-8785-48f7-9299-4d47d14f6ad2","Type":"ContainerDied","Data":"da2f96cba392b550a0f2b4b3b556d5b9d3cdbed09e4963a7a6daf1732346d106"} Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.129274 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"268afc1a-8785-48f7-9299-4d47d14f6ad2","Type":"ContainerDied","Data":"85e3ab1a2cf9bda2c482e4e0e40c2cbcb50c2b2ea68eb49b9c07f529bdd337d8"} Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.129294 4841 scope.go:117] "RemoveContainer" containerID="88a82266bb23a00632842b9d58af52def4c796060d9b3bbfd9aebd9d9de63d03" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.163234 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkw5l\" 
(UniqueName: \"kubernetes.io/projected/268afc1a-8785-48f7-9299-4d47d14f6ad2-kube-api-access-vkw5l\") pod \"268afc1a-8785-48f7-9299-4d47d14f6ad2\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.163477 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-config-data-custom\") pod \"268afc1a-8785-48f7-9299-4d47d14f6ad2\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.164853 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-scripts\") pod \"268afc1a-8785-48f7-9299-4d47d14f6ad2\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.165061 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-config-data\") pod \"268afc1a-8785-48f7-9299-4d47d14f6ad2\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.165094 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-combined-ca-bundle\") pod \"268afc1a-8785-48f7-9299-4d47d14f6ad2\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.165161 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/268afc1a-8785-48f7-9299-4d47d14f6ad2-etc-machine-id\") pod \"268afc1a-8785-48f7-9299-4d47d14f6ad2\" (UID: \"268afc1a-8785-48f7-9299-4d47d14f6ad2\") " Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 
09:32:37.165413 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/268afc1a-8785-48f7-9299-4d47d14f6ad2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "268afc1a-8785-48f7-9299-4d47d14f6ad2" (UID: "268afc1a-8785-48f7-9299-4d47d14f6ad2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.166201 4841 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/268afc1a-8785-48f7-9299-4d47d14f6ad2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.168405 4841 scope.go:117] "RemoveContainer" containerID="da2f96cba392b550a0f2b4b3b556d5b9d3cdbed09e4963a7a6daf1732346d106" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.169417 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-scripts" (OuterVolumeSpecName: "scripts") pod "268afc1a-8785-48f7-9299-4d47d14f6ad2" (UID: "268afc1a-8785-48f7-9299-4d47d14f6ad2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.174250 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "268afc1a-8785-48f7-9299-4d47d14f6ad2" (UID: "268afc1a-8785-48f7-9299-4d47d14f6ad2"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.176494 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268afc1a-8785-48f7-9299-4d47d14f6ad2-kube-api-access-vkw5l" (OuterVolumeSpecName: "kube-api-access-vkw5l") pod "268afc1a-8785-48f7-9299-4d47d14f6ad2" (UID: "268afc1a-8785-48f7-9299-4d47d14f6ad2"). InnerVolumeSpecName "kube-api-access-vkw5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.221328 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "268afc1a-8785-48f7-9299-4d47d14f6ad2" (UID: "268afc1a-8785-48f7-9299-4d47d14f6ad2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.267395 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.267434 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkw5l\" (UniqueName: \"kubernetes.io/projected/268afc1a-8785-48f7-9299-4d47d14f6ad2-kube-api-access-vkw5l\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.267454 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.267466 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.282386 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-config-data" (OuterVolumeSpecName: "config-data") pod "268afc1a-8785-48f7-9299-4d47d14f6ad2" (UID: "268afc1a-8785-48f7-9299-4d47d14f6ad2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.295489 4841 scope.go:117] "RemoveContainer" containerID="88a82266bb23a00632842b9d58af52def4c796060d9b3bbfd9aebd9d9de63d03" Mar 13 09:32:37 crc kubenswrapper[4841]: E0313 09:32:37.296248 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88a82266bb23a00632842b9d58af52def4c796060d9b3bbfd9aebd9d9de63d03\": container with ID starting with 88a82266bb23a00632842b9d58af52def4c796060d9b3bbfd9aebd9d9de63d03 not found: ID does not exist" containerID="88a82266bb23a00632842b9d58af52def4c796060d9b3bbfd9aebd9d9de63d03" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.296311 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88a82266bb23a00632842b9d58af52def4c796060d9b3bbfd9aebd9d9de63d03"} err="failed to get container status \"88a82266bb23a00632842b9d58af52def4c796060d9b3bbfd9aebd9d9de63d03\": rpc error: code = NotFound desc = could not find container \"88a82266bb23a00632842b9d58af52def4c796060d9b3bbfd9aebd9d9de63d03\": container with ID starting with 88a82266bb23a00632842b9d58af52def4c796060d9b3bbfd9aebd9d9de63d03 not found: ID does not exist" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.296344 4841 scope.go:117] "RemoveContainer" containerID="da2f96cba392b550a0f2b4b3b556d5b9d3cdbed09e4963a7a6daf1732346d106" Mar 13 09:32:37 crc kubenswrapper[4841]: E0313 
09:32:37.306934 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da2f96cba392b550a0f2b4b3b556d5b9d3cdbed09e4963a7a6daf1732346d106\": container with ID starting with da2f96cba392b550a0f2b4b3b556d5b9d3cdbed09e4963a7a6daf1732346d106 not found: ID does not exist" containerID="da2f96cba392b550a0f2b4b3b556d5b9d3cdbed09e4963a7a6daf1732346d106" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.306974 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da2f96cba392b550a0f2b4b3b556d5b9d3cdbed09e4963a7a6daf1732346d106"} err="failed to get container status \"da2f96cba392b550a0f2b4b3b556d5b9d3cdbed09e4963a7a6daf1732346d106\": rpc error: code = NotFound desc = could not find container \"da2f96cba392b550a0f2b4b3b556d5b9d3cdbed09e4963a7a6daf1732346d106\": container with ID starting with da2f96cba392b550a0f2b4b3b556d5b9d3cdbed09e4963a7a6daf1732346d106 not found: ID does not exist" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.369191 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/268afc1a-8785-48f7-9299-4d47d14f6ad2-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.481451 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.492895 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.502090 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 09:32:37 crc kubenswrapper[4841]: E0313 09:32:37.502560 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="268afc1a-8785-48f7-9299-4d47d14f6ad2" containerName="probe" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.502577 4841 
state_mem.go:107] "Deleted CPUSet assignment" podUID="268afc1a-8785-48f7-9299-4d47d14f6ad2" containerName="probe" Mar 13 09:32:37 crc kubenswrapper[4841]: E0313 09:32:37.502623 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="268afc1a-8785-48f7-9299-4d47d14f6ad2" containerName="cinder-scheduler" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.502630 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="268afc1a-8785-48f7-9299-4d47d14f6ad2" containerName="cinder-scheduler" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.502866 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="268afc1a-8785-48f7-9299-4d47d14f6ad2" containerName="cinder-scheduler" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.502879 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="268afc1a-8785-48f7-9299-4d47d14f6ad2" containerName="probe" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.505076 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.508779 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.513589 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.675052 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626bc701-ba99-4de7-a2f9-b42eb150a783-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"626bc701-ba99-4de7-a2f9-b42eb150a783\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.675097 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626bc701-ba99-4de7-a2f9-b42eb150a783-config-data\") pod \"cinder-scheduler-0\" (UID: \"626bc701-ba99-4de7-a2f9-b42eb150a783\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.675119 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/626bc701-ba99-4de7-a2f9-b42eb150a783-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"626bc701-ba99-4de7-a2f9-b42eb150a783\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.675171 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/626bc701-ba99-4de7-a2f9-b42eb150a783-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"626bc701-ba99-4de7-a2f9-b42eb150a783\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 
09:32:37.675192 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc27j\" (UniqueName: \"kubernetes.io/projected/626bc701-ba99-4de7-a2f9-b42eb150a783-kube-api-access-bc27j\") pod \"cinder-scheduler-0\" (UID: \"626bc701-ba99-4de7-a2f9-b42eb150a783\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.675231 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/626bc701-ba99-4de7-a2f9-b42eb150a783-scripts\") pod \"cinder-scheduler-0\" (UID: \"626bc701-ba99-4de7-a2f9-b42eb150a783\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.776797 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626bc701-ba99-4de7-a2f9-b42eb150a783-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"626bc701-ba99-4de7-a2f9-b42eb150a783\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.776842 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626bc701-ba99-4de7-a2f9-b42eb150a783-config-data\") pod \"cinder-scheduler-0\" (UID: \"626bc701-ba99-4de7-a2f9-b42eb150a783\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.776867 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/626bc701-ba99-4de7-a2f9-b42eb150a783-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"626bc701-ba99-4de7-a2f9-b42eb150a783\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.776911 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/626bc701-ba99-4de7-a2f9-b42eb150a783-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"626bc701-ba99-4de7-a2f9-b42eb150a783\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.776929 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc27j\" (UniqueName: \"kubernetes.io/projected/626bc701-ba99-4de7-a2f9-b42eb150a783-kube-api-access-bc27j\") pod \"cinder-scheduler-0\" (UID: \"626bc701-ba99-4de7-a2f9-b42eb150a783\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.776964 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/626bc701-ba99-4de7-a2f9-b42eb150a783-scripts\") pod \"cinder-scheduler-0\" (UID: \"626bc701-ba99-4de7-a2f9-b42eb150a783\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.777624 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/626bc701-ba99-4de7-a2f9-b42eb150a783-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"626bc701-ba99-4de7-a2f9-b42eb150a783\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.780978 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626bc701-ba99-4de7-a2f9-b42eb150a783-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"626bc701-ba99-4de7-a2f9-b42eb150a783\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.781009 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/626bc701-ba99-4de7-a2f9-b42eb150a783-scripts\") pod \"cinder-scheduler-0\" (UID: \"626bc701-ba99-4de7-a2f9-b42eb150a783\") " 
pod="openstack/cinder-scheduler-0" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.781691 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/626bc701-ba99-4de7-a2f9-b42eb150a783-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"626bc701-ba99-4de7-a2f9-b42eb150a783\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.781821 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626bc701-ba99-4de7-a2f9-b42eb150a783-config-data\") pod \"cinder-scheduler-0\" (UID: \"626bc701-ba99-4de7-a2f9-b42eb150a783\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.795950 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc27j\" (UniqueName: \"kubernetes.io/projected/626bc701-ba99-4de7-a2f9-b42eb150a783-kube-api-access-bc27j\") pod \"cinder-scheduler-0\" (UID: \"626bc701-ba99-4de7-a2f9-b42eb150a783\") " pod="openstack/cinder-scheduler-0" Mar 13 09:32:37 crc kubenswrapper[4841]: I0313 09:32:37.823513 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 09:32:38 crc kubenswrapper[4841]: I0313 09:32:38.014160 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="268afc1a-8785-48f7-9299-4d47d14f6ad2" path="/var/lib/kubelet/pods/268afc1a-8785-48f7-9299-4d47d14f6ad2/volumes" Mar 13 09:32:38 crc kubenswrapper[4841]: I0313 09:32:38.265142 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 09:32:38 crc kubenswrapper[4841]: W0313 09:32:38.277657 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod626bc701_ba99_4de7_a2f9_b42eb150a783.slice/crio-a50502a62fcc613ae289fba0f7b892171e250f250fedef6bd654cffd2c91657f WatchSource:0}: Error finding container a50502a62fcc613ae289fba0f7b892171e250f250fedef6bd654cffd2c91657f: Status 404 returned error can't find the container with id a50502a62fcc613ae289fba0f7b892171e250f250fedef6bd654cffd2c91657f Mar 13 09:32:39 crc kubenswrapper[4841]: I0313 09:32:39.177994 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"626bc701-ba99-4de7-a2f9-b42eb150a783","Type":"ContainerStarted","Data":"1b21c58a40e7c560dcdac816d439ac3956bc2be488329aaace3a6eb0722fdd2d"} Mar 13 09:32:39 crc kubenswrapper[4841]: I0313 09:32:39.178528 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"626bc701-ba99-4de7-a2f9-b42eb150a783","Type":"ContainerStarted","Data":"a50502a62fcc613ae289fba0f7b892171e250f250fedef6bd654cffd2c91657f"} Mar 13 09:32:40 crc kubenswrapper[4841]: I0313 09:32:40.187687 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"626bc701-ba99-4de7-a2f9-b42eb150a783","Type":"ContainerStarted","Data":"ad05df05e011c1f59ff1864a54c9e0df774760a3d2d56f6f3017bcca6d13c480"} Mar 13 09:32:40 crc kubenswrapper[4841]: I0313 09:32:40.207392 4841 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.207371664 podStartE2EDuration="3.207371664s" podCreationTimestamp="2026-03-13 09:32:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:32:40.204061121 +0000 UTC m=+1242.933961312" watchObservedRunningTime="2026-03-13 09:32:40.207371664 +0000 UTC m=+1242.937271855" Mar 13 09:32:40 crc kubenswrapper[4841]: I0313 09:32:40.495724 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 13 09:32:40 crc kubenswrapper[4841]: I0313 09:32:40.905857 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 13 09:32:40 crc kubenswrapper[4841]: I0313 09:32:40.907518 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 09:32:40 crc kubenswrapper[4841]: I0313 09:32:40.909575 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 13 09:32:40 crc kubenswrapper[4841]: I0313 09:32:40.910958 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 13 09:32:40 crc kubenswrapper[4841]: I0313 09:32:40.911321 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-7zhzz" Mar 13 09:32:40 crc kubenswrapper[4841]: I0313 09:32:40.924926 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.042692 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrdsw\" (UniqueName: \"kubernetes.io/projected/df3b0fd2-0003-42f1-b746-72231cfad7a0-kube-api-access-xrdsw\") pod \"openstackclient\" (UID: 
\"df3b0fd2-0003-42f1-b746-72231cfad7a0\") " pod="openstack/openstackclient" Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.042788 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3b0fd2-0003-42f1-b746-72231cfad7a0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"df3b0fd2-0003-42f1-b746-72231cfad7a0\") " pod="openstack/openstackclient" Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.042807 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/df3b0fd2-0003-42f1-b746-72231cfad7a0-openstack-config\") pod \"openstackclient\" (UID: \"df3b0fd2-0003-42f1-b746-72231cfad7a0\") " pod="openstack/openstackclient" Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.042882 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/df3b0fd2-0003-42f1-b746-72231cfad7a0-openstack-config-secret\") pod \"openstackclient\" (UID: \"df3b0fd2-0003-42f1-b746-72231cfad7a0\") " pod="openstack/openstackclient" Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.050968 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.051513 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="212126dc-6eaf-498b-b5db-2f24ed74e6a0" containerName="ceilometer-central-agent" containerID="cri-o://4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3" gracePeriod=30 Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.051616 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="212126dc-6eaf-498b-b5db-2f24ed74e6a0" 
containerName="ceilometer-notification-agent" containerID="cri-o://11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7" gracePeriod=30 Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.051615 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="212126dc-6eaf-498b-b5db-2f24ed74e6a0" containerName="proxy-httpd" containerID="cri-o://4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d" gracePeriod=30 Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.051621 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="212126dc-6eaf-498b-b5db-2f24ed74e6a0" containerName="sg-core" containerID="cri-o://9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e" gracePeriod=30 Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.066662 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="212126dc-6eaf-498b-b5db-2f24ed74e6a0" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.166:3000/\": EOF" Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.144527 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrdsw\" (UniqueName: \"kubernetes.io/projected/df3b0fd2-0003-42f1-b746-72231cfad7a0-kube-api-access-xrdsw\") pod \"openstackclient\" (UID: \"df3b0fd2-0003-42f1-b746-72231cfad7a0\") " pod="openstack/openstackclient" Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.144623 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3b0fd2-0003-42f1-b746-72231cfad7a0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"df3b0fd2-0003-42f1-b746-72231cfad7a0\") " pod="openstack/openstackclient" Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.144642 4841 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/df3b0fd2-0003-42f1-b746-72231cfad7a0-openstack-config\") pod \"openstackclient\" (UID: \"df3b0fd2-0003-42f1-b746-72231cfad7a0\") " pod="openstack/openstackclient" Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.144701 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/df3b0fd2-0003-42f1-b746-72231cfad7a0-openstack-config-secret\") pod \"openstackclient\" (UID: \"df3b0fd2-0003-42f1-b746-72231cfad7a0\") " pod="openstack/openstackclient" Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.145849 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/df3b0fd2-0003-42f1-b746-72231cfad7a0-openstack-config\") pod \"openstackclient\" (UID: \"df3b0fd2-0003-42f1-b746-72231cfad7a0\") " pod="openstack/openstackclient" Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.149689 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/df3b0fd2-0003-42f1-b746-72231cfad7a0-openstack-config-secret\") pod \"openstackclient\" (UID: \"df3b0fd2-0003-42f1-b746-72231cfad7a0\") " pod="openstack/openstackclient" Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.151073 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3b0fd2-0003-42f1-b746-72231cfad7a0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"df3b0fd2-0003-42f1-b746-72231cfad7a0\") " pod="openstack/openstackclient" Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.164488 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrdsw\" (UniqueName: \"kubernetes.io/projected/df3b0fd2-0003-42f1-b746-72231cfad7a0-kube-api-access-xrdsw\") pod \"openstackclient\" 
(UID: \"df3b0fd2-0003-42f1-b746-72231cfad7a0\") " pod="openstack/openstackclient" Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.217414 4841 generic.go:334] "Generic (PLEG): container finished" podID="212126dc-6eaf-498b-b5db-2f24ed74e6a0" containerID="9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e" exitCode=2 Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.217718 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"212126dc-6eaf-498b-b5db-2f24ed74e6a0","Type":"ContainerDied","Data":"9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e"} Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.222812 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 09:32:41 crc kubenswrapper[4841]: W0313 09:32:41.730227 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf3b0fd2_0003_42f1_b746_72231cfad7a0.slice/crio-5c651b40a189421f36bdafcb2c072b348fe0e2235150fc6da68407cf699c9a49 WatchSource:0}: Error finding container 5c651b40a189421f36bdafcb2c072b348fe0e2235150fc6da68407cf699c9a49: Status 404 returned error can't find the container with id 5c651b40a189421f36bdafcb2c072b348fe0e2235150fc6da68407cf699c9a49 Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.732308 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 09:32:41 crc kubenswrapper[4841]: I0313 09:32:41.949632 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.063387 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/212126dc-6eaf-498b-b5db-2f24ed74e6a0-run-httpd\") pod \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.063771 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/212126dc-6eaf-498b-b5db-2f24ed74e6a0-log-httpd\") pod \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.063811 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-combined-ca-bundle\") pod \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.063898 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-sg-core-conf-yaml\") pod \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.063952 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-config-data\") pod \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.063985 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-scripts\") pod \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.064017 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj9p9\" (UniqueName: \"kubernetes.io/projected/212126dc-6eaf-498b-b5db-2f24ed74e6a0-kube-api-access-cj9p9\") pod \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\" (UID: \"212126dc-6eaf-498b-b5db-2f24ed74e6a0\") " Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.064249 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/212126dc-6eaf-498b-b5db-2f24ed74e6a0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "212126dc-6eaf-498b-b5db-2f24ed74e6a0" (UID: "212126dc-6eaf-498b-b5db-2f24ed74e6a0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.064341 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/212126dc-6eaf-498b-b5db-2f24ed74e6a0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "212126dc-6eaf-498b-b5db-2f24ed74e6a0" (UID: "212126dc-6eaf-498b-b5db-2f24ed74e6a0"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.064662 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/212126dc-6eaf-498b-b5db-2f24ed74e6a0-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.064688 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/212126dc-6eaf-498b-b5db-2f24ed74e6a0-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.071215 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/212126dc-6eaf-498b-b5db-2f24ed74e6a0-kube-api-access-cj9p9" (OuterVolumeSpecName: "kube-api-access-cj9p9") pod "212126dc-6eaf-498b-b5db-2f24ed74e6a0" (UID: "212126dc-6eaf-498b-b5db-2f24ed74e6a0"). InnerVolumeSpecName "kube-api-access-cj9p9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.071363 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-scripts" (OuterVolumeSpecName: "scripts") pod "212126dc-6eaf-498b-b5db-2f24ed74e6a0" (UID: "212126dc-6eaf-498b-b5db-2f24ed74e6a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.137430 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "212126dc-6eaf-498b-b5db-2f24ed74e6a0" (UID: "212126dc-6eaf-498b-b5db-2f24ed74e6a0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.147940 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7bbc77c95-pfg84"] Mar 13 09:32:42 crc kubenswrapper[4841]: E0313 09:32:42.148425 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="212126dc-6eaf-498b-b5db-2f24ed74e6a0" containerName="sg-core" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.148447 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="212126dc-6eaf-498b-b5db-2f24ed74e6a0" containerName="sg-core" Mar 13 09:32:42 crc kubenswrapper[4841]: E0313 09:32:42.148461 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="212126dc-6eaf-498b-b5db-2f24ed74e6a0" containerName="ceilometer-central-agent" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.148471 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="212126dc-6eaf-498b-b5db-2f24ed74e6a0" containerName="ceilometer-central-agent" Mar 13 09:32:42 crc kubenswrapper[4841]: E0313 09:32:42.148487 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="212126dc-6eaf-498b-b5db-2f24ed74e6a0" containerName="ceilometer-notification-agent" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.148495 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="212126dc-6eaf-498b-b5db-2f24ed74e6a0" containerName="ceilometer-notification-agent" Mar 13 09:32:42 crc kubenswrapper[4841]: E0313 09:32:42.148513 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="212126dc-6eaf-498b-b5db-2f24ed74e6a0" containerName="proxy-httpd" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.148520 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="212126dc-6eaf-498b-b5db-2f24ed74e6a0" containerName="proxy-httpd" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.148730 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="212126dc-6eaf-498b-b5db-2f24ed74e6a0" 
containerName="proxy-httpd" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.148748 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="212126dc-6eaf-498b-b5db-2f24ed74e6a0" containerName="ceilometer-notification-agent" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.148770 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="212126dc-6eaf-498b-b5db-2f24ed74e6a0" containerName="sg-core" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.148782 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="212126dc-6eaf-498b-b5db-2f24ed74e6a0" containerName="ceilometer-central-agent" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.165363 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7bbc77c95-pfg84"] Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.165482 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.166950 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.166987 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.167005 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj9p9\" (UniqueName: \"kubernetes.io/projected/212126dc-6eaf-498b-b5db-2f24ed74e6a0-kube-api-access-cj9p9\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.174738 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "212126dc-6eaf-498b-b5db-2f24ed74e6a0" (UID: "212126dc-6eaf-498b-b5db-2f24ed74e6a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.175534 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.175775 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.176310 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.227158 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"df3b0fd2-0003-42f1-b746-72231cfad7a0","Type":"ContainerStarted","Data":"5c651b40a189421f36bdafcb2c072b348fe0e2235150fc6da68407cf699c9a49"} Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.229493 4841 generic.go:334] "Generic (PLEG): container finished" podID="212126dc-6eaf-498b-b5db-2f24ed74e6a0" containerID="4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d" exitCode=0 Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.229626 4841 generic.go:334] "Generic (PLEG): container finished" podID="212126dc-6eaf-498b-b5db-2f24ed74e6a0" containerID="11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7" exitCode=0 Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.229702 4841 generic.go:334] "Generic (PLEG): container finished" podID="212126dc-6eaf-498b-b5db-2f24ed74e6a0" containerID="4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3" exitCode=0 Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.229792 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"212126dc-6eaf-498b-b5db-2f24ed74e6a0","Type":"ContainerDied","Data":"4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d"} Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.229882 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"212126dc-6eaf-498b-b5db-2f24ed74e6a0","Type":"ContainerDied","Data":"11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7"} Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.229970 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"212126dc-6eaf-498b-b5db-2f24ed74e6a0","Type":"ContainerDied","Data":"4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3"} Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.230061 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"212126dc-6eaf-498b-b5db-2f24ed74e6a0","Type":"ContainerDied","Data":"97804d476c73068cedf59c0658a157b87f3cac66a66b2a66707de4f82f027b92"} Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.230038 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.229998 4841 scope.go:117] "RemoveContainer" containerID="4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.253312 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-config-data" (OuterVolumeSpecName: "config-data") pod "212126dc-6eaf-498b-b5db-2f24ed74e6a0" (UID: "212126dc-6eaf-498b-b5db-2f24ed74e6a0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.268241 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7862f13-896f-480f-add9-376c2a96fdd7-config-data\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.268428 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7862f13-896f-480f-add9-376c2a96fdd7-internal-tls-certs\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.268533 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4jzl\" (UniqueName: \"kubernetes.io/projected/c7862f13-896f-480f-add9-376c2a96fdd7-kube-api-access-j4jzl\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.268633 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7862f13-896f-480f-add9-376c2a96fdd7-log-httpd\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.268668 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7862f13-896f-480f-add9-376c2a96fdd7-public-tls-certs\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: 
\"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.268687 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7862f13-896f-480f-add9-376c2a96fdd7-run-httpd\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.268717 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c7862f13-896f-480f-add9-376c2a96fdd7-etc-swift\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.268790 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7862f13-896f-480f-add9-376c2a96fdd7-combined-ca-bundle\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.268923 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.268949 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/212126dc-6eaf-498b-b5db-2f24ed74e6a0-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.349222 4841 scope.go:117] "RemoveContainer" 
containerID="9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.371305 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7862f13-896f-480f-add9-376c2a96fdd7-config-data\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.371385 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7862f13-896f-480f-add9-376c2a96fdd7-internal-tls-certs\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.371423 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4jzl\" (UniqueName: \"kubernetes.io/projected/c7862f13-896f-480f-add9-376c2a96fdd7-kube-api-access-j4jzl\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.371463 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7862f13-896f-480f-add9-376c2a96fdd7-log-httpd\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.371482 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7862f13-896f-480f-add9-376c2a96fdd7-public-tls-certs\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " 
pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.371498 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7862f13-896f-480f-add9-376c2a96fdd7-run-httpd\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.371517 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c7862f13-896f-480f-add9-376c2a96fdd7-etc-swift\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.371554 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7862f13-896f-480f-add9-376c2a96fdd7-combined-ca-bundle\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.372716 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7862f13-896f-480f-add9-376c2a96fdd7-log-httpd\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.372791 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7862f13-896f-480f-add9-376c2a96fdd7-run-httpd\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.377475 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7862f13-896f-480f-add9-376c2a96fdd7-combined-ca-bundle\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.377661 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7862f13-896f-480f-add9-376c2a96fdd7-config-data\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.378217 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7862f13-896f-480f-add9-376c2a96fdd7-internal-tls-certs\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.379402 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7862f13-896f-480f-add9-376c2a96fdd7-public-tls-certs\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.380224 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c7862f13-896f-480f-add9-376c2a96fdd7-etc-swift\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.388554 4841 scope.go:117] "RemoveContainer" containerID="11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7" 
Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.396492 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4jzl\" (UniqueName: \"kubernetes.io/projected/c7862f13-896f-480f-add9-376c2a96fdd7-kube-api-access-j4jzl\") pod \"swift-proxy-7bbc77c95-pfg84\" (UID: \"c7862f13-896f-480f-add9-376c2a96fdd7\") " pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.439446 4841 scope.go:117] "RemoveContainer" containerID="4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.471450 4841 scope.go:117] "RemoveContainer" containerID="4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d" Mar 13 09:32:42 crc kubenswrapper[4841]: E0313 09:32:42.476315 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d\": container with ID starting with 4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d not found: ID does not exist" containerID="4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.476361 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d"} err="failed to get container status \"4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d\": rpc error: code = NotFound desc = could not find container \"4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d\": container with ID starting with 4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d not found: ID does not exist" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.476388 4841 scope.go:117] "RemoveContainer" containerID="9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e" Mar 13 
09:32:42 crc kubenswrapper[4841]: E0313 09:32:42.477171 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e\": container with ID starting with 9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e not found: ID does not exist" containerID="9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.477197 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e"} err="failed to get container status \"9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e\": rpc error: code = NotFound desc = could not find container \"9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e\": container with ID starting with 9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e not found: ID does not exist" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.477211 4841 scope.go:117] "RemoveContainer" containerID="11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7" Mar 13 09:32:42 crc kubenswrapper[4841]: E0313 09:32:42.477683 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7\": container with ID starting with 11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7 not found: ID does not exist" containerID="11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.477722 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7"} err="failed to get container status 
\"11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7\": rpc error: code = NotFound desc = could not find container \"11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7\": container with ID starting with 11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7 not found: ID does not exist" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.477749 4841 scope.go:117] "RemoveContainer" containerID="4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3" Mar 13 09:32:42 crc kubenswrapper[4841]: E0313 09:32:42.478117 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3\": container with ID starting with 4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3 not found: ID does not exist" containerID="4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.478173 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3"} err="failed to get container status \"4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3\": rpc error: code = NotFound desc = could not find container \"4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3\": container with ID starting with 4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3 not found: ID does not exist" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.478186 4841 scope.go:117] "RemoveContainer" containerID="4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.478630 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d"} err="failed to get 
container status \"4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d\": rpc error: code = NotFound desc = could not find container \"4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d\": container with ID starting with 4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d not found: ID does not exist" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.478649 4841 scope.go:117] "RemoveContainer" containerID="9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.480183 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e"} err="failed to get container status \"9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e\": rpc error: code = NotFound desc = could not find container \"9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e\": container with ID starting with 9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e not found: ID does not exist" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.480205 4841 scope.go:117] "RemoveContainer" containerID="11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.480568 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7"} err="failed to get container status \"11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7\": rpc error: code = NotFound desc = could not find container \"11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7\": container with ID starting with 11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7 not found: ID does not exist" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.480600 4841 scope.go:117] "RemoveContainer" 
containerID="4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.480939 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3"} err="failed to get container status \"4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3\": rpc error: code = NotFound desc = could not find container \"4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3\": container with ID starting with 4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3 not found: ID does not exist" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.480963 4841 scope.go:117] "RemoveContainer" containerID="4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.481228 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d"} err="failed to get container status \"4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d\": rpc error: code = NotFound desc = could not find container \"4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d\": container with ID starting with 4ab3ab425bd957a27b73e412a9ed7a449e58942c20dd2f001dd624d0b92d5d5d not found: ID does not exist" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.481245 4841 scope.go:117] "RemoveContainer" containerID="9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.481547 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e"} err="failed to get container status \"9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e\": rpc error: code = NotFound desc = could 
not find container \"9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e\": container with ID starting with 9c899fdc5d353fdbe54598cccc9b01e85b6d2e32f40c65999d15606dc5f97b0e not found: ID does not exist" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.481571 4841 scope.go:117] "RemoveContainer" containerID="11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.481832 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7"} err="failed to get container status \"11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7\": rpc error: code = NotFound desc = could not find container \"11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7\": container with ID starting with 11e7decd53eb76cbaa81415f159e08d8e1641f93446caff9a5a04872f444d6f7 not found: ID does not exist" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.481851 4841 scope.go:117] "RemoveContainer" containerID="4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.482102 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3"} err="failed to get container status \"4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3\": rpc error: code = NotFound desc = could not find container \"4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3\": container with ID starting with 4ca6094e153e1d7ed3805daaeca7026c32363476eb7c69f707a51ae9d2dbd0e3 not found: ID does not exist" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.566885 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.579485 4841 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.590974 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.601120 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.602831 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.603294 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.603618 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.647477 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.676358 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-config-data\") pod \"ceilometer-0\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.676409 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.676426 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-run-httpd\") pod \"ceilometer-0\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.676453 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-scripts\") pod \"ceilometer-0\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.676742 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-log-httpd\") pod \"ceilometer-0\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.676842 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcrfb\" (UniqueName: \"kubernetes.io/projected/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-kube-api-access-dcrfb\") pod \"ceilometer-0\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.677047 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.778758 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-log-httpd\") pod \"ceilometer-0\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " 
pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.778810 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcrfb\" (UniqueName: \"kubernetes.io/projected/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-kube-api-access-dcrfb\") pod \"ceilometer-0\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.778854 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.778899 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-config-data\") pod \"ceilometer-0\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.778929 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.778945 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-run-httpd\") pod \"ceilometer-0\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.778972 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-scripts\") pod \"ceilometer-0\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.780436 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-run-httpd\") pod \"ceilometer-0\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.780625 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-log-httpd\") pod \"ceilometer-0\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.784374 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.785147 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-config-data\") pod \"ceilometer-0\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.785765 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-scripts\") pod \"ceilometer-0\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.786854 4841 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.803424 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcrfb\" (UniqueName: \"kubernetes.io/projected/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-kube-api-access-dcrfb\") pod \"ceilometer-0\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " pod="openstack/ceilometer-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.824415 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 13 09:32:42 crc kubenswrapper[4841]: I0313 09:32:42.916210 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:32:43 crc kubenswrapper[4841]: I0313 09:32:43.173299 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7bbc77c95-pfg84"] Mar 13 09:32:43 crc kubenswrapper[4841]: I0313 09:32:43.253024 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7bbc77c95-pfg84" event={"ID":"c7862f13-896f-480f-add9-376c2a96fdd7","Type":"ContainerStarted","Data":"4f53fad3b79c6efacba46a0c5978c4ff0f4aab9ff4fd057f338098f428489dcf"} Mar 13 09:32:43 crc kubenswrapper[4841]: I0313 09:32:43.392433 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:32:43 crc kubenswrapper[4841]: W0313 09:32:43.396763 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc9f2904_ee6a_4450_bf9d_bf730655ab1b.slice/crio-2b904637aedaab5c747f9661fedda31222cbe0edeb3e6fbbeabb42958ff9b834 WatchSource:0}: Error finding container 2b904637aedaab5c747f9661fedda31222cbe0edeb3e6fbbeabb42958ff9b834: Status 404 returned 
error can't find the container with id 2b904637aedaab5c747f9661fedda31222cbe0edeb3e6fbbeabb42958ff9b834 Mar 13 09:32:44 crc kubenswrapper[4841]: I0313 09:32:44.008173 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="212126dc-6eaf-498b-b5db-2f24ed74e6a0" path="/var/lib/kubelet/pods/212126dc-6eaf-498b-b5db-2f24ed74e6a0/volumes" Mar 13 09:32:44 crc kubenswrapper[4841]: I0313 09:32:44.264995 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7bbc77c95-pfg84" event={"ID":"c7862f13-896f-480f-add9-376c2a96fdd7","Type":"ContainerStarted","Data":"774b95396a5d4dd6679923f1d74dfbd2adbf83e619763a637c829ad63f0fff26"} Mar 13 09:32:44 crc kubenswrapper[4841]: I0313 09:32:44.265632 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:44 crc kubenswrapper[4841]: I0313 09:32:44.265648 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7bbc77c95-pfg84" event={"ID":"c7862f13-896f-480f-add9-376c2a96fdd7","Type":"ContainerStarted","Data":"dc59188a0209971907fc9ec5641d6d6ca6477ddac2a744627b40955182e47499"} Mar 13 09:32:44 crc kubenswrapper[4841]: I0313 09:32:44.265662 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:44 crc kubenswrapper[4841]: I0313 09:32:44.267285 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc9f2904-ee6a-4450-bf9d-bf730655ab1b","Type":"ContainerStarted","Data":"66114ef1565934630d1ec336af2ae7fa19efe8c2fbc0d765e957ff00cf32730c"} Mar 13 09:32:44 crc kubenswrapper[4841]: I0313 09:32:44.267319 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc9f2904-ee6a-4450-bf9d-bf730655ab1b","Type":"ContainerStarted","Data":"2b904637aedaab5c747f9661fedda31222cbe0edeb3e6fbbeabb42958ff9b834"} Mar 13 09:32:44 crc kubenswrapper[4841]: I0313 09:32:44.285962 4841 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7bbc77c95-pfg84" podStartSLOduration=2.285944917 podStartE2EDuration="2.285944917s" podCreationTimestamp="2026-03-13 09:32:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:32:44.283870963 +0000 UTC m=+1247.013771154" watchObservedRunningTime="2026-03-13 09:32:44.285944917 +0000 UTC m=+1247.015845098" Mar 13 09:32:44 crc kubenswrapper[4841]: I0313 09:32:44.985836 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-76777bf9d9-sc2jp"] Mar 13 09:32:44 crc kubenswrapper[4841]: I0313 09:32:44.987489 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-76777bf9d9-sc2jp" Mar 13 09:32:44 crc kubenswrapper[4841]: I0313 09:32:44.989707 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-zzq55" Mar 13 09:32:44 crc kubenswrapper[4841]: I0313 09:32:44.989814 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Mar 13 09:32:44 crc kubenswrapper[4841]: I0313 09:32:44.994996 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.014308 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-76777bf9d9-sc2jp"] Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.096936 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-lqsbq"] Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.098388 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.137321 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-lqsbq"] Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.137348 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d53e19b-a3e7-40de-a06b-d6e1c0def922-config-data-custom\") pod \"heat-engine-76777bf9d9-sc2jp\" (UID: \"9d53e19b-a3e7-40de-a06b-d6e1c0def922\") " pod="openstack/heat-engine-76777bf9d9-sc2jp" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.137387 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d53e19b-a3e7-40de-a06b-d6e1c0def922-combined-ca-bundle\") pod \"heat-engine-76777bf9d9-sc2jp\" (UID: \"9d53e19b-a3e7-40de-a06b-d6e1c0def922\") " pod="openstack/heat-engine-76777bf9d9-sc2jp" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.137441 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5s6g\" (UniqueName: \"kubernetes.io/projected/9d53e19b-a3e7-40de-a06b-d6e1c0def922-kube-api-access-l5s6g\") pod \"heat-engine-76777bf9d9-sc2jp\" (UID: \"9d53e19b-a3e7-40de-a06b-d6e1c0def922\") " pod="openstack/heat-engine-76777bf9d9-sc2jp" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.137495 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d53e19b-a3e7-40de-a06b-d6e1c0def922-config-data\") pod \"heat-engine-76777bf9d9-sc2jp\" (UID: \"9d53e19b-a3e7-40de-a06b-d6e1c0def922\") " pod="openstack/heat-engine-76777bf9d9-sc2jp" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.181950 4841 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/heat-cfnapi-77dcd58b46-2hq68"] Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.183181 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-77dcd58b46-2hq68" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.195048 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.212846 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-77dcd58b46-2hq68"] Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.239100 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5s6g\" (UniqueName: \"kubernetes.io/projected/9d53e19b-a3e7-40de-a06b-d6e1c0def922-kube-api-access-l5s6g\") pod \"heat-engine-76777bf9d9-sc2jp\" (UID: \"9d53e19b-a3e7-40de-a06b-d6e1c0def922\") " pod="openstack/heat-engine-76777bf9d9-sc2jp" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.239137 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-lqsbq\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.239155 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27gcv\" (UniqueName: \"kubernetes.io/projected/c4495f16-0265-4bb4-a725-0f1f3da3387f-kube-api-access-27gcv\") pod \"heat-cfnapi-77dcd58b46-2hq68\" (UID: \"c4495f16-0265-4bb4-a725-0f1f3da3387f\") " pod="openstack/heat-cfnapi-77dcd58b46-2hq68" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.239184 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c4495f16-0265-4bb4-a725-0f1f3da3387f-config-data\") pod \"heat-cfnapi-77dcd58b46-2hq68\" (UID: \"c4495f16-0265-4bb4-a725-0f1f3da3387f\") " pod="openstack/heat-cfnapi-77dcd58b46-2hq68" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.239219 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4495f16-0265-4bb4-a725-0f1f3da3387f-combined-ca-bundle\") pod \"heat-cfnapi-77dcd58b46-2hq68\" (UID: \"c4495f16-0265-4bb4-a725-0f1f3da3387f\") " pod="openstack/heat-cfnapi-77dcd58b46-2hq68" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.239237 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-lqsbq\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.239255 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-lqsbq\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.239286 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d53e19b-a3e7-40de-a06b-d6e1c0def922-config-data\") pod \"heat-engine-76777bf9d9-sc2jp\" (UID: \"9d53e19b-a3e7-40de-a06b-d6e1c0def922\") " pod="openstack/heat-engine-76777bf9d9-sc2jp" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.239348 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4495f16-0265-4bb4-a725-0f1f3da3387f-config-data-custom\") pod \"heat-cfnapi-77dcd58b46-2hq68\" (UID: \"c4495f16-0265-4bb4-a725-0f1f3da3387f\") " pod="openstack/heat-cfnapi-77dcd58b46-2hq68" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.239378 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bw96\" (UniqueName: \"kubernetes.io/projected/9347a285-b6a7-46ba-9d5d-fd204673894b-kube-api-access-2bw96\") pod \"dnsmasq-dns-7756b9d78c-lqsbq\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.239412 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-lqsbq\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.239433 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d53e19b-a3e7-40de-a06b-d6e1c0def922-config-data-custom\") pod \"heat-engine-76777bf9d9-sc2jp\" (UID: \"9d53e19b-a3e7-40de-a06b-d6e1c0def922\") " pod="openstack/heat-engine-76777bf9d9-sc2jp" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.239449 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d53e19b-a3e7-40de-a06b-d6e1c0def922-combined-ca-bundle\") pod \"heat-engine-76777bf9d9-sc2jp\" (UID: \"9d53e19b-a3e7-40de-a06b-d6e1c0def922\") " pod="openstack/heat-engine-76777bf9d9-sc2jp" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.239468 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-config\") pod \"dnsmasq-dns-7756b9d78c-lqsbq\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.250513 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-78489f9c65-6dcg2"] Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.251276 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d53e19b-a3e7-40de-a06b-d6e1c0def922-config-data-custom\") pod \"heat-engine-76777bf9d9-sc2jp\" (UID: \"9d53e19b-a3e7-40de-a06b-d6e1c0def922\") " pod="openstack/heat-engine-76777bf9d9-sc2jp" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.251982 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-78489f9c65-6dcg2" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.253112 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d53e19b-a3e7-40de-a06b-d6e1c0def922-config-data\") pod \"heat-engine-76777bf9d9-sc2jp\" (UID: \"9d53e19b-a3e7-40de-a06b-d6e1c0def922\") " pod="openstack/heat-engine-76777bf9d9-sc2jp" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.257967 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.258786 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d53e19b-a3e7-40de-a06b-d6e1c0def922-combined-ca-bundle\") pod \"heat-engine-76777bf9d9-sc2jp\" (UID: \"9d53e19b-a3e7-40de-a06b-d6e1c0def922\") " pod="openstack/heat-engine-76777bf9d9-sc2jp" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.277645 4841 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/heat-api-78489f9c65-6dcg2"] Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.351316 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bw96\" (UniqueName: \"kubernetes.io/projected/9347a285-b6a7-46ba-9d5d-fd204673894b-kube-api-access-2bw96\") pod \"dnsmasq-dns-7756b9d78c-lqsbq\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.354248 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc9f2904-ee6a-4450-bf9d-bf730655ab1b","Type":"ContainerStarted","Data":"f5defaf5fb910ed285ba72c4cfa89b64e64a8a411a15614bacec8a3a760971f7"} Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.362601 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-lqsbq\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.362778 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-config\") pod \"dnsmasq-dns-7756b9d78c-lqsbq\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.362891 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tw87\" (UniqueName: \"kubernetes.io/projected/a28ce499-0db0-4fae-88ff-35a61bb3df64-kube-api-access-5tw87\") pod \"heat-api-78489f9c65-6dcg2\" (UID: \"a28ce499-0db0-4fae-88ff-35a61bb3df64\") " pod="openstack/heat-api-78489f9c65-6dcg2" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.363070 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a28ce499-0db0-4fae-88ff-35a61bb3df64-config-data\") pod \"heat-api-78489f9c65-6dcg2\" (UID: \"a28ce499-0db0-4fae-88ff-35a61bb3df64\") " pod="openstack/heat-api-78489f9c65-6dcg2" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.363165 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a28ce499-0db0-4fae-88ff-35a61bb3df64-config-data-custom\") pod \"heat-api-78489f9c65-6dcg2\" (UID: \"a28ce499-0db0-4fae-88ff-35a61bb3df64\") " pod="openstack/heat-api-78489f9c65-6dcg2" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.363250 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-lqsbq\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.363368 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27gcv\" (UniqueName: \"kubernetes.io/projected/c4495f16-0265-4bb4-a725-0f1f3da3387f-kube-api-access-27gcv\") pod \"heat-cfnapi-77dcd58b46-2hq68\" (UID: \"c4495f16-0265-4bb4-a725-0f1f3da3387f\") " pod="openstack/heat-cfnapi-77dcd58b46-2hq68" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.363497 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4495f16-0265-4bb4-a725-0f1f3da3387f-config-data\") pod \"heat-cfnapi-77dcd58b46-2hq68\" (UID: \"c4495f16-0265-4bb4-a725-0f1f3da3387f\") " pod="openstack/heat-cfnapi-77dcd58b46-2hq68" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.363663 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4495f16-0265-4bb4-a725-0f1f3da3387f-combined-ca-bundle\") pod \"heat-cfnapi-77dcd58b46-2hq68\" (UID: \"c4495f16-0265-4bb4-a725-0f1f3da3387f\") " pod="openstack/heat-cfnapi-77dcd58b46-2hq68" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.363766 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28ce499-0db0-4fae-88ff-35a61bb3df64-combined-ca-bundle\") pod \"heat-api-78489f9c65-6dcg2\" (UID: \"a28ce499-0db0-4fae-88ff-35a61bb3df64\") " pod="openstack/heat-api-78489f9c65-6dcg2" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.363861 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-lqsbq\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.363961 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-lqsbq\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.364117 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-lqsbq\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.364347 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4495f16-0265-4bb4-a725-0f1f3da3387f-config-data-custom\") pod \"heat-cfnapi-77dcd58b46-2hq68\" (UID: \"c4495f16-0265-4bb4-a725-0f1f3da3387f\") " pod="openstack/heat-cfnapi-77dcd58b46-2hq68" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.364971 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5s6g\" (UniqueName: \"kubernetes.io/projected/9d53e19b-a3e7-40de-a06b-d6e1c0def922-kube-api-access-l5s6g\") pod \"heat-engine-76777bf9d9-sc2jp\" (UID: \"9d53e19b-a3e7-40de-a06b-d6e1c0def922\") " pod="openstack/heat-engine-76777bf9d9-sc2jp" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.365923 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-lqsbq\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.367061 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-config\") pod \"dnsmasq-dns-7756b9d78c-lqsbq\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.368569 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-lqsbq\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.374729 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-lqsbq\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.381372 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4495f16-0265-4bb4-a725-0f1f3da3387f-config-data\") pod \"heat-cfnapi-77dcd58b46-2hq68\" (UID: \"c4495f16-0265-4bb4-a725-0f1f3da3387f\") " pod="openstack/heat-cfnapi-77dcd58b46-2hq68" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.386741 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-76777bf9d9-sc2jp" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.389852 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bw96\" (UniqueName: \"kubernetes.io/projected/9347a285-b6a7-46ba-9d5d-fd204673894b-kube-api-access-2bw96\") pod \"dnsmasq-dns-7756b9d78c-lqsbq\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.395013 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27gcv\" (UniqueName: \"kubernetes.io/projected/c4495f16-0265-4bb4-a725-0f1f3da3387f-kube-api-access-27gcv\") pod \"heat-cfnapi-77dcd58b46-2hq68\" (UID: \"c4495f16-0265-4bb4-a725-0f1f3da3387f\") " pod="openstack/heat-cfnapi-77dcd58b46-2hq68" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.402955 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4495f16-0265-4bb4-a725-0f1f3da3387f-combined-ca-bundle\") pod \"heat-cfnapi-77dcd58b46-2hq68\" (UID: \"c4495f16-0265-4bb4-a725-0f1f3da3387f\") " pod="openstack/heat-cfnapi-77dcd58b46-2hq68" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 
09:32:45.406522 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4495f16-0265-4bb4-a725-0f1f3da3387f-config-data-custom\") pod \"heat-cfnapi-77dcd58b46-2hq68\" (UID: \"c4495f16-0265-4bb4-a725-0f1f3da3387f\") " pod="openstack/heat-cfnapi-77dcd58b46-2hq68" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.490764 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.491559 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tw87\" (UniqueName: \"kubernetes.io/projected/a28ce499-0db0-4fae-88ff-35a61bb3df64-kube-api-access-5tw87\") pod \"heat-api-78489f9c65-6dcg2\" (UID: \"a28ce499-0db0-4fae-88ff-35a61bb3df64\") " pod="openstack/heat-api-78489f9c65-6dcg2" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.491639 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a28ce499-0db0-4fae-88ff-35a61bb3df64-config-data\") pod \"heat-api-78489f9c65-6dcg2\" (UID: \"a28ce499-0db0-4fae-88ff-35a61bb3df64\") " pod="openstack/heat-api-78489f9c65-6dcg2" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.491662 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a28ce499-0db0-4fae-88ff-35a61bb3df64-config-data-custom\") pod \"heat-api-78489f9c65-6dcg2\" (UID: \"a28ce499-0db0-4fae-88ff-35a61bb3df64\") " pod="openstack/heat-api-78489f9c65-6dcg2" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.491733 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28ce499-0db0-4fae-88ff-35a61bb3df64-combined-ca-bundle\") pod \"heat-api-78489f9c65-6dcg2\" (UID: 
\"a28ce499-0db0-4fae-88ff-35a61bb3df64\") " pod="openstack/heat-api-78489f9c65-6dcg2" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.496745 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28ce499-0db0-4fae-88ff-35a61bb3df64-combined-ca-bundle\") pod \"heat-api-78489f9c65-6dcg2\" (UID: \"a28ce499-0db0-4fae-88ff-35a61bb3df64\") " pod="openstack/heat-api-78489f9c65-6dcg2" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.498324 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a28ce499-0db0-4fae-88ff-35a61bb3df64-config-data\") pod \"heat-api-78489f9c65-6dcg2\" (UID: \"a28ce499-0db0-4fae-88ff-35a61bb3df64\") " pod="openstack/heat-api-78489f9c65-6dcg2" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.507071 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a28ce499-0db0-4fae-88ff-35a61bb3df64-config-data-custom\") pod \"heat-api-78489f9c65-6dcg2\" (UID: \"a28ce499-0db0-4fae-88ff-35a61bb3df64\") " pod="openstack/heat-api-78489f9c65-6dcg2" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.521322 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tw87\" (UniqueName: \"kubernetes.io/projected/a28ce499-0db0-4fae-88ff-35a61bb3df64-kube-api-access-5tw87\") pod \"heat-api-78489f9c65-6dcg2\" (UID: \"a28ce499-0db0-4fae-88ff-35a61bb3df64\") " pod="openstack/heat-api-78489f9c65-6dcg2" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.554385 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-77dcd58b46-2hq68" Mar 13 09:32:45 crc kubenswrapper[4841]: I0313 09:32:45.562405 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-78489f9c65-6dcg2" Mar 13 09:32:46 crc kubenswrapper[4841]: I0313 09:32:46.118431 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-lqsbq"] Mar 13 09:32:46 crc kubenswrapper[4841]: W0313 09:32:46.119590 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9347a285_b6a7_46ba_9d5d_fd204673894b.slice/crio-bf295b20a125ce70fedb3c6f2e7aba0b1f16b6d67cc2841462f7b42b1af412da WatchSource:0}: Error finding container bf295b20a125ce70fedb3c6f2e7aba0b1f16b6d67cc2841462f7b42b1af412da: Status 404 returned error can't find the container with id bf295b20a125ce70fedb3c6f2e7aba0b1f16b6d67cc2841462f7b42b1af412da Mar 13 09:32:46 crc kubenswrapper[4841]: I0313 09:32:46.237476 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-76777bf9d9-sc2jp"] Mar 13 09:32:46 crc kubenswrapper[4841]: I0313 09:32:46.245844 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-77dcd58b46-2hq68"] Mar 13 09:32:46 crc kubenswrapper[4841]: W0313 09:32:46.246230 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4495f16_0265_4bb4_a725_0f1f3da3387f.slice/crio-563157c931c06779fdce30595cd2ae2e8bfae2637d13b1f79242ceb60ba64504 WatchSource:0}: Error finding container 563157c931c06779fdce30595cd2ae2e8bfae2637d13b1f79242ceb60ba64504: Status 404 returned error can't find the container with id 563157c931c06779fdce30595cd2ae2e8bfae2637d13b1f79242ceb60ba64504 Mar 13 09:32:46 crc kubenswrapper[4841]: W0313 09:32:46.252669 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d53e19b_a3e7_40de_a06b_d6e1c0def922.slice/crio-c6c9bca2ea71951b782faa8ba13bed7310745f9d9517743884778cf8816624f4 WatchSource:0}: Error finding container 
c6c9bca2ea71951b782faa8ba13bed7310745f9d9517743884778cf8816624f4: Status 404 returned error can't find the container with id c6c9bca2ea71951b782faa8ba13bed7310745f9d9517743884778cf8816624f4 Mar 13 09:32:46 crc kubenswrapper[4841]: I0313 09:32:46.368543 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-78489f9c65-6dcg2"] Mar 13 09:32:46 crc kubenswrapper[4841]: I0313 09:32:46.372367 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-77dcd58b46-2hq68" event={"ID":"c4495f16-0265-4bb4-a725-0f1f3da3387f","Type":"ContainerStarted","Data":"563157c931c06779fdce30595cd2ae2e8bfae2637d13b1f79242ceb60ba64504"} Mar 13 09:32:46 crc kubenswrapper[4841]: W0313 09:32:46.376046 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda28ce499_0db0_4fae_88ff_35a61bb3df64.slice/crio-40fefcd5936bdc07ea74c3c3753deea8d120311057ef6140a8213f87d77949bc WatchSource:0}: Error finding container 40fefcd5936bdc07ea74c3c3753deea8d120311057ef6140a8213f87d77949bc: Status 404 returned error can't find the container with id 40fefcd5936bdc07ea74c3c3753deea8d120311057ef6140a8213f87d77949bc Mar 13 09:32:46 crc kubenswrapper[4841]: I0313 09:32:46.376168 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc9f2904-ee6a-4450-bf9d-bf730655ab1b","Type":"ContainerStarted","Data":"1bd3a4acfc656fa437003e466b7ab8f833aa3d2c1a8e8f05cf8fbd719ac74d03"} Mar 13 09:32:46 crc kubenswrapper[4841]: I0313 09:32:46.378158 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-76777bf9d9-sc2jp" event={"ID":"9d53e19b-a3e7-40de-a06b-d6e1c0def922","Type":"ContainerStarted","Data":"c6c9bca2ea71951b782faa8ba13bed7310745f9d9517743884778cf8816624f4"} Mar 13 09:32:46 crc kubenswrapper[4841]: I0313 09:32:46.382362 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" 
event={"ID":"9347a285-b6a7-46ba-9d5d-fd204673894b","Type":"ContainerStarted","Data":"bf295b20a125ce70fedb3c6f2e7aba0b1f16b6d67cc2841462f7b42b1af412da"} Mar 13 09:32:47 crc kubenswrapper[4841]: I0313 09:32:47.412842 4841 generic.go:334] "Generic (PLEG): container finished" podID="9347a285-b6a7-46ba-9d5d-fd204673894b" containerID="e18228d36d65e70c1741a2fb877f22a1b30b678d3244bf87556dbd8fdce02e3b" exitCode=0 Mar 13 09:32:47 crc kubenswrapper[4841]: I0313 09:32:47.413178 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" event={"ID":"9347a285-b6a7-46ba-9d5d-fd204673894b","Type":"ContainerDied","Data":"e18228d36d65e70c1741a2fb877f22a1b30b678d3244bf87556dbd8fdce02e3b"} Mar 13 09:32:47 crc kubenswrapper[4841]: I0313 09:32:47.419968 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-76777bf9d9-sc2jp" event={"ID":"9d53e19b-a3e7-40de-a06b-d6e1c0def922","Type":"ContainerStarted","Data":"ba488eb98af2da69fb67e49f29400ebf29148da2dc0062b02f61f3a0040bf29f"} Mar 13 09:32:47 crc kubenswrapper[4841]: I0313 09:32:47.420163 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-76777bf9d9-sc2jp" Mar 13 09:32:47 crc kubenswrapper[4841]: I0313 09:32:47.434216 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-78489f9c65-6dcg2" event={"ID":"a28ce499-0db0-4fae-88ff-35a61bb3df64","Type":"ContainerStarted","Data":"40fefcd5936bdc07ea74c3c3753deea8d120311057ef6140a8213f87d77949bc"} Mar 13 09:32:47 crc kubenswrapper[4841]: I0313 09:32:47.465785 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-76777bf9d9-sc2jp" podStartSLOduration=3.4657662 podStartE2EDuration="3.4657662s" podCreationTimestamp="2026-03-13 09:32:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:32:47.460560188 +0000 UTC 
m=+1250.190460389" watchObservedRunningTime="2026-03-13 09:32:47.4657662 +0000 UTC m=+1250.195666391" Mar 13 09:32:48 crc kubenswrapper[4841]: I0313 09:32:48.064727 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 13 09:32:48 crc kubenswrapper[4841]: I0313 09:32:48.452780 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc9f2904-ee6a-4450-bf9d-bf730655ab1b","Type":"ContainerStarted","Data":"58cb98155a159b05b44da3d4b473514922242cafa94da9bc38d801f03564412f"} Mar 13 09:32:48 crc kubenswrapper[4841]: I0313 09:32:48.452831 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 09:32:48 crc kubenswrapper[4841]: I0313 09:32:48.486116 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.756438587 podStartE2EDuration="6.486099114s" podCreationTimestamp="2026-03-13 09:32:42 +0000 UTC" firstStartedPulling="2026-03-13 09:32:43.399655185 +0000 UTC m=+1246.129555376" lastFinishedPulling="2026-03-13 09:32:47.129315712 +0000 UTC m=+1249.859215903" observedRunningTime="2026-03-13 09:32:48.472970575 +0000 UTC m=+1251.202870766" watchObservedRunningTime="2026-03-13 09:32:48.486099114 +0000 UTC m=+1251.215999305" Mar 13 09:32:49 crc kubenswrapper[4841]: I0313 09:32:49.480943 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-78489f9c65-6dcg2" event={"ID":"a28ce499-0db0-4fae-88ff-35a61bb3df64","Type":"ContainerStarted","Data":"ce58e81af6719161e5a2965021e2f761ea54ec4b73292768362474bd1d043999"} Mar 13 09:32:49 crc kubenswrapper[4841]: I0313 09:32:49.481462 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-78489f9c65-6dcg2" Mar 13 09:32:49 crc kubenswrapper[4841]: I0313 09:32:49.488719 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" 
event={"ID":"9347a285-b6a7-46ba-9d5d-fd204673894b","Type":"ContainerStarted","Data":"17189d8602436420a037b24e9cfdb7e231161da5c8d119c5b57e17a7ee984189"} Mar 13 09:32:49 crc kubenswrapper[4841]: I0313 09:32:49.489422 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:32:49 crc kubenswrapper[4841]: I0313 09:32:49.501612 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-77dcd58b46-2hq68" event={"ID":"c4495f16-0265-4bb4-a725-0f1f3da3387f","Type":"ContainerStarted","Data":"42e05a6b89dc9c913880642ff5e05c706ffa5e68c98ab5297c060f0bfca37084"} Mar 13 09:32:49 crc kubenswrapper[4841]: I0313 09:32:49.504554 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-78489f9c65-6dcg2" podStartSLOduration=2.154611182 podStartE2EDuration="4.50452623s" podCreationTimestamp="2026-03-13 09:32:45 +0000 UTC" firstStartedPulling="2026-03-13 09:32:46.379502608 +0000 UTC m=+1249.109402799" lastFinishedPulling="2026-03-13 09:32:48.729417656 +0000 UTC m=+1251.459317847" observedRunningTime="2026-03-13 09:32:49.501061053 +0000 UTC m=+1252.230961244" watchObservedRunningTime="2026-03-13 09:32:49.50452623 +0000 UTC m=+1252.234426421" Mar 13 09:32:49 crc kubenswrapper[4841]: I0313 09:32:49.549655 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" podStartSLOduration=4.549618797 podStartE2EDuration="4.549618797s" podCreationTimestamp="2026-03-13 09:32:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:32:49.528805588 +0000 UTC m=+1252.258705779" watchObservedRunningTime="2026-03-13 09:32:49.549618797 +0000 UTC m=+1252.279518988" Mar 13 09:32:49 crc kubenswrapper[4841]: I0313 09:32:49.581202 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/heat-cfnapi-77dcd58b46-2hq68" podStartSLOduration=2.099409578 podStartE2EDuration="4.581170631s" podCreationTimestamp="2026-03-13 09:32:45 +0000 UTC" firstStartedPulling="2026-03-13 09:32:46.254079964 +0000 UTC m=+1248.983980155" lastFinishedPulling="2026-03-13 09:32:48.735841017 +0000 UTC m=+1251.465741208" observedRunningTime="2026-03-13 09:32:49.547503881 +0000 UTC m=+1252.277404072" watchObservedRunningTime="2026-03-13 09:32:49.581170631 +0000 UTC m=+1252.311070812" Mar 13 09:32:50 crc kubenswrapper[4841]: I0313 09:32:50.426395 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:32:50 crc kubenswrapper[4841]: I0313 09:32:50.515019 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-77dcd58b46-2hq68" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.068841 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-54454c48fd-qvmtx"] Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.070173 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-54454c48fd-qvmtx" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.086020 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-74f4d87c9f-bw7dr"] Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.087463 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-74f4d87c9f-bw7dr" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.107817 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-54454c48fd-qvmtx"] Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.123590 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-74f4d87c9f-bw7dr"] Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.127395 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-d9dc96f5d-k4b4n"] Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.128541 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-d9dc96f5d-k4b4n" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.165362 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-d9dc96f5d-k4b4n"] Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.243363 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc104487-8dce-43e7-8673-1e9d6a8f5704-config-data\") pod \"heat-cfnapi-54454c48fd-qvmtx\" (UID: \"fc104487-8dce-43e7-8673-1e9d6a8f5704\") " pod="openstack/heat-cfnapi-54454c48fd-qvmtx" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.243437 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv848\" (UniqueName: \"kubernetes.io/projected/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-kube-api-access-zv848\") pod \"heat-api-d9dc96f5d-k4b4n\" (UID: \"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6\") " pod="openstack/heat-api-d9dc96f5d-k4b4n" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.243491 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-config-data\") pod 
\"heat-api-d9dc96f5d-k4b4n\" (UID: \"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6\") " pod="openstack/heat-api-d9dc96f5d-k4b4n" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.243512 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc104487-8dce-43e7-8673-1e9d6a8f5704-combined-ca-bundle\") pod \"heat-cfnapi-54454c48fd-qvmtx\" (UID: \"fc104487-8dce-43e7-8673-1e9d6a8f5704\") " pod="openstack/heat-cfnapi-54454c48fd-qvmtx" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.243531 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1ddc522-5255-4785-8d33-85a3d0e86af2-combined-ca-bundle\") pod \"heat-engine-74f4d87c9f-bw7dr\" (UID: \"f1ddc522-5255-4785-8d33-85a3d0e86af2\") " pod="openstack/heat-engine-74f4d87c9f-bw7dr" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.243550 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-combined-ca-bundle\") pod \"heat-api-d9dc96f5d-k4b4n\" (UID: \"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6\") " pod="openstack/heat-api-d9dc96f5d-k4b4n" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.243584 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1ddc522-5255-4785-8d33-85a3d0e86af2-config-data-custom\") pod \"heat-engine-74f4d87c9f-bw7dr\" (UID: \"f1ddc522-5255-4785-8d33-85a3d0e86af2\") " pod="openstack/heat-engine-74f4d87c9f-bw7dr" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.243603 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxbjw\" (UniqueName: 
\"kubernetes.io/projected/f1ddc522-5255-4785-8d33-85a3d0e86af2-kube-api-access-cxbjw\") pod \"heat-engine-74f4d87c9f-bw7dr\" (UID: \"f1ddc522-5255-4785-8d33-85a3d0e86af2\") " pod="openstack/heat-engine-74f4d87c9f-bw7dr" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.243620 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8rkq\" (UniqueName: \"kubernetes.io/projected/fc104487-8dce-43e7-8673-1e9d6a8f5704-kube-api-access-j8rkq\") pod \"heat-cfnapi-54454c48fd-qvmtx\" (UID: \"fc104487-8dce-43e7-8673-1e9d6a8f5704\") " pod="openstack/heat-cfnapi-54454c48fd-qvmtx" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.243643 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1ddc522-5255-4785-8d33-85a3d0e86af2-config-data\") pod \"heat-engine-74f4d87c9f-bw7dr\" (UID: \"f1ddc522-5255-4785-8d33-85a3d0e86af2\") " pod="openstack/heat-engine-74f4d87c9f-bw7dr" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.243658 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-config-data-custom\") pod \"heat-api-d9dc96f5d-k4b4n\" (UID: \"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6\") " pod="openstack/heat-api-d9dc96f5d-k4b4n" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.243687 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc104487-8dce-43e7-8673-1e9d6a8f5704-config-data-custom\") pod \"heat-cfnapi-54454c48fd-qvmtx\" (UID: \"fc104487-8dce-43e7-8673-1e9d6a8f5704\") " pod="openstack/heat-cfnapi-54454c48fd-qvmtx" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.346794 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-zv848\" (UniqueName: \"kubernetes.io/projected/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-kube-api-access-zv848\") pod \"heat-api-d9dc96f5d-k4b4n\" (UID: \"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6\") " pod="openstack/heat-api-d9dc96f5d-k4b4n" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.346901 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-config-data\") pod \"heat-api-d9dc96f5d-k4b4n\" (UID: \"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6\") " pod="openstack/heat-api-d9dc96f5d-k4b4n" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.346928 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc104487-8dce-43e7-8673-1e9d6a8f5704-combined-ca-bundle\") pod \"heat-cfnapi-54454c48fd-qvmtx\" (UID: \"fc104487-8dce-43e7-8673-1e9d6a8f5704\") " pod="openstack/heat-cfnapi-54454c48fd-qvmtx" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.346954 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1ddc522-5255-4785-8d33-85a3d0e86af2-combined-ca-bundle\") pod \"heat-engine-74f4d87c9f-bw7dr\" (UID: \"f1ddc522-5255-4785-8d33-85a3d0e86af2\") " pod="openstack/heat-engine-74f4d87c9f-bw7dr" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.346985 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-combined-ca-bundle\") pod \"heat-api-d9dc96f5d-k4b4n\" (UID: \"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6\") " pod="openstack/heat-api-d9dc96f5d-k4b4n" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.347183 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/f1ddc522-5255-4785-8d33-85a3d0e86af2-config-data-custom\") pod \"heat-engine-74f4d87c9f-bw7dr\" (UID: \"f1ddc522-5255-4785-8d33-85a3d0e86af2\") " pod="openstack/heat-engine-74f4d87c9f-bw7dr" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.347223 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxbjw\" (UniqueName: \"kubernetes.io/projected/f1ddc522-5255-4785-8d33-85a3d0e86af2-kube-api-access-cxbjw\") pod \"heat-engine-74f4d87c9f-bw7dr\" (UID: \"f1ddc522-5255-4785-8d33-85a3d0e86af2\") " pod="openstack/heat-engine-74f4d87c9f-bw7dr" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.347253 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8rkq\" (UniqueName: \"kubernetes.io/projected/fc104487-8dce-43e7-8673-1e9d6a8f5704-kube-api-access-j8rkq\") pod \"heat-cfnapi-54454c48fd-qvmtx\" (UID: \"fc104487-8dce-43e7-8673-1e9d6a8f5704\") " pod="openstack/heat-cfnapi-54454c48fd-qvmtx" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.347301 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1ddc522-5255-4785-8d33-85a3d0e86af2-config-data\") pod \"heat-engine-74f4d87c9f-bw7dr\" (UID: \"f1ddc522-5255-4785-8d33-85a3d0e86af2\") " pod="openstack/heat-engine-74f4d87c9f-bw7dr" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.347325 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-config-data-custom\") pod \"heat-api-d9dc96f5d-k4b4n\" (UID: \"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6\") " pod="openstack/heat-api-d9dc96f5d-k4b4n" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.347366 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/fc104487-8dce-43e7-8673-1e9d6a8f5704-config-data-custom\") pod \"heat-cfnapi-54454c48fd-qvmtx\" (UID: \"fc104487-8dce-43e7-8673-1e9d6a8f5704\") " pod="openstack/heat-cfnapi-54454c48fd-qvmtx" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.347415 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc104487-8dce-43e7-8673-1e9d6a8f5704-config-data\") pod \"heat-cfnapi-54454c48fd-qvmtx\" (UID: \"fc104487-8dce-43e7-8673-1e9d6a8f5704\") " pod="openstack/heat-cfnapi-54454c48fd-qvmtx" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.354241 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1ddc522-5255-4785-8d33-85a3d0e86af2-combined-ca-bundle\") pod \"heat-engine-74f4d87c9f-bw7dr\" (UID: \"f1ddc522-5255-4785-8d33-85a3d0e86af2\") " pod="openstack/heat-engine-74f4d87c9f-bw7dr" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.356433 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc104487-8dce-43e7-8673-1e9d6a8f5704-config-data\") pod \"heat-cfnapi-54454c48fd-qvmtx\" (UID: \"fc104487-8dce-43e7-8673-1e9d6a8f5704\") " pod="openstack/heat-cfnapi-54454c48fd-qvmtx" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.357878 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1ddc522-5255-4785-8d33-85a3d0e86af2-config-data-custom\") pod \"heat-engine-74f4d87c9f-bw7dr\" (UID: \"f1ddc522-5255-4785-8d33-85a3d0e86af2\") " pod="openstack/heat-engine-74f4d87c9f-bw7dr" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.360688 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-config-data\") pod 
\"heat-api-d9dc96f5d-k4b4n\" (UID: \"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6\") " pod="openstack/heat-api-d9dc96f5d-k4b4n" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.360921 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-config-data-custom\") pod \"heat-api-d9dc96f5d-k4b4n\" (UID: \"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6\") " pod="openstack/heat-api-d9dc96f5d-k4b4n" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.362931 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-combined-ca-bundle\") pod \"heat-api-d9dc96f5d-k4b4n\" (UID: \"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6\") " pod="openstack/heat-api-d9dc96f5d-k4b4n" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.364082 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc104487-8dce-43e7-8673-1e9d6a8f5704-config-data-custom\") pod \"heat-cfnapi-54454c48fd-qvmtx\" (UID: \"fc104487-8dce-43e7-8673-1e9d6a8f5704\") " pod="openstack/heat-cfnapi-54454c48fd-qvmtx" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.365822 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv848\" (UniqueName: \"kubernetes.io/projected/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-kube-api-access-zv848\") pod \"heat-api-d9dc96f5d-k4b4n\" (UID: \"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6\") " pod="openstack/heat-api-d9dc96f5d-k4b4n" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.373989 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1ddc522-5255-4785-8d33-85a3d0e86af2-config-data\") pod \"heat-engine-74f4d87c9f-bw7dr\" (UID: \"f1ddc522-5255-4785-8d33-85a3d0e86af2\") " 
pod="openstack/heat-engine-74f4d87c9f-bw7dr" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.374967 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8rkq\" (UniqueName: \"kubernetes.io/projected/fc104487-8dce-43e7-8673-1e9d6a8f5704-kube-api-access-j8rkq\") pod \"heat-cfnapi-54454c48fd-qvmtx\" (UID: \"fc104487-8dce-43e7-8673-1e9d6a8f5704\") " pod="openstack/heat-cfnapi-54454c48fd-qvmtx" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.375017 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxbjw\" (UniqueName: \"kubernetes.io/projected/f1ddc522-5255-4785-8d33-85a3d0e86af2-kube-api-access-cxbjw\") pod \"heat-engine-74f4d87c9f-bw7dr\" (UID: \"f1ddc522-5255-4785-8d33-85a3d0e86af2\") " pod="openstack/heat-engine-74f4d87c9f-bw7dr" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.383169 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc104487-8dce-43e7-8673-1e9d6a8f5704-combined-ca-bundle\") pod \"heat-cfnapi-54454c48fd-qvmtx\" (UID: \"fc104487-8dce-43e7-8673-1e9d6a8f5704\") " pod="openstack/heat-cfnapi-54454c48fd-qvmtx" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.413500 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-54454c48fd-qvmtx" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.451278 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-74f4d87c9f-bw7dr" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.475017 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-d9dc96f5d-k4b4n" Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.524893 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bc9f2904-ee6a-4450-bf9d-bf730655ab1b" containerName="ceilometer-central-agent" containerID="cri-o://66114ef1565934630d1ec336af2ae7fa19efe8c2fbc0d765e957ff00cf32730c" gracePeriod=30 Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.525053 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bc9f2904-ee6a-4450-bf9d-bf730655ab1b" containerName="proxy-httpd" containerID="cri-o://58cb98155a159b05b44da3d4b473514922242cafa94da9bc38d801f03564412f" gracePeriod=30 Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.525091 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bc9f2904-ee6a-4450-bf9d-bf730655ab1b" containerName="sg-core" containerID="cri-o://1bd3a4acfc656fa437003e466b7ab8f833aa3d2c1a8e8f05cf8fbd719ac74d03" gracePeriod=30 Mar 13 09:32:51 crc kubenswrapper[4841]: I0313 09:32:51.525118 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bc9f2904-ee6a-4450-bf9d-bf730655ab1b" containerName="ceilometer-notification-agent" containerID="cri-o://f5defaf5fb910ed285ba72c4cfa89b64e64a8a411a15614bacec8a3a760971f7" gracePeriod=30 Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.292298 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-78489f9c65-6dcg2"] Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.294814 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-78489f9c65-6dcg2" podUID="a28ce499-0db0-4fae-88ff-35a61bb3df64" containerName="heat-api" containerID="cri-o://ce58e81af6719161e5a2965021e2f761ea54ec4b73292768362474bd1d043999" gracePeriod=60 Mar 13 09:32:52 crc 
kubenswrapper[4841]: I0313 09:32:52.318714 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-77dcd58b46-2hq68"] Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.339420 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5ff95dc669-rzhtr"] Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.341121 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5ff95dc669-rzhtr" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.345680 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.346209 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.384474 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/595c0935-7197-4c48-be0d-8a3ad4d6442d-internal-tls-certs\") pod \"heat-api-5ff95dc669-rzhtr\" (UID: \"595c0935-7197-4c48-be0d-8a3ad4d6442d\") " pod="openstack/heat-api-5ff95dc669-rzhtr" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.384596 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/595c0935-7197-4c48-be0d-8a3ad4d6442d-public-tls-certs\") pod \"heat-api-5ff95dc669-rzhtr\" (UID: \"595c0935-7197-4c48-be0d-8a3ad4d6442d\") " pod="openstack/heat-api-5ff95dc669-rzhtr" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.384701 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/595c0935-7197-4c48-be0d-8a3ad4d6442d-combined-ca-bundle\") pod \"heat-api-5ff95dc669-rzhtr\" (UID: \"595c0935-7197-4c48-be0d-8a3ad4d6442d\") " 
pod="openstack/heat-api-5ff95dc669-rzhtr" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.404515 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sk66\" (UniqueName: \"kubernetes.io/projected/595c0935-7197-4c48-be0d-8a3ad4d6442d-kube-api-access-2sk66\") pod \"heat-api-5ff95dc669-rzhtr\" (UID: \"595c0935-7197-4c48-be0d-8a3ad4d6442d\") " pod="openstack/heat-api-5ff95dc669-rzhtr" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.404636 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/595c0935-7197-4c48-be0d-8a3ad4d6442d-config-data-custom\") pod \"heat-api-5ff95dc669-rzhtr\" (UID: \"595c0935-7197-4c48-be0d-8a3ad4d6442d\") " pod="openstack/heat-api-5ff95dc669-rzhtr" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.404720 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/595c0935-7197-4c48-be0d-8a3ad4d6442d-config-data\") pod \"heat-api-5ff95dc669-rzhtr\" (UID: \"595c0935-7197-4c48-be0d-8a3ad4d6442d\") " pod="openstack/heat-api-5ff95dc669-rzhtr" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.411426 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5ff95dc669-rzhtr"] Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.475332 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-fb97f87fc-9tb45"] Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.484558 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-fb97f87fc-9tb45" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.489814 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.505540 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.507734 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b447cf0-3120-4329-9dbf-534fd45e70bf-combined-ca-bundle\") pod \"heat-cfnapi-fb97f87fc-9tb45\" (UID: \"6b447cf0-3120-4329-9dbf-534fd45e70bf\") " pod="openstack/heat-cfnapi-fb97f87fc-9tb45" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.507808 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sk66\" (UniqueName: \"kubernetes.io/projected/595c0935-7197-4c48-be0d-8a3ad4d6442d-kube-api-access-2sk66\") pod \"heat-api-5ff95dc669-rzhtr\" (UID: \"595c0935-7197-4c48-be0d-8a3ad4d6442d\") " pod="openstack/heat-api-5ff95dc669-rzhtr" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.507860 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b447cf0-3120-4329-9dbf-534fd45e70bf-internal-tls-certs\") pod \"heat-cfnapi-fb97f87fc-9tb45\" (UID: \"6b447cf0-3120-4329-9dbf-534fd45e70bf\") " pod="openstack/heat-cfnapi-fb97f87fc-9tb45" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.507892 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/595c0935-7197-4c48-be0d-8a3ad4d6442d-config-data-custom\") pod \"heat-api-5ff95dc669-rzhtr\" (UID: \"595c0935-7197-4c48-be0d-8a3ad4d6442d\") " 
pod="openstack/heat-api-5ff95dc669-rzhtr" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.507917 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b447cf0-3120-4329-9dbf-534fd45e70bf-config-data\") pod \"heat-cfnapi-fb97f87fc-9tb45\" (UID: \"6b447cf0-3120-4329-9dbf-534fd45e70bf\") " pod="openstack/heat-cfnapi-fb97f87fc-9tb45" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.507957 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/595c0935-7197-4c48-be0d-8a3ad4d6442d-config-data\") pod \"heat-api-5ff95dc669-rzhtr\" (UID: \"595c0935-7197-4c48-be0d-8a3ad4d6442d\") " pod="openstack/heat-api-5ff95dc669-rzhtr" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.507991 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b447cf0-3120-4329-9dbf-534fd45e70bf-config-data-custom\") pod \"heat-cfnapi-fb97f87fc-9tb45\" (UID: \"6b447cf0-3120-4329-9dbf-534fd45e70bf\") " pod="openstack/heat-cfnapi-fb97f87fc-9tb45" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.508015 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/595c0935-7197-4c48-be0d-8a3ad4d6442d-internal-tls-certs\") pod \"heat-api-5ff95dc669-rzhtr\" (UID: \"595c0935-7197-4c48-be0d-8a3ad4d6442d\") " pod="openstack/heat-api-5ff95dc669-rzhtr" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.508053 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/595c0935-7197-4c48-be0d-8a3ad4d6442d-public-tls-certs\") pod \"heat-api-5ff95dc669-rzhtr\" (UID: \"595c0935-7197-4c48-be0d-8a3ad4d6442d\") " pod="openstack/heat-api-5ff95dc669-rzhtr" Mar 
13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.508092 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/595c0935-7197-4c48-be0d-8a3ad4d6442d-combined-ca-bundle\") pod \"heat-api-5ff95dc669-rzhtr\" (UID: \"595c0935-7197-4c48-be0d-8a3ad4d6442d\") " pod="openstack/heat-api-5ff95dc669-rzhtr" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.508163 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mpv6\" (UniqueName: \"kubernetes.io/projected/6b447cf0-3120-4329-9dbf-534fd45e70bf-kube-api-access-7mpv6\") pod \"heat-cfnapi-fb97f87fc-9tb45\" (UID: \"6b447cf0-3120-4329-9dbf-534fd45e70bf\") " pod="openstack/heat-cfnapi-fb97f87fc-9tb45" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.508198 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b447cf0-3120-4329-9dbf-534fd45e70bf-public-tls-certs\") pod \"heat-cfnapi-fb97f87fc-9tb45\" (UID: \"6b447cf0-3120-4329-9dbf-534fd45e70bf\") " pod="openstack/heat-cfnapi-fb97f87fc-9tb45" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.527320 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/595c0935-7197-4c48-be0d-8a3ad4d6442d-internal-tls-certs\") pod \"heat-api-5ff95dc669-rzhtr\" (UID: \"595c0935-7197-4c48-be0d-8a3ad4d6442d\") " pod="openstack/heat-api-5ff95dc669-rzhtr" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.532883 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/595c0935-7197-4c48-be0d-8a3ad4d6442d-config-data\") pod \"heat-api-5ff95dc669-rzhtr\" (UID: \"595c0935-7197-4c48-be0d-8a3ad4d6442d\") " pod="openstack/heat-api-5ff95dc669-rzhtr" Mar 13 09:32:52 crc kubenswrapper[4841]: 
I0313 09:32:52.533585 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/595c0935-7197-4c48-be0d-8a3ad4d6442d-config-data-custom\") pod \"heat-api-5ff95dc669-rzhtr\" (UID: \"595c0935-7197-4c48-be0d-8a3ad4d6442d\") " pod="openstack/heat-api-5ff95dc669-rzhtr" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.540022 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/595c0935-7197-4c48-be0d-8a3ad4d6442d-combined-ca-bundle\") pod \"heat-api-5ff95dc669-rzhtr\" (UID: \"595c0935-7197-4c48-be0d-8a3ad4d6442d\") " pod="openstack/heat-api-5ff95dc669-rzhtr" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.547624 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sk66\" (UniqueName: \"kubernetes.io/projected/595c0935-7197-4c48-be0d-8a3ad4d6442d-kube-api-access-2sk66\") pod \"heat-api-5ff95dc669-rzhtr\" (UID: \"595c0935-7197-4c48-be0d-8a3ad4d6442d\") " pod="openstack/heat-api-5ff95dc669-rzhtr" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.550968 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/595c0935-7197-4c48-be0d-8a3ad4d6442d-public-tls-certs\") pod \"heat-api-5ff95dc669-rzhtr\" (UID: \"595c0935-7197-4c48-be0d-8a3ad4d6442d\") " pod="openstack/heat-api-5ff95dc669-rzhtr" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.587745 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-fb97f87fc-9tb45"] Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.603960 4841 generic.go:334] "Generic (PLEG): container finished" podID="bc9f2904-ee6a-4450-bf9d-bf730655ab1b" containerID="58cb98155a159b05b44da3d4b473514922242cafa94da9bc38d801f03564412f" exitCode=0 Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.604094 4841 generic.go:334] "Generic 
(PLEG): container finished" podID="bc9f2904-ee6a-4450-bf9d-bf730655ab1b" containerID="1bd3a4acfc656fa437003e466b7ab8f833aa3d2c1a8e8f05cf8fbd719ac74d03" exitCode=2 Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.604166 4841 generic.go:334] "Generic (PLEG): container finished" podID="bc9f2904-ee6a-4450-bf9d-bf730655ab1b" containerID="f5defaf5fb910ed285ba72c4cfa89b64e64a8a411a15614bacec8a3a760971f7" exitCode=0 Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.604244 4841 generic.go:334] "Generic (PLEG): container finished" podID="bc9f2904-ee6a-4450-bf9d-bf730655ab1b" containerID="66114ef1565934630d1ec336af2ae7fa19efe8c2fbc0d765e957ff00cf32730c" exitCode=0 Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.604611 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-77dcd58b46-2hq68" podUID="c4495f16-0265-4bb4-a725-0f1f3da3387f" containerName="heat-cfnapi" containerID="cri-o://42e05a6b89dc9c913880642ff5e05c706ffa5e68c98ab5297c060f0bfca37084" gracePeriod=60 Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.604313 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc9f2904-ee6a-4450-bf9d-bf730655ab1b","Type":"ContainerDied","Data":"58cb98155a159b05b44da3d4b473514922242cafa94da9bc38d801f03564412f"} Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.604914 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc9f2904-ee6a-4450-bf9d-bf730655ab1b","Type":"ContainerDied","Data":"1bd3a4acfc656fa437003e466b7ab8f833aa3d2c1a8e8f05cf8fbd719ac74d03"} Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.605011 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc9f2904-ee6a-4450-bf9d-bf730655ab1b","Type":"ContainerDied","Data":"f5defaf5fb910ed285ba72c4cfa89b64e64a8a411a15614bacec8a3a760971f7"} Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.605089 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc9f2904-ee6a-4450-bf9d-bf730655ab1b","Type":"ContainerDied","Data":"66114ef1565934630d1ec336af2ae7fa19efe8c2fbc0d765e957ff00cf32730c"} Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.611244 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b447cf0-3120-4329-9dbf-534fd45e70bf-config-data-custom\") pod \"heat-cfnapi-fb97f87fc-9tb45\" (UID: \"6b447cf0-3120-4329-9dbf-534fd45e70bf\") " pod="openstack/heat-cfnapi-fb97f87fc-9tb45" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.612065 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mpv6\" (UniqueName: \"kubernetes.io/projected/6b447cf0-3120-4329-9dbf-534fd45e70bf-kube-api-access-7mpv6\") pod \"heat-cfnapi-fb97f87fc-9tb45\" (UID: \"6b447cf0-3120-4329-9dbf-534fd45e70bf\") " pod="openstack/heat-cfnapi-fb97f87fc-9tb45" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.612160 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b447cf0-3120-4329-9dbf-534fd45e70bf-public-tls-certs\") pod \"heat-cfnapi-fb97f87fc-9tb45\" (UID: \"6b447cf0-3120-4329-9dbf-534fd45e70bf\") " pod="openstack/heat-cfnapi-fb97f87fc-9tb45" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.612232 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b447cf0-3120-4329-9dbf-534fd45e70bf-combined-ca-bundle\") pod \"heat-cfnapi-fb97f87fc-9tb45\" (UID: \"6b447cf0-3120-4329-9dbf-534fd45e70bf\") " pod="openstack/heat-cfnapi-fb97f87fc-9tb45" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.612362 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6b447cf0-3120-4329-9dbf-534fd45e70bf-internal-tls-certs\") pod \"heat-cfnapi-fb97f87fc-9tb45\" (UID: \"6b447cf0-3120-4329-9dbf-534fd45e70bf\") " pod="openstack/heat-cfnapi-fb97f87fc-9tb45" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.612455 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b447cf0-3120-4329-9dbf-534fd45e70bf-config-data\") pod \"heat-cfnapi-fb97f87fc-9tb45\" (UID: \"6b447cf0-3120-4329-9dbf-534fd45e70bf\") " pod="openstack/heat-cfnapi-fb97f87fc-9tb45" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.627688 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b447cf0-3120-4329-9dbf-534fd45e70bf-internal-tls-certs\") pod \"heat-cfnapi-fb97f87fc-9tb45\" (UID: \"6b447cf0-3120-4329-9dbf-534fd45e70bf\") " pod="openstack/heat-cfnapi-fb97f87fc-9tb45" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.628041 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b447cf0-3120-4329-9dbf-534fd45e70bf-public-tls-certs\") pod \"heat-cfnapi-fb97f87fc-9tb45\" (UID: \"6b447cf0-3120-4329-9dbf-534fd45e70bf\") " pod="openstack/heat-cfnapi-fb97f87fc-9tb45" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.628928 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b447cf0-3120-4329-9dbf-534fd45e70bf-config-data\") pod \"heat-cfnapi-fb97f87fc-9tb45\" (UID: \"6b447cf0-3120-4329-9dbf-534fd45e70bf\") " pod="openstack/heat-cfnapi-fb97f87fc-9tb45" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.629647 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b447cf0-3120-4329-9dbf-534fd45e70bf-combined-ca-bundle\") pod 
\"heat-cfnapi-fb97f87fc-9tb45\" (UID: \"6b447cf0-3120-4329-9dbf-534fd45e70bf\") " pod="openstack/heat-cfnapi-fb97f87fc-9tb45" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.635540 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b447cf0-3120-4329-9dbf-534fd45e70bf-config-data-custom\") pod \"heat-cfnapi-fb97f87fc-9tb45\" (UID: \"6b447cf0-3120-4329-9dbf-534fd45e70bf\") " pod="openstack/heat-cfnapi-fb97f87fc-9tb45" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.643297 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mpv6\" (UniqueName: \"kubernetes.io/projected/6b447cf0-3120-4329-9dbf-534fd45e70bf-kube-api-access-7mpv6\") pod \"heat-cfnapi-fb97f87fc-9tb45\" (UID: \"6b447cf0-3120-4329-9dbf-534fd45e70bf\") " pod="openstack/heat-cfnapi-fb97f87fc-9tb45" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.674624 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.676293 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7bbc77c95-pfg84" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.679163 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.679349 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ef6730be-d6cf-42b1-b356-fa08748e42ef" containerName="glance-log" containerID="cri-o://08707d05ffe742c3f5c6c21e5f25a34d24910becd2d02ad5be4eab681eb31658" gracePeriod=30 Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.679484 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="ef6730be-d6cf-42b1-b356-fa08748e42ef" containerName="glance-httpd" containerID="cri-o://390f7c8080013656a2d8669a107436cce0d8b4d4ef8b4bcf595508ac814735b2" gracePeriod=30 Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.705397 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5ff95dc669-rzhtr" Mar 13 09:32:52 crc kubenswrapper[4841]: I0313 09:32:52.906424 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-fb97f87fc-9tb45" Mar 13 09:32:53 crc kubenswrapper[4841]: I0313 09:32:53.614146 4841 generic.go:334] "Generic (PLEG): container finished" podID="c4495f16-0265-4bb4-a725-0f1f3da3387f" containerID="42e05a6b89dc9c913880642ff5e05c706ffa5e68c98ab5297c060f0bfca37084" exitCode=0 Mar 13 09:32:53 crc kubenswrapper[4841]: I0313 09:32:53.614173 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-77dcd58b46-2hq68" event={"ID":"c4495f16-0265-4bb4-a725-0f1f3da3387f","Type":"ContainerDied","Data":"42e05a6b89dc9c913880642ff5e05c706ffa5e68c98ab5297c060f0bfca37084"} Mar 13 09:32:53 crc kubenswrapper[4841]: I0313 09:32:53.616001 4841 generic.go:334] "Generic (PLEG): container finished" podID="a28ce499-0db0-4fae-88ff-35a61bb3df64" containerID="ce58e81af6719161e5a2965021e2f761ea54ec4b73292768362474bd1d043999" exitCode=0 Mar 13 09:32:53 crc kubenswrapper[4841]: I0313 09:32:53.616043 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-78489f9c65-6dcg2" event={"ID":"a28ce499-0db0-4fae-88ff-35a61bb3df64","Type":"ContainerDied","Data":"ce58e81af6719161e5a2965021e2f761ea54ec4b73292768362474bd1d043999"} Mar 13 09:32:53 crc kubenswrapper[4841]: I0313 09:32:53.618867 4841 generic.go:334] "Generic (PLEG): container finished" podID="ef6730be-d6cf-42b1-b356-fa08748e42ef" containerID="08707d05ffe742c3f5c6c21e5f25a34d24910becd2d02ad5be4eab681eb31658" exitCode=143 Mar 13 09:32:53 crc kubenswrapper[4841]: I0313 09:32:53.619694 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ef6730be-d6cf-42b1-b356-fa08748e42ef","Type":"ContainerDied","Data":"08707d05ffe742c3f5c6c21e5f25a34d24910becd2d02ad5be4eab681eb31658"} Mar 13 09:32:55 crc kubenswrapper[4841]: I0313 09:32:55.492425 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:32:55 crc kubenswrapper[4841]: I0313 09:32:55.556866 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-trm7n"] Mar 13 09:32:55 crc kubenswrapper[4841]: I0313 09:32:55.557608 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" podUID="f59c46bc-54c2-4d96-b12e-fb39ff4a7456" containerName="dnsmasq-dns" containerID="cri-o://5663b847f2f4a88c65178ec28d4db45c210f4741b15739977833145ce605fe7d" gracePeriod=10 Mar 13 09:32:55 crc kubenswrapper[4841]: I0313 09:32:55.583869 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-77dcd58b46-2hq68" podUID="c4495f16-0265-4bb4-a725-0f1f3da3387f" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.180:8000/healthcheck\": dial tcp 10.217.0.180:8000: connect: connection refused" Mar 13 09:32:55 crc kubenswrapper[4841]: I0313 09:32:55.589669 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-78489f9c65-6dcg2" podUID="a28ce499-0db0-4fae-88ff-35a61bb3df64" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.181:8004/healthcheck\": dial tcp 10.217.0.181:8004: connect: connection refused" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.005687 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-78489f9c65-6dcg2" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.182140 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a28ce499-0db0-4fae-88ff-35a61bb3df64-config-data-custom\") pod \"a28ce499-0db0-4fae-88ff-35a61bb3df64\" (UID: \"a28ce499-0db0-4fae-88ff-35a61bb3df64\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.183317 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a28ce499-0db0-4fae-88ff-35a61bb3df64-config-data\") pod \"a28ce499-0db0-4fae-88ff-35a61bb3df64\" (UID: \"a28ce499-0db0-4fae-88ff-35a61bb3df64\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.183504 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tw87\" (UniqueName: \"kubernetes.io/projected/a28ce499-0db0-4fae-88ff-35a61bb3df64-kube-api-access-5tw87\") pod \"a28ce499-0db0-4fae-88ff-35a61bb3df64\" (UID: \"a28ce499-0db0-4fae-88ff-35a61bb3df64\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.183554 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28ce499-0db0-4fae-88ff-35a61bb3df64-combined-ca-bundle\") pod \"a28ce499-0db0-4fae-88ff-35a61bb3df64\" (UID: \"a28ce499-0db0-4fae-88ff-35a61bb3df64\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.191718 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a28ce499-0db0-4fae-88ff-35a61bb3df64-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a28ce499-0db0-4fae-88ff-35a61bb3df64" (UID: "a28ce499-0db0-4fae-88ff-35a61bb3df64"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.194424 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a28ce499-0db0-4fae-88ff-35a61bb3df64-kube-api-access-5tw87" (OuterVolumeSpecName: "kube-api-access-5tw87") pod "a28ce499-0db0-4fae-88ff-35a61bb3df64" (UID: "a28ce499-0db0-4fae-88ff-35a61bb3df64"). InnerVolumeSpecName "kube-api-access-5tw87". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.221276 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.256956 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a28ce499-0db0-4fae-88ff-35a61bb3df64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a28ce499-0db0-4fae-88ff-35a61bb3df64" (UID: "a28ce499-0db0-4fae-88ff-35a61bb3df64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.260378 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a28ce499-0db0-4fae-88ff-35a61bb3df64-config-data" (OuterVolumeSpecName: "config-data") pod "a28ce499-0db0-4fae-88ff-35a61bb3df64" (UID: "a28ce499-0db0-4fae-88ff-35a61bb3df64"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.286855 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-config-data\") pod \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.286898 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-log-httpd\") pod \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.286948 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-scripts\") pod \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.287009 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-sg-core-conf-yaml\") pod \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.287133 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-run-httpd\") pod \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.287166 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-combined-ca-bundle\") pod \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.287210 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcrfb\" (UniqueName: \"kubernetes.io/projected/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-kube-api-access-dcrfb\") pod \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\" (UID: \"bc9f2904-ee6a-4450-bf9d-bf730655ab1b\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.287669 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tw87\" (UniqueName: \"kubernetes.io/projected/a28ce499-0db0-4fae-88ff-35a61bb3df64-kube-api-access-5tw87\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.287682 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28ce499-0db0-4fae-88ff-35a61bb3df64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.287691 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a28ce499-0db0-4fae-88ff-35a61bb3df64-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.287701 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a28ce499-0db0-4fae-88ff-35a61bb3df64-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.288118 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bc9f2904-ee6a-4450-bf9d-bf730655ab1b" (UID: "bc9f2904-ee6a-4450-bf9d-bf730655ab1b"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.288674 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bc9f2904-ee6a-4450-bf9d-bf730655ab1b" (UID: "bc9f2904-ee6a-4450-bf9d-bf730655ab1b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.290448 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-scripts" (OuterVolumeSpecName: "scripts") pod "bc9f2904-ee6a-4450-bf9d-bf730655ab1b" (UID: "bc9f2904-ee6a-4450-bf9d-bf730655ab1b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.313399 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-kube-api-access-dcrfb" (OuterVolumeSpecName: "kube-api-access-dcrfb") pod "bc9f2904-ee6a-4450-bf9d-bf730655ab1b" (UID: "bc9f2904-ee6a-4450-bf9d-bf730655ab1b"). InnerVolumeSpecName "kube-api-access-dcrfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.327693 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bc9f2904-ee6a-4450-bf9d-bf730655ab1b" (UID: "bc9f2904-ee6a-4450-bf9d-bf730655ab1b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.339926 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.389021 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbxtj\" (UniqueName: \"kubernetes.io/projected/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-kube-api-access-nbxtj\") pod \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.389125 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-ovsdbserver-nb\") pod \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.389196 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-ovsdbserver-sb\") pod \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.389289 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-dns-svc\") pod \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.389374 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-config\") pod \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.389422 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-dns-swift-storage-0\") pod \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\" (UID: \"f59c46bc-54c2-4d96-b12e-fb39ff4a7456\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.389894 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.389918 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.389928 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.389936 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.389947 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcrfb\" (UniqueName: \"kubernetes.io/projected/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-kube-api-access-dcrfb\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.404949 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc9f2904-ee6a-4450-bf9d-bf730655ab1b" (UID: "bc9f2904-ee6a-4450-bf9d-bf730655ab1b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.405064 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-kube-api-access-nbxtj" (OuterVolumeSpecName: "kube-api-access-nbxtj") pod "f59c46bc-54c2-4d96-b12e-fb39ff4a7456" (UID: "f59c46bc-54c2-4d96-b12e-fb39ff4a7456"). InnerVolumeSpecName "kube-api-access-nbxtj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.408794 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-77dcd58b46-2hq68" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.431081 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-config-data" (OuterVolumeSpecName: "config-data") pod "bc9f2904-ee6a-4450-bf9d-bf730655ab1b" (UID: "bc9f2904-ee6a-4450-bf9d-bf730655ab1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.473965 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f59c46bc-54c2-4d96-b12e-fb39ff4a7456" (UID: "f59c46bc-54c2-4d96-b12e-fb39ff4a7456"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: W0313 09:32:56.477423 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod595c0935_7197_4c48_be0d_8a3ad4d6442d.slice/crio-1d4ee340ac3385139199291ea553b7cb9a69a697e107955adbc905f56037c59f WatchSource:0}: Error finding container 1d4ee340ac3385139199291ea553b7cb9a69a697e107955adbc905f56037c59f: Status 404 returned error can't find the container with id 1d4ee340ac3385139199291ea553b7cb9a69a697e107955adbc905f56037c59f Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.478531 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5ff95dc669-rzhtr"] Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.479327 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f59c46bc-54c2-4d96-b12e-fb39ff4a7456" (UID: "f59c46bc-54c2-4d96-b12e-fb39ff4a7456"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.491313 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4495f16-0265-4bb4-a725-0f1f3da3387f-combined-ca-bundle\") pod \"c4495f16-0265-4bb4-a725-0f1f3da3387f\" (UID: \"c4495f16-0265-4bb4-a725-0f1f3da3387f\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.491399 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4495f16-0265-4bb4-a725-0f1f3da3387f-config-data-custom\") pod \"c4495f16-0265-4bb4-a725-0f1f3da3387f\" (UID: \"c4495f16-0265-4bb4-a725-0f1f3da3387f\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.491456 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4495f16-0265-4bb4-a725-0f1f3da3387f-config-data\") pod \"c4495f16-0265-4bb4-a725-0f1f3da3387f\" (UID: \"c4495f16-0265-4bb4-a725-0f1f3da3387f\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.491525 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27gcv\" (UniqueName: \"kubernetes.io/projected/c4495f16-0265-4bb4-a725-0f1f3da3387f-kube-api-access-27gcv\") pod \"c4495f16-0265-4bb4-a725-0f1f3da3387f\" (UID: \"c4495f16-0265-4bb4-a725-0f1f3da3387f\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.499880 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.499921 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.499937 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbxtj\" (UniqueName: \"kubernetes.io/projected/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-kube-api-access-nbxtj\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.499953 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc9f2904-ee6a-4450-bf9d-bf730655ab1b-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.499964 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.515837 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4495f16-0265-4bb4-a725-0f1f3da3387f-kube-api-access-27gcv" (OuterVolumeSpecName: "kube-api-access-27gcv") pod "c4495f16-0265-4bb4-a725-0f1f3da3387f" (UID: "c4495f16-0265-4bb4-a725-0f1f3da3387f"). InnerVolumeSpecName "kube-api-access-27gcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.516671 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4495f16-0265-4bb4-a725-0f1f3da3387f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c4495f16-0265-4bb4-a725-0f1f3da3387f" (UID: "c4495f16-0265-4bb4-a725-0f1f3da3387f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.517045 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-config" (OuterVolumeSpecName: "config") pod "f59c46bc-54c2-4d96-b12e-fb39ff4a7456" (UID: "f59c46bc-54c2-4d96-b12e-fb39ff4a7456"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.526352 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f59c46bc-54c2-4d96-b12e-fb39ff4a7456" (UID: "f59c46bc-54c2-4d96-b12e-fb39ff4a7456"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.534520 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f59c46bc-54c2-4d96-b12e-fb39ff4a7456" (UID: "f59c46bc-54c2-4d96-b12e-fb39ff4a7456"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.542401 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.554930 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4495f16-0265-4bb4-a725-0f1f3da3387f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4495f16-0265-4bb4-a725-0f1f3da3387f" (UID: "c4495f16-0265-4bb4-a725-0f1f3da3387f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.582047 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4495f16-0265-4bb4-a725-0f1f3da3387f-config-data" (OuterVolumeSpecName: "config-data") pod "c4495f16-0265-4bb4-a725-0f1f3da3387f" (UID: "c4495f16-0265-4bb4-a725-0f1f3da3387f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.601025 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-combined-ca-bundle\") pod \"ef6730be-d6cf-42b1-b356-fa08748e42ef\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.601081 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-scripts\") pod \"ef6730be-d6cf-42b1-b356-fa08748e42ef\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.601137 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef6730be-d6cf-42b1-b356-fa08748e42ef-logs\") pod \"ef6730be-d6cf-42b1-b356-fa08748e42ef\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.601306 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef6730be-d6cf-42b1-b356-fa08748e42ef-httpd-run\") pod \"ef6730be-d6cf-42b1-b356-fa08748e42ef\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.601338 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-public-tls-certs\") pod \"ef6730be-d6cf-42b1-b356-fa08748e42ef\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.602039 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ef6730be-d6cf-42b1-b356-fa08748e42ef\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.602039 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef6730be-d6cf-42b1-b356-fa08748e42ef-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ef6730be-d6cf-42b1-b356-fa08748e42ef" (UID: "ef6730be-d6cf-42b1-b356-fa08748e42ef"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.602082 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-config-data\") pod \"ef6730be-d6cf-42b1-b356-fa08748e42ef\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.602177 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdmkf\" (UniqueName: \"kubernetes.io/projected/ef6730be-d6cf-42b1-b356-fa08748e42ef-kube-api-access-bdmkf\") pod \"ef6730be-d6cf-42b1-b356-fa08748e42ef\" (UID: \"ef6730be-d6cf-42b1-b356-fa08748e42ef\") " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.602379 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef6730be-d6cf-42b1-b356-fa08748e42ef-logs" (OuterVolumeSpecName: "logs") pod "ef6730be-d6cf-42b1-b356-fa08748e42ef" (UID: "ef6730be-d6cf-42b1-b356-fa08748e42ef"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.602706 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.602752 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4495f16-0265-4bb4-a725-0f1f3da3387f-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.602788 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4495f16-0265-4bb4-a725-0f1f3da3387f-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.602801 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.602815 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27gcv\" (UniqueName: \"kubernetes.io/projected/c4495f16-0265-4bb4-a725-0f1f3da3387f-kube-api-access-27gcv\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.602828 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef6730be-d6cf-42b1-b356-fa08748e42ef-logs\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.602840 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f59c46bc-54c2-4d96-b12e-fb39ff4a7456-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.602852 
4841 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef6730be-d6cf-42b1-b356-fa08748e42ef-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.602863 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4495f16-0265-4bb4-a725-0f1f3da3387f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.607081 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "ef6730be-d6cf-42b1-b356-fa08748e42ef" (UID: "ef6730be-d6cf-42b1-b356-fa08748e42ef"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.608300 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-scripts" (OuterVolumeSpecName: "scripts") pod "ef6730be-d6cf-42b1-b356-fa08748e42ef" (UID: "ef6730be-d6cf-42b1-b356-fa08748e42ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.608749 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef6730be-d6cf-42b1-b356-fa08748e42ef-kube-api-access-bdmkf" (OuterVolumeSpecName: "kube-api-access-bdmkf") pod "ef6730be-d6cf-42b1-b356-fa08748e42ef" (UID: "ef6730be-d6cf-42b1-b356-fa08748e42ef"). InnerVolumeSpecName "kube-api-access-bdmkf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.629651 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef6730be-d6cf-42b1-b356-fa08748e42ef" (UID: "ef6730be-d6cf-42b1-b356-fa08748e42ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.651562 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-77dcd58b46-2hq68" event={"ID":"c4495f16-0265-4bb4-a725-0f1f3da3387f","Type":"ContainerDied","Data":"563157c931c06779fdce30595cd2ae2e8bfae2637d13b1f79242ceb60ba64504"} Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.651605 4841 scope.go:117] "RemoveContainer" containerID="42e05a6b89dc9c913880642ff5e05c706ffa5e68c98ab5297c060f0bfca37084" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.651730 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-77dcd58b46-2hq68" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.654966 4841 generic.go:334] "Generic (PLEG): container finished" podID="f59c46bc-54c2-4d96-b12e-fb39ff4a7456" containerID="5663b847f2f4a88c65178ec28d4db45c210f4741b15739977833145ce605fe7d" exitCode=0 Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.655037 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" event={"ID":"f59c46bc-54c2-4d96-b12e-fb39ff4a7456","Type":"ContainerDied","Data":"5663b847f2f4a88c65178ec28d4db45c210f4741b15739977833145ce605fe7d"} Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.655071 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" event={"ID":"f59c46bc-54c2-4d96-b12e-fb39ff4a7456","Type":"ContainerDied","Data":"3dcaec3acaf47a2dcd79b757115fba319ee79a407965c906018caf475b37707c"} Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.655146 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-trm7n" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.664114 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-78489f9c65-6dcg2" event={"ID":"a28ce499-0db0-4fae-88ff-35a61bb3df64","Type":"ContainerDied","Data":"40fefcd5936bdc07ea74c3c3753deea8d120311057ef6140a8213f87d77949bc"} Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.664204 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-78489f9c65-6dcg2" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.666699 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5ff95dc669-rzhtr" event={"ID":"595c0935-7197-4c48-be0d-8a3ad4d6442d","Type":"ContainerStarted","Data":"1d4ee340ac3385139199291ea553b7cb9a69a697e107955adbc905f56037c59f"} Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.677484 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.682397 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc9f2904-ee6a-4450-bf9d-bf730655ab1b","Type":"ContainerDied","Data":"2b904637aedaab5c747f9661fedda31222cbe0edeb3e6fbbeabb42958ff9b834"} Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.688058 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-config-data" (OuterVolumeSpecName: "config-data") pod "ef6730be-d6cf-42b1-b356-fa08748e42ef" (UID: "ef6730be-d6cf-42b1-b356-fa08748e42ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.691058 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ef6730be-d6cf-42b1-b356-fa08748e42ef" (UID: "ef6730be-d6cf-42b1-b356-fa08748e42ef"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.694247 4841 generic.go:334] "Generic (PLEG): container finished" podID="ef6730be-d6cf-42b1-b356-fa08748e42ef" containerID="390f7c8080013656a2d8669a107436cce0d8b4d4ef8b4bcf595508ac814735b2" exitCode=0 Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.694324 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ef6730be-d6cf-42b1-b356-fa08748e42ef","Type":"ContainerDied","Data":"390f7c8080013656a2d8669a107436cce0d8b4d4ef8b4bcf595508ac814735b2"} Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.694349 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ef6730be-d6cf-42b1-b356-fa08748e42ef","Type":"ContainerDied","Data":"275545c67d37e760b7e7d7af8e82b47daf54424a8ce5cef415bd80e89f140824"} Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.694404 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.700388 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"df3b0fd2-0003-42f1-b746-72231cfad7a0","Type":"ContainerStarted","Data":"21f80dc2ef13fcf4a80b162545cceef4071732964b2ce78b2aa957765d00dd46"} Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.704606 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.704635 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.704644 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.704691 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.704704 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef6730be-d6cf-42b1-b356-fa08748e42ef-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.704713 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdmkf\" (UniqueName: \"kubernetes.io/projected/ef6730be-d6cf-42b1-b356-fa08748e42ef-kube-api-access-bdmkf\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc 
kubenswrapper[4841]: I0313 09:32:56.741222 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6d768747c7-2ssnn" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.758211 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.574612864 podStartE2EDuration="16.758194838s" podCreationTimestamp="2026-03-13 09:32:40 +0000 UTC" firstStartedPulling="2026-03-13 09:32:41.73796064 +0000 UTC m=+1244.467860831" lastFinishedPulling="2026-03-13 09:32:55.921542614 +0000 UTC m=+1258.651442805" observedRunningTime="2026-03-13 09:32:56.722602028 +0000 UTC m=+1259.452502219" watchObservedRunningTime="2026-03-13 09:32:56.758194838 +0000 UTC m=+1259.488095029" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.761250 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.807513 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.812063 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5864fb6bd-dr6nw"] Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.812252 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5864fb6bd-dr6nw" podUID="cf5e75f4-273c-49c5-afda-9483dfcf1ff3" containerName="neutron-api" containerID="cri-o://529a9c9ded573845cf459d9f5c8b035f9c2483362f12bb2aac07150f813ae2f1" gracePeriod=30 Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.812599 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5864fb6bd-dr6nw" podUID="cf5e75f4-273c-49c5-afda-9483dfcf1ff3" 
containerName="neutron-httpd" containerID="cri-o://0fcd783f0b79c7ed83ed485507c30a3aaeeb91acfbc08b832607177884e12c31" gracePeriod=30 Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.875084 4841 scope.go:117] "RemoveContainer" containerID="5663b847f2f4a88c65178ec28d4db45c210f4741b15739977833145ce605fe7d" Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.880187 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-fb97f87fc-9tb45"] Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.890185 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-78489f9c65-6dcg2"] Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.898380 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-78489f9c65-6dcg2"] Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.906332 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-74f4d87c9f-bw7dr"] Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.918561 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 09:32:56 crc kubenswrapper[4841]: W0313 09:32:56.920601 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1ddc522_5255_4785_8d33_85a3d0e86af2.slice/crio-8734f277b026ae5b5161432d67436bf9b91d375ace5a0eb583921af0ea56bdd2 WatchSource:0}: Error finding container 8734f277b026ae5b5161432d67436bf9b91d375ace5a0eb583921af0ea56bdd2: Status 404 returned error can't find the container with id 8734f277b026ae5b5161432d67436bf9b91d375ace5a0eb583921af0ea56bdd2 Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.928374 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 09:32:56 crc kubenswrapper[4841]: I0313 09:32:56.988451 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-77dcd58b46-2hq68"] Mar 
13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.011666 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-77dcd58b46-2hq68"] Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.030023 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-d9dc96f5d-k4b4n"] Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.050417 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.050504 4841 scope.go:117] "RemoveContainer" containerID="9598417d7c71caf6981889518592c0779ad80e9b50513981a594c0d81f777983" Mar 13 09:32:57 crc kubenswrapper[4841]: E0313 09:32:57.051670 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9f2904-ee6a-4450-bf9d-bf730655ab1b" containerName="proxy-httpd" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.051712 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9f2904-ee6a-4450-bf9d-bf730655ab1b" containerName="proxy-httpd" Mar 13 09:32:57 crc kubenswrapper[4841]: E0313 09:32:57.051747 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a28ce499-0db0-4fae-88ff-35a61bb3df64" containerName="heat-api" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.051757 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a28ce499-0db0-4fae-88ff-35a61bb3df64" containerName="heat-api" Mar 13 09:32:57 crc kubenswrapper[4841]: E0313 09:32:57.051778 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9f2904-ee6a-4450-bf9d-bf730655ab1b" containerName="ceilometer-notification-agent" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.051786 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9f2904-ee6a-4450-bf9d-bf730655ab1b" containerName="ceilometer-notification-agent" Mar 13 09:32:57 crc kubenswrapper[4841]: E0313 09:32:57.051806 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bc9f2904-ee6a-4450-bf9d-bf730655ab1b" containerName="sg-core" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.051814 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9f2904-ee6a-4450-bf9d-bf730655ab1b" containerName="sg-core" Mar 13 09:32:57 crc kubenswrapper[4841]: E0313 09:32:57.051832 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f59c46bc-54c2-4d96-b12e-fb39ff4a7456" containerName="dnsmasq-dns" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.051848 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f59c46bc-54c2-4d96-b12e-fb39ff4a7456" containerName="dnsmasq-dns" Mar 13 09:32:57 crc kubenswrapper[4841]: E0313 09:32:57.051885 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f59c46bc-54c2-4d96-b12e-fb39ff4a7456" containerName="init" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.051894 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f59c46bc-54c2-4d96-b12e-fb39ff4a7456" containerName="init" Mar 13 09:32:57 crc kubenswrapper[4841]: E0313 09:32:57.051914 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef6730be-d6cf-42b1-b356-fa08748e42ef" containerName="glance-httpd" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.051932 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef6730be-d6cf-42b1-b356-fa08748e42ef" containerName="glance-httpd" Mar 13 09:32:57 crc kubenswrapper[4841]: E0313 09:32:57.051951 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4495f16-0265-4bb4-a725-0f1f3da3387f" containerName="heat-cfnapi" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.051959 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4495f16-0265-4bb4-a725-0f1f3da3387f" containerName="heat-cfnapi" Mar 13 09:32:57 crc kubenswrapper[4841]: E0313 09:32:57.051981 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef6730be-d6cf-42b1-b356-fa08748e42ef" containerName="glance-log" 
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.051990 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef6730be-d6cf-42b1-b356-fa08748e42ef" containerName="glance-log" Mar 13 09:32:57 crc kubenswrapper[4841]: E0313 09:32:57.052015 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9f2904-ee6a-4450-bf9d-bf730655ab1b" containerName="ceilometer-central-agent" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.052023 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9f2904-ee6a-4450-bf9d-bf730655ab1b" containerName="ceilometer-central-agent" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.052672 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc9f2904-ee6a-4450-bf9d-bf730655ab1b" containerName="ceilometer-central-agent" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.052716 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef6730be-d6cf-42b1-b356-fa08748e42ef" containerName="glance-httpd" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.052742 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4495f16-0265-4bb4-a725-0f1f3da3387f" containerName="heat-cfnapi" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.052754 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc9f2904-ee6a-4450-bf9d-bf730655ab1b" containerName="proxy-httpd" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.052772 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc9f2904-ee6a-4450-bf9d-bf730655ab1b" containerName="sg-core" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.052801 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f59c46bc-54c2-4d96-b12e-fb39ff4a7456" containerName="dnsmasq-dns" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.052893 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc9f2904-ee6a-4450-bf9d-bf730655ab1b" 
containerName="ceilometer-notification-agent" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.052915 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef6730be-d6cf-42b1-b356-fa08748e42ef" containerName="glance-log" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.052939 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a28ce499-0db0-4fae-88ff-35a61bb3df64" containerName="heat-api" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.055761 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.058989 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.060095 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.098182 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-trm7n"] Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.123195 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-trm7n"] Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.124658 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.124787 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.124873 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9-logs\") pod \"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.124945 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.125013 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25srj\" (UniqueName: \"kubernetes.io/projected/3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9-kube-api-access-25srj\") pod \"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.125094 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.125169 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.125309 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.136347 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.147709 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.157305 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-54454c48fd-qvmtx"] Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.188348 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.228453 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.230077 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.230152 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.230190 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9-config-data\") pod \"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.230216 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9-logs\") pod \"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.230239 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.230275 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25srj\" (UniqueName: \"kubernetes.io/projected/3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9-kube-api-access-25srj\") pod \"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0" Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.230306 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " 
pod="openstack/glance-default-external-api-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.230333 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9-scripts\") pod \"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.230679 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.232036 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.232068 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.234874 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9-logs\") pod \"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.235156 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.235366 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.235403 4841 scope.go:117] "RemoveContainer" containerID="5663b847f2f4a88c65178ec28d4db45c210f4741b15739977833145ce605fe7d"
Mar 13 09:32:57 crc kubenswrapper[4841]: E0313 09:32:57.236135 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5663b847f2f4a88c65178ec28d4db45c210f4741b15739977833145ce605fe7d\": container with ID starting with 5663b847f2f4a88c65178ec28d4db45c210f4741b15739977833145ce605fe7d not found: ID does not exist" containerID="5663b847f2f4a88c65178ec28d4db45c210f4741b15739977833145ce605fe7d"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.236183 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5663b847f2f4a88c65178ec28d4db45c210f4741b15739977833145ce605fe7d"} err="failed to get container status \"5663b847f2f4a88c65178ec28d4db45c210f4741b15739977833145ce605fe7d\": rpc error: code = NotFound desc = could not find container \"5663b847f2f4a88c65178ec28d4db45c210f4741b15739977833145ce605fe7d\": container with ID starting with 5663b847f2f4a88c65178ec28d4db45c210f4741b15739977833145ce605fe7d not found: ID does not exist"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.236209 4841 scope.go:117] "RemoveContainer" containerID="9598417d7c71caf6981889518592c0779ad80e9b50513981a594c0d81f777983"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.239579 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: E0313 09:32:57.241818 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9598417d7c71caf6981889518592c0779ad80e9b50513981a594c0d81f777983\": container with ID starting with 9598417d7c71caf6981889518592c0779ad80e9b50513981a594c0d81f777983 not found: ID does not exist" containerID="9598417d7c71caf6981889518592c0779ad80e9b50513981a594c0d81f777983"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.241855 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9598417d7c71caf6981889518592c0779ad80e9b50513981a594c0d81f777983"} err="failed to get container status \"9598417d7c71caf6981889518592c0779ad80e9b50513981a594c0d81f777983\": rpc error: code = NotFound desc = could not find container \"9598417d7c71caf6981889518592c0779ad80e9b50513981a594c0d81f777983\": container with ID starting with 9598417d7c71caf6981889518592c0779ad80e9b50513981a594c0d81f777983 not found: ID does not exist"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.241880 4841 scope.go:117] "RemoveContainer" containerID="ce58e81af6719161e5a2965021e2f761ea54ec4b73292768362474bd1d043999"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.244373 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.256856 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9-config-data\") pod \"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.261156 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.264012 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9-scripts\") pod \"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.282814 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25srj\" (UniqueName: \"kubernetes.io/projected/3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9-kube-api-access-25srj\") pod \"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.286025 4841 scope.go:117] "RemoveContainer" containerID="58cb98155a159b05b44da3d4b473514922242cafa94da9bc38d801f03564412f"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.304247 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.304583 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="300a1817-5e08-4edc-afec-829b69b0e7e9" containerName="glance-log" containerID="cri-o://2255d926cf9493c05b71b77ba5a003c184fe7a6d72800a5472d5805356137e69" gracePeriod=30
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.305183 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="300a1817-5e08-4edc-afec-829b69b0e7e9" containerName="glance-httpd" containerID="cri-o://54dfd791df3aa8deab0251901ef093886d75e89f47a1e6d147dc35492a1070f9" gracePeriod=30
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.317697 4841 scope.go:117] "RemoveContainer" containerID="1bd3a4acfc656fa437003e466b7ab8f833aa3d2c1a8e8f05cf8fbd719ac74d03"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.331596 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/094f4da7-e278-4c06-aad4-b10985b42c76-run-httpd\") pod \"ceilometer-0\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.331720 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/094f4da7-e278-4c06-aad4-b10985b42c76-log-httpd\") pod \"ceilometer-0\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.331738 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-scripts\") pod \"ceilometer-0\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.331773 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.331790 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-config-data\") pod \"ceilometer-0\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.331807 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.331832 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k5z7\" (UniqueName: \"kubernetes.io/projected/094f4da7-e278-4c06-aad4-b10985b42c76-kube-api-access-2k5z7\") pod \"ceilometer-0\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.333230 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.356577 4841 scope.go:117] "RemoveContainer" containerID="f5defaf5fb910ed285ba72c4cfa89b64e64a8a411a15614bacec8a3a760971f7"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.396093 4841 scope.go:117] "RemoveContainer" containerID="66114ef1565934630d1ec336af2ae7fa19efe8c2fbc0d765e957ff00cf32730c"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.404417 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.433386 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/094f4da7-e278-4c06-aad4-b10985b42c76-run-httpd\") pod \"ceilometer-0\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.433500 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/094f4da7-e278-4c06-aad4-b10985b42c76-log-httpd\") pod \"ceilometer-0\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.433521 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-scripts\") pod \"ceilometer-0\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.433557 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.433573 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-config-data\") pod \"ceilometer-0\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.433591 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.433619 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k5z7\" (UniqueName: \"kubernetes.io/projected/094f4da7-e278-4c06-aad4-b10985b42c76-kube-api-access-2k5z7\") pod \"ceilometer-0\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.434408 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/094f4da7-e278-4c06-aad4-b10985b42c76-log-httpd\") pod \"ceilometer-0\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.434408 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/094f4da7-e278-4c06-aad4-b10985b42c76-run-httpd\") pod \"ceilometer-0\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.439957 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.440703 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.441007 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-scripts\") pod \"ceilometer-0\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.441218 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-config-data\") pod \"ceilometer-0\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.451567 4841 scope.go:117] "RemoveContainer" containerID="390f7c8080013656a2d8669a107436cce0d8b4d4ef8b4bcf595508ac814735b2"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.455540 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k5z7\" (UniqueName: \"kubernetes.io/projected/094f4da7-e278-4c06-aad4-b10985b42c76-kube-api-access-2k5z7\") pod \"ceilometer-0\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.496193 4841 scope.go:117] "RemoveContainer" containerID="08707d05ffe742c3f5c6c21e5f25a34d24910becd2d02ad5be4eab681eb31658"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.517456 4841 scope.go:117] "RemoveContainer" containerID="390f7c8080013656a2d8669a107436cce0d8b4d4ef8b4bcf595508ac814735b2"
Mar 13 09:32:57 crc kubenswrapper[4841]: E0313 09:32:57.549145 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"390f7c8080013656a2d8669a107436cce0d8b4d4ef8b4bcf595508ac814735b2\": container with ID starting with 390f7c8080013656a2d8669a107436cce0d8b4d4ef8b4bcf595508ac814735b2 not found: ID does not exist" containerID="390f7c8080013656a2d8669a107436cce0d8b4d4ef8b4bcf595508ac814735b2"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.549189 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"390f7c8080013656a2d8669a107436cce0d8b4d4ef8b4bcf595508ac814735b2"} err="failed to get container status \"390f7c8080013656a2d8669a107436cce0d8b4d4ef8b4bcf595508ac814735b2\": rpc error: code = NotFound desc = could not find container \"390f7c8080013656a2d8669a107436cce0d8b4d4ef8b4bcf595508ac814735b2\": container with ID starting with 390f7c8080013656a2d8669a107436cce0d8b4d4ef8b4bcf595508ac814735b2 not found: ID does not exist"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.549216 4841 scope.go:117] "RemoveContainer" containerID="08707d05ffe742c3f5c6c21e5f25a34d24910becd2d02ad5be4eab681eb31658"
Mar 13 09:32:57 crc kubenswrapper[4841]: E0313 09:32:57.549513 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08707d05ffe742c3f5c6c21e5f25a34d24910becd2d02ad5be4eab681eb31658\": container with ID starting with 08707d05ffe742c3f5c6c21e5f25a34d24910becd2d02ad5be4eab681eb31658 not found: ID does not exist" containerID="08707d05ffe742c3f5c6c21e5f25a34d24910becd2d02ad5be4eab681eb31658"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.549528 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08707d05ffe742c3f5c6c21e5f25a34d24910becd2d02ad5be4eab681eb31658"} err="failed to get container status \"08707d05ffe742c3f5c6c21e5f25a34d24910becd2d02ad5be4eab681eb31658\": rpc error: code = NotFound desc = could not find container \"08707d05ffe742c3f5c6c21e5f25a34d24910becd2d02ad5be4eab681eb31658\": container with ID starting with 08707d05ffe742c3f5c6c21e5f25a34d24910becd2d02ad5be4eab681eb31658 not found: ID does not exist"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.554144 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.764802 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-74f4d87c9f-bw7dr" event={"ID":"f1ddc522-5255-4785-8d33-85a3d0e86af2","Type":"ContainerStarted","Data":"1cd58a47fd47ce2bd914bada8f115ba52f24a006635cb607e78343ccdb750c89"}
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.764895 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-74f4d87c9f-bw7dr" event={"ID":"f1ddc522-5255-4785-8d33-85a3d0e86af2","Type":"ContainerStarted","Data":"8734f277b026ae5b5161432d67436bf9b91d375ace5a0eb583921af0ea56bdd2"}
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.766517 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-74f4d87c9f-bw7dr"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.794386 4841 generic.go:334] "Generic (PLEG): container finished" podID="fc104487-8dce-43e7-8673-1e9d6a8f5704" containerID="623423860edbb15b8eb3a6ee538db626672f25c5a551e6485a185bb10d17ccf2" exitCode=1
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.794503 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-54454c48fd-qvmtx" event={"ID":"fc104487-8dce-43e7-8673-1e9d6a8f5704","Type":"ContainerDied","Data":"623423860edbb15b8eb3a6ee538db626672f25c5a551e6485a185bb10d17ccf2"}
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.794532 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-54454c48fd-qvmtx" event={"ID":"fc104487-8dce-43e7-8673-1e9d6a8f5704","Type":"ContainerStarted","Data":"37fce69edc5a33f214d2584791725111da1a26ac7688a59cdf7820122244d2e3"}
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.795450 4841 scope.go:117] "RemoveContainer" containerID="623423860edbb15b8eb3a6ee538db626672f25c5a551e6485a185bb10d17ccf2"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.818081 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-74f4d87c9f-bw7dr" podStartSLOduration=6.818053966 podStartE2EDuration="6.818053966s" podCreationTimestamp="2026-03-13 09:32:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:32:57.797779244 +0000 UTC m=+1260.527679435" watchObservedRunningTime="2026-03-13 09:32:57.818053966 +0000 UTC m=+1260.547954157"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.835856 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-fb97f87fc-9tb45" event={"ID":"6b447cf0-3120-4329-9dbf-534fd45e70bf","Type":"ContainerStarted","Data":"7845aa65fd42ded58cf695f3a1c57c47282a1967ed9d9201bd4140ee99f65797"}
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.835911 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-fb97f87fc-9tb45" event={"ID":"6b447cf0-3120-4329-9dbf-534fd45e70bf","Type":"ContainerStarted","Data":"53ce9cf9d7b8c69d6f7cf1597f00773f91280e1744d109f185ec8cda7baa7ffb"}
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.837172 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-fb97f87fc-9tb45"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.851386 4841 generic.go:334] "Generic (PLEG): container finished" podID="795b0aeb-4a18-40aa-88b9-9e8b9130e0a6" containerID="c285b3128fbf6383019e6e3e0f885d22fd5d30a1f0c8c9fd27e09c37bb26c44c" exitCode=1
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.851489 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-d9dc96f5d-k4b4n" event={"ID":"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6","Type":"ContainerDied","Data":"c285b3128fbf6383019e6e3e0f885d22fd5d30a1f0c8c9fd27e09c37bb26c44c"}
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.851540 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-d9dc96f5d-k4b4n" event={"ID":"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6","Type":"ContainerStarted","Data":"67628017301f38316639299b0511d893836cf175b4abf06440d30215a81670e6"}
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.852441 4841 scope.go:117] "RemoveContainer" containerID="c285b3128fbf6383019e6e3e0f885d22fd5d30a1f0c8c9fd27e09c37bb26c44c"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.867810 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-fb97f87fc-9tb45" podStartSLOduration=5.867789238 podStartE2EDuration="5.867789238s" podCreationTimestamp="2026-03-13 09:32:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:32:57.861417069 +0000 UTC m=+1260.591317260" watchObservedRunningTime="2026-03-13 09:32:57.867789238 +0000 UTC m=+1260.597689429"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.872898 4841 generic.go:334] "Generic (PLEG): container finished" podID="300a1817-5e08-4edc-afec-829b69b0e7e9" containerID="2255d926cf9493c05b71b77ba5a003c184fe7a6d72800a5472d5805356137e69" exitCode=143
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.872983 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"300a1817-5e08-4edc-afec-829b69b0e7e9","Type":"ContainerDied","Data":"2255d926cf9493c05b71b77ba5a003c184fe7a6d72800a5472d5805356137e69"}
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.896230 4841 generic.go:334] "Generic (PLEG): container finished" podID="cf5e75f4-273c-49c5-afda-9483dfcf1ff3" containerID="0fcd783f0b79c7ed83ed485507c30a3aaeeb91acfbc08b832607177884e12c31" exitCode=0
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.896326 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5864fb6bd-dr6nw" event={"ID":"cf5e75f4-273c-49c5-afda-9483dfcf1ff3","Type":"ContainerDied","Data":"0fcd783f0b79c7ed83ed485507c30a3aaeeb91acfbc08b832607177884e12c31"}
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.903168 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5ff95dc669-rzhtr" event={"ID":"595c0935-7197-4c48-be0d-8a3ad4d6442d","Type":"ContainerStarted","Data":"bc5387ff07fda8690bf6ea3586c8e443133b645f86c7a725e996664c60b73720"}
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.904609 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5ff95dc669-rzhtr"
Mar 13 09:32:57 crc kubenswrapper[4841]: I0313 09:32:57.977463 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5ff95dc669-rzhtr" podStartSLOduration=5.977442969 podStartE2EDuration="5.977442969s" podCreationTimestamp="2026-03-13 09:32:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:32:57.930805875 +0000 UTC m=+1260.660706066" watchObservedRunningTime="2026-03-13 09:32:57.977442969 +0000 UTC m=+1260.707343160"
Mar 13 09:32:58 crc kubenswrapper[4841]: I0313 09:32:58.061853 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a28ce499-0db0-4fae-88ff-35a61bb3df64" path="/var/lib/kubelet/pods/a28ce499-0db0-4fae-88ff-35a61bb3df64/volumes"
Mar 13 09:32:58 crc kubenswrapper[4841]: I0313 09:32:58.063174 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc9f2904-ee6a-4450-bf9d-bf730655ab1b" path="/var/lib/kubelet/pods/bc9f2904-ee6a-4450-bf9d-bf730655ab1b/volumes"
Mar 13 09:32:58 crc kubenswrapper[4841]: I0313 09:32:58.063934 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4495f16-0265-4bb4-a725-0f1f3da3387f" path="/var/lib/kubelet/pods/c4495f16-0265-4bb4-a725-0f1f3da3387f/volumes"
Mar 13 09:32:58 crc kubenswrapper[4841]: I0313 09:32:58.071503 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef6730be-d6cf-42b1-b356-fa08748e42ef" path="/var/lib/kubelet/pods/ef6730be-d6cf-42b1-b356-fa08748e42ef/volumes"
Mar 13 09:32:58 crc kubenswrapper[4841]: I0313 09:32:58.080985 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f59c46bc-54c2-4d96-b12e-fb39ff4a7456" path="/var/lib/kubelet/pods/f59c46bc-54c2-4d96-b12e-fb39ff4a7456/volumes"
Mar 13 09:32:58 crc kubenswrapper[4841]: I0313 09:32:58.082135 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 09:32:58 crc kubenswrapper[4841]: I0313 09:32:58.250297 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 09:32:58 crc kubenswrapper[4841]: W0313 09:32:58.258170 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod094f4da7_e278_4c06_aad4_b10985b42c76.slice/crio-3b7bf691a0c944ecd6b7a2df72cc597f1b9e22c15d1e947644de7c80aab7f1e8 WatchSource:0}: Error finding container 3b7bf691a0c944ecd6b7a2df72cc597f1b9e22c15d1e947644de7c80aab7f1e8: Status 404 returned error can't find the container with id 3b7bf691a0c944ecd6b7a2df72cc597f1b9e22c15d1e947644de7c80aab7f1e8
Mar 13 09:32:58 crc kubenswrapper[4841]: I0313 09:32:58.357224 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 09:32:58 crc kubenswrapper[4841]: I0313 09:32:58.936341 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9","Type":"ContainerStarted","Data":"bcbc8cd287fcc47b6e72d079e7a106f7e0ada21d7054ba1cf68083730da3ac41"}
Mar 13 09:32:58 crc kubenswrapper[4841]: I0313 09:32:58.937784 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"094f4da7-e278-4c06-aad4-b10985b42c76","Type":"ContainerStarted","Data":"3b7bf691a0c944ecd6b7a2df72cc597f1b9e22c15d1e947644de7c80aab7f1e8"}
Mar 13 09:32:58 crc kubenswrapper[4841]: I0313 09:32:58.940193 4841 generic.go:334] "Generic (PLEG): container finished" podID="795b0aeb-4a18-40aa-88b9-9e8b9130e0a6" containerID="b7e132b771d73d8f0e7f4f9a6f01ef0759a215125e018ca2025171dab3509a5d" exitCode=1
Mar 13 09:32:58 crc kubenswrapper[4841]: I0313 09:32:58.940311 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-d9dc96f5d-k4b4n" event={"ID":"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6","Type":"ContainerDied","Data":"b7e132b771d73d8f0e7f4f9a6f01ef0759a215125e018ca2025171dab3509a5d"}
Mar 13 09:32:58 crc kubenswrapper[4841]: I0313 09:32:58.940378 4841 scope.go:117] "RemoveContainer" containerID="c285b3128fbf6383019e6e3e0f885d22fd5d30a1f0c8c9fd27e09c37bb26c44c"
Mar 13 09:32:58 crc kubenswrapper[4841]: I0313 09:32:58.941197 4841 scope.go:117] "RemoveContainer" containerID="b7e132b771d73d8f0e7f4f9a6f01ef0759a215125e018ca2025171dab3509a5d"
Mar 13 09:32:58 crc kubenswrapper[4841]: E0313 09:32:58.941608 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-d9dc96f5d-k4b4n_openstack(795b0aeb-4a18-40aa-88b9-9e8b9130e0a6)\"" pod="openstack/heat-api-d9dc96f5d-k4b4n" podUID="795b0aeb-4a18-40aa-88b9-9e8b9130e0a6"
Mar 13 09:32:58 crc kubenswrapper[4841]: I0313 09:32:58.955558 4841 generic.go:334] "Generic (PLEG): container finished" podID="fc104487-8dce-43e7-8673-1e9d6a8f5704" containerID="b78aaed2b25c0e26612cffec796d2a0cae65866581308edcbd34a0430d96a11a" exitCode=1
Mar 13 09:32:58 crc kubenswrapper[4841]: I0313 09:32:58.955866 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-54454c48fd-qvmtx" event={"ID":"fc104487-8dce-43e7-8673-1e9d6a8f5704","Type":"ContainerDied","Data":"b78aaed2b25c0e26612cffec796d2a0cae65866581308edcbd34a0430d96a11a"}
Mar 13 09:32:58 crc kubenswrapper[4841]: I0313 09:32:58.957008 4841 scope.go:117] "RemoveContainer" containerID="b78aaed2b25c0e26612cffec796d2a0cae65866581308edcbd34a0430d96a11a"
Mar 13 09:32:58 crc kubenswrapper[4841]: E0313 09:32:58.957453 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-54454c48fd-qvmtx_openstack(fc104487-8dce-43e7-8673-1e9d6a8f5704)\"" pod="openstack/heat-cfnapi-54454c48fd-qvmtx" podUID="fc104487-8dce-43e7-8673-1e9d6a8f5704"
Mar 13 09:32:59 crc kubenswrapper[4841]: I0313 09:32:59.058429 4841 scope.go:117] "RemoveContainer" containerID="623423860edbb15b8eb3a6ee538db626672f25c5a551e6485a185bb10d17ccf2"
Mar 13 09:32:59 crc kubenswrapper[4841]: I0313 09:32:59.329131 4841 scope.go:117] "RemoveContainer" containerID="d8cf0620e340338defdbf5ee4bad93c9ac30dbf5c0a4c99a8655ab88289210bb"
Mar 13 09:32:59 crc kubenswrapper[4841]: I0313 09:32:59.878103 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5864fb6bd-dr6nw"
Mar 13 09:32:59 crc kubenswrapper[4841]: I0313 09:32:59.926763 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-combined-ca-bundle\") pod \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\" (UID: \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\") "
Mar 13 09:32:59 crc kubenswrapper[4841]: I0313 09:32:59.927010 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-ovndb-tls-certs\") pod \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\" (UID: \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\") "
Mar 13 09:32:59 crc kubenswrapper[4841]: I0313 09:32:59.997017 4841 scope.go:117] "RemoveContainer" containerID="b7e132b771d73d8f0e7f4f9a6f01ef0759a215125e018ca2025171dab3509a5d"
Mar 13 09:33:00 crc kubenswrapper[4841]: E0313 09:32:59.999497 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-d9dc96f5d-k4b4n_openstack(795b0aeb-4a18-40aa-88b9-9e8b9130e0a6)\"" pod="openstack/heat-api-d9dc96f5d-k4b4n" podUID="795b0aeb-4a18-40aa-88b9-9e8b9130e0a6"
Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.012407 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf5e75f4-273c-49c5-afda-9483dfcf1ff3" (UID: "cf5e75f4-273c-49c5-afda-9483dfcf1ff3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.015571 4841 scope.go:117] "RemoveContainer" containerID="b78aaed2b25c0e26612cffec796d2a0cae65866581308edcbd34a0430d96a11a"
Mar 13 09:33:00 crc kubenswrapper[4841]: E0313 09:33:00.015793 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-54454c48fd-qvmtx_openstack(fc104487-8dce-43e7-8673-1e9d6a8f5704)\"" pod="openstack/heat-cfnapi-54454c48fd-qvmtx" podUID="fc104487-8dce-43e7-8673-1e9d6a8f5704"
Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.029956 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-config\") pod \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\" (UID: \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\") "
Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.030001 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jndnp\" (UniqueName: \"kubernetes.io/projected/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-kube-api-access-jndnp\") pod \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\" (UID: \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\") "
Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.030038 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-httpd-config\") pod \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\" (UID: \"cf5e75f4-273c-49c5-afda-9483dfcf1ff3\") "
Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.030944 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.035154 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "cf5e75f4-273c-49c5-afda-9483dfcf1ff3" (UID: "cf5e75f4-273c-49c5-afda-9483dfcf1ff3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.036876 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "cf5e75f4-273c-49c5-afda-9483dfcf1ff3" (UID: "cf5e75f4-273c-49c5-afda-9483dfcf1ff3"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.042506 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-kube-api-access-jndnp" (OuterVolumeSpecName: "kube-api-access-jndnp") pod "cf5e75f4-273c-49c5-afda-9483dfcf1ff3" (UID: "cf5e75f4-273c-49c5-afda-9483dfcf1ff3"). InnerVolumeSpecName "kube-api-access-jndnp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.054813 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"094f4da7-e278-4c06-aad4-b10985b42c76","Type":"ContainerStarted","Data":"ca0d3b0eda8cd3579598eb8da552e2b0a5b77b4e2557c72bef7cb07b7f0cea19"}
Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.054853 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"094f4da7-e278-4c06-aad4-b10985b42c76","Type":"ContainerStarted","Data":"3870f0d0fbd4da486d2eca09f97ce8e15e2c5081bc7472f9433f445b6a4b9136"}
Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.054873 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9","Type":"ContainerStarted","Data":"4df10f08a2d7d42bb6bba8342b4a1fc78df6cd6c4855287d31f976d9773284ee"}
Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.054888 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9","Type":"ContainerStarted","Data":"7a46f65554c941ece5fd485d69cb3cbb4e7676c312eb114b0558f78426f2a0be"}
Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.055467 4841 generic.go:334] "Generic (PLEG): container finished" podID="cf5e75f4-273c-49c5-afda-9483dfcf1ff3" containerID="529a9c9ded573845cf459d9f5c8b035f9c2483362f12bb2aac07150f813ae2f1" exitCode=0
Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.055725 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5864fb6bd-dr6nw" Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.056805 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5864fb6bd-dr6nw" event={"ID":"cf5e75f4-273c-49c5-afda-9483dfcf1ff3","Type":"ContainerDied","Data":"529a9c9ded573845cf459d9f5c8b035f9c2483362f12bb2aac07150f813ae2f1"} Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.056839 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5864fb6bd-dr6nw" event={"ID":"cf5e75f4-273c-49c5-afda-9483dfcf1ff3","Type":"ContainerDied","Data":"b0aaf2082044be834887166936aece66a364f6b23ad67cc98420425062fd1e90"} Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.056856 4841 scope.go:117] "RemoveContainer" containerID="0fcd783f0b79c7ed83ed485507c30a3aaeeb91acfbc08b832607177884e12c31" Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.086164 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.086144252 podStartE2EDuration="4.086144252s" podCreationTimestamp="2026-03-13 09:32:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:33:00.071679541 +0000 UTC m=+1262.801579732" watchObservedRunningTime="2026-03-13 09:33:00.086144252 +0000 UTC m=+1262.816044443" Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.090403 4841 scope.go:117] "RemoveContainer" containerID="529a9c9ded573845cf459d9f5c8b035f9c2483362f12bb2aac07150f813ae2f1" Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.104947 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-config" (OuterVolumeSpecName: "config") pod "cf5e75f4-273c-49c5-afda-9483dfcf1ff3" (UID: "cf5e75f4-273c-49c5-afda-9483dfcf1ff3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.121049 4841 scope.go:117] "RemoveContainer" containerID="0fcd783f0b79c7ed83ed485507c30a3aaeeb91acfbc08b832607177884e12c31" Mar 13 09:33:00 crc kubenswrapper[4841]: E0313 09:33:00.122939 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fcd783f0b79c7ed83ed485507c30a3aaeeb91acfbc08b832607177884e12c31\": container with ID starting with 0fcd783f0b79c7ed83ed485507c30a3aaeeb91acfbc08b832607177884e12c31 not found: ID does not exist" containerID="0fcd783f0b79c7ed83ed485507c30a3aaeeb91acfbc08b832607177884e12c31" Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.122967 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fcd783f0b79c7ed83ed485507c30a3aaeeb91acfbc08b832607177884e12c31"} err="failed to get container status \"0fcd783f0b79c7ed83ed485507c30a3aaeeb91acfbc08b832607177884e12c31\": rpc error: code = NotFound desc = could not find container \"0fcd783f0b79c7ed83ed485507c30a3aaeeb91acfbc08b832607177884e12c31\": container with ID starting with 0fcd783f0b79c7ed83ed485507c30a3aaeeb91acfbc08b832607177884e12c31 not found: ID does not exist" Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.122987 4841 scope.go:117] "RemoveContainer" containerID="529a9c9ded573845cf459d9f5c8b035f9c2483362f12bb2aac07150f813ae2f1" Mar 13 09:33:00 crc kubenswrapper[4841]: E0313 09:33:00.123414 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"529a9c9ded573845cf459d9f5c8b035f9c2483362f12bb2aac07150f813ae2f1\": container with ID starting with 529a9c9ded573845cf459d9f5c8b035f9c2483362f12bb2aac07150f813ae2f1 not found: ID does not exist" containerID="529a9c9ded573845cf459d9f5c8b035f9c2483362f12bb2aac07150f813ae2f1" Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.123477 
4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"529a9c9ded573845cf459d9f5c8b035f9c2483362f12bb2aac07150f813ae2f1"} err="failed to get container status \"529a9c9ded573845cf459d9f5c8b035f9c2483362f12bb2aac07150f813ae2f1\": rpc error: code = NotFound desc = could not find container \"529a9c9ded573845cf459d9f5c8b035f9c2483362f12bb2aac07150f813ae2f1\": container with ID starting with 529a9c9ded573845cf459d9f5c8b035f9c2483362f12bb2aac07150f813ae2f1 not found: ID does not exist" Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.132887 4841 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.132921 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.132932 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jndnp\" (UniqueName: \"kubernetes.io/projected/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-kube-api-access-jndnp\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.132941 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cf5e75f4-273c-49c5-afda-9483dfcf1ff3-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.395064 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5864fb6bd-dr6nw"] Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.443172 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5864fb6bd-dr6nw"] Mar 13 09:33:00 crc kubenswrapper[4841]: I0313 09:33:00.985195 4841 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.070417 4841 generic.go:334] "Generic (PLEG): container finished" podID="300a1817-5e08-4edc-afec-829b69b0e7e9" containerID="54dfd791df3aa8deab0251901ef093886d75e89f47a1e6d147dc35492a1070f9" exitCode=0 Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.070501 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.070553 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"300a1817-5e08-4edc-afec-829b69b0e7e9","Type":"ContainerDied","Data":"54dfd791df3aa8deab0251901ef093886d75e89f47a1e6d147dc35492a1070f9"} Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.070602 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"300a1817-5e08-4edc-afec-829b69b0e7e9","Type":"ContainerDied","Data":"9e1330bfb437d4bbd0e86a26fb947279cb2d7c6596401c64b436c137ec452db7"} Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.070636 4841 scope.go:117] "RemoveContainer" containerID="54dfd791df3aa8deab0251901ef093886d75e89f47a1e6d147dc35492a1070f9" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.080721 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"094f4da7-e278-4c06-aad4-b10985b42c76","Type":"ContainerStarted","Data":"64a1276e570103a751e45265ef7aef31f82d447254bed0a15bae954729ff33a1"} Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.096872 4841 scope.go:117] "RemoveContainer" containerID="2255d926cf9493c05b71b77ba5a003c184fe7a6d72800a5472d5805356137e69" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.125295 4841 scope.go:117] "RemoveContainer" 
containerID="54dfd791df3aa8deab0251901ef093886d75e89f47a1e6d147dc35492a1070f9" Mar 13 09:33:01 crc kubenswrapper[4841]: E0313 09:33:01.127747 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54dfd791df3aa8deab0251901ef093886d75e89f47a1e6d147dc35492a1070f9\": container with ID starting with 54dfd791df3aa8deab0251901ef093886d75e89f47a1e6d147dc35492a1070f9 not found: ID does not exist" containerID="54dfd791df3aa8deab0251901ef093886d75e89f47a1e6d147dc35492a1070f9" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.127798 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54dfd791df3aa8deab0251901ef093886d75e89f47a1e6d147dc35492a1070f9"} err="failed to get container status \"54dfd791df3aa8deab0251901ef093886d75e89f47a1e6d147dc35492a1070f9\": rpc error: code = NotFound desc = could not find container \"54dfd791df3aa8deab0251901ef093886d75e89f47a1e6d147dc35492a1070f9\": container with ID starting with 54dfd791df3aa8deab0251901ef093886d75e89f47a1e6d147dc35492a1070f9 not found: ID does not exist" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.127827 4841 scope.go:117] "RemoveContainer" containerID="2255d926cf9493c05b71b77ba5a003c184fe7a6d72800a5472d5805356137e69" Mar 13 09:33:01 crc kubenswrapper[4841]: E0313 09:33:01.128431 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2255d926cf9493c05b71b77ba5a003c184fe7a6d72800a5472d5805356137e69\": container with ID starting with 2255d926cf9493c05b71b77ba5a003c184fe7a6d72800a5472d5805356137e69 not found: ID does not exist" containerID="2255d926cf9493c05b71b77ba5a003c184fe7a6d72800a5472d5805356137e69" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.128492 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2255d926cf9493c05b71b77ba5a003c184fe7a6d72800a5472d5805356137e69"} err="failed to get container status \"2255d926cf9493c05b71b77ba5a003c184fe7a6d72800a5472d5805356137e69\": rpc error: code = NotFound desc = could not find container \"2255d926cf9493c05b71b77ba5a003c184fe7a6d72800a5472d5805356137e69\": container with ID starting with 2255d926cf9493c05b71b77ba5a003c184fe7a6d72800a5472d5805356137e69 not found: ID does not exist" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.152479 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-scripts\") pod \"300a1817-5e08-4edc-afec-829b69b0e7e9\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.152736 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-combined-ca-bundle\") pod \"300a1817-5e08-4edc-afec-829b69b0e7e9\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.152868 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-config-data\") pod \"300a1817-5e08-4edc-afec-829b69b0e7e9\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.153007 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/300a1817-5e08-4edc-afec-829b69b0e7e9-logs\") pod \"300a1817-5e08-4edc-afec-829b69b0e7e9\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.153331 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/300a1817-5e08-4edc-afec-829b69b0e7e9-logs" (OuterVolumeSpecName: "logs") pod "300a1817-5e08-4edc-afec-829b69b0e7e9" (UID: "300a1817-5e08-4edc-afec-829b69b0e7e9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.153429 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"300a1817-5e08-4edc-afec-829b69b0e7e9\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.153700 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-internal-tls-certs\") pod \"300a1817-5e08-4edc-afec-829b69b0e7e9\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.153753 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/300a1817-5e08-4edc-afec-829b69b0e7e9-httpd-run\") pod \"300a1817-5e08-4edc-afec-829b69b0e7e9\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.153786 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fr49\" (UniqueName: \"kubernetes.io/projected/300a1817-5e08-4edc-afec-829b69b0e7e9-kube-api-access-5fr49\") pod \"300a1817-5e08-4edc-afec-829b69b0e7e9\" (UID: \"300a1817-5e08-4edc-afec-829b69b0e7e9\") " Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.154469 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/300a1817-5e08-4edc-afec-829b69b0e7e9-logs\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.154820 4841 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/300a1817-5e08-4edc-afec-829b69b0e7e9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "300a1817-5e08-4edc-afec-829b69b0e7e9" (UID: "300a1817-5e08-4edc-afec-829b69b0e7e9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.158828 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/300a1817-5e08-4edc-afec-829b69b0e7e9-kube-api-access-5fr49" (OuterVolumeSpecName: "kube-api-access-5fr49") pod "300a1817-5e08-4edc-afec-829b69b0e7e9" (UID: "300a1817-5e08-4edc-afec-829b69b0e7e9"). InnerVolumeSpecName "kube-api-access-5fr49". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.160401 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "300a1817-5e08-4edc-afec-829b69b0e7e9" (UID: "300a1817-5e08-4edc-afec-829b69b0e7e9"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.181255 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-scripts" (OuterVolumeSpecName: "scripts") pod "300a1817-5e08-4edc-afec-829b69b0e7e9" (UID: "300a1817-5e08-4edc-afec-829b69b0e7e9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.195750 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "300a1817-5e08-4edc-afec-829b69b0e7e9" (UID: "300a1817-5e08-4edc-afec-829b69b0e7e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.256552 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.256854 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/300a1817-5e08-4edc-afec-829b69b0e7e9-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.256867 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fr49\" (UniqueName: \"kubernetes.io/projected/300a1817-5e08-4edc-afec-829b69b0e7e9-kube-api-access-5fr49\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.256879 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.256890 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.260296 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-config-data" (OuterVolumeSpecName: "config-data") pod "300a1817-5e08-4edc-afec-829b69b0e7e9" (UID: "300a1817-5e08-4edc-afec-829b69b0e7e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.278567 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "300a1817-5e08-4edc-afec-829b69b0e7e9" (UID: "300a1817-5e08-4edc-afec-829b69b0e7e9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.292952 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-xh558"] Mar 13 09:33:01 crc kubenswrapper[4841]: E0313 09:33:01.293482 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf5e75f4-273c-49c5-afda-9483dfcf1ff3" containerName="neutron-api" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.293497 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf5e75f4-273c-49c5-afda-9483dfcf1ff3" containerName="neutron-api" Mar 13 09:33:01 crc kubenswrapper[4841]: E0313 09:33:01.293543 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="300a1817-5e08-4edc-afec-829b69b0e7e9" containerName="glance-log" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.293549 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="300a1817-5e08-4edc-afec-829b69b0e7e9" containerName="glance-log" Mar 13 09:33:01 crc kubenswrapper[4841]: E0313 09:33:01.293570 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="300a1817-5e08-4edc-afec-829b69b0e7e9" containerName="glance-httpd" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.293576 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="300a1817-5e08-4edc-afec-829b69b0e7e9" containerName="glance-httpd" Mar 13 09:33:01 crc kubenswrapper[4841]: E0313 09:33:01.293613 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf5e75f4-273c-49c5-afda-9483dfcf1ff3" containerName="neutron-httpd" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.293619 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf5e75f4-273c-49c5-afda-9483dfcf1ff3" containerName="neutron-httpd" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.293811 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf5e75f4-273c-49c5-afda-9483dfcf1ff3" containerName="neutron-api" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.293830 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="300a1817-5e08-4edc-afec-829b69b0e7e9" containerName="glance-httpd" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.293858 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf5e75f4-273c-49c5-afda-9483dfcf1ff3" containerName="neutron-httpd" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.293877 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="300a1817-5e08-4edc-afec-829b69b0e7e9" containerName="glance-log" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.294514 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-xh558" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.296704 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xh558"] Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.311478 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.362405 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.362436 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.362448 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/300a1817-5e08-4edc-afec-829b69b0e7e9-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.392299 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-482qc"] Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.393525 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-482qc" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.405672 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-f075-account-create-update-w4d9x"] Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.406939 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f075-account-create-update-w4d9x" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.409136 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.414523 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-54454c48fd-qvmtx" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.415333 4841 scope.go:117] "RemoveContainer" containerID="b78aaed2b25c0e26612cffec796d2a0cae65866581308edcbd34a0430d96a11a" Mar 13 09:33:01 crc kubenswrapper[4841]: E0313 09:33:01.415626 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-54454c48fd-qvmtx_openstack(fc104487-8dce-43e7-8673-1e9d6a8f5704)\"" pod="openstack/heat-cfnapi-54454c48fd-qvmtx" podUID="fc104487-8dce-43e7-8673-1e9d6a8f5704" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.415683 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-54454c48fd-qvmtx" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.415939 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-482qc"] Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.429142 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f075-account-create-update-w4d9x"] Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.448123 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.458619 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.464470 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmn9b\" (UniqueName: \"kubernetes.io/projected/69e96c8f-e077-499b-9b31-3984ad159364-kube-api-access-fmn9b\") pod \"nova-api-db-create-xh558\" (UID: \"69e96c8f-e077-499b-9b31-3984ad159364\") " pod="openstack/nova-api-db-create-xh558" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.465232 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69e96c8f-e077-499b-9b31-3984ad159364-operator-scripts\") pod \"nova-api-db-create-xh558\" (UID: \"69e96c8f-e077-499b-9b31-3984ad159364\") " pod="openstack/nova-api-db-create-xh558" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.475131 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-d9dc96f5d-k4b4n" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.475173 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-d9dc96f5d-k4b4n" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.475827 4841 scope.go:117] "RemoveContainer" containerID="b7e132b771d73d8f0e7f4f9a6f01ef0759a215125e018ca2025171dab3509a5d" Mar 13 09:33:01 crc kubenswrapper[4841]: E0313 09:33:01.476025 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-d9dc96f5d-k4b4n_openstack(795b0aeb-4a18-40aa-88b9-9e8b9130e0a6)\"" pod="openstack/heat-api-d9dc96f5d-k4b4n" podUID="795b0aeb-4a18-40aa-88b9-9e8b9130e0a6" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.481572 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.483062 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.485677 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.485728 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.508239 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.567597 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvvj5\" (UniqueName: \"kubernetes.io/projected/9f9980f7-5596-4002-b087-04cb53c6a78a-kube-api-access-pvvj5\") pod \"nova-cell0-db-create-482qc\" (UID: \"9f9980f7-5596-4002-b087-04cb53c6a78a\") " pod="openstack/nova-cell0-db-create-482qc" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.567644 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmn9b\" (UniqueName: \"kubernetes.io/projected/69e96c8f-e077-499b-9b31-3984ad159364-kube-api-access-fmn9b\") pod \"nova-api-db-create-xh558\" (UID: \"69e96c8f-e077-499b-9b31-3984ad159364\") " pod="openstack/nova-api-db-create-xh558" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.567707 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnr2b\" (UniqueName: \"kubernetes.io/projected/10d1db8e-6619-436b-b998-cea606b30b5c-kube-api-access-vnr2b\") pod \"nova-api-f075-account-create-update-w4d9x\" (UID: \"10d1db8e-6619-436b-b998-cea606b30b5c\") " pod="openstack/nova-api-f075-account-create-update-w4d9x" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.567758 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69e96c8f-e077-499b-9b31-3984ad159364-operator-scripts\") pod \"nova-api-db-create-xh558\" (UID: \"69e96c8f-e077-499b-9b31-3984ad159364\") " pod="openstack/nova-api-db-create-xh558" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.567820 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10d1db8e-6619-436b-b998-cea606b30b5c-operator-scripts\") pod \"nova-api-f075-account-create-update-w4d9x\" (UID: \"10d1db8e-6619-436b-b998-cea606b30b5c\") " pod="openstack/nova-api-f075-account-create-update-w4d9x" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.567868 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f9980f7-5596-4002-b087-04cb53c6a78a-operator-scripts\") pod \"nova-cell0-db-create-482qc\" (UID: \"9f9980f7-5596-4002-b087-04cb53c6a78a\") " pod="openstack/nova-cell0-db-create-482qc" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.571022 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69e96c8f-e077-499b-9b31-3984ad159364-operator-scripts\") pod \"nova-api-db-create-xh558\" (UID: \"69e96c8f-e077-499b-9b31-3984ad159364\") " pod="openstack/nova-api-db-create-xh558" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.598041 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmn9b\" (UniqueName: \"kubernetes.io/projected/69e96c8f-e077-499b-9b31-3984ad159364-kube-api-access-fmn9b\") pod \"nova-api-db-create-xh558\" (UID: \"69e96c8f-e077-499b-9b31-3984ad159364\") " pod="openstack/nova-api-db-create-xh558" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.602659 4841 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-d577-account-create-update-pj2d2"] Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.603865 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d577-account-create-update-pj2d2" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.610292 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.615682 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xh558" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.643301 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-vx4xd"] Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.644644 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vx4xd" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.658062 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d577-account-create-update-pj2d2"] Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.673610 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0a786f1-b124-492e-80a6-6b7df2ad7bd3-logs\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.673775 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f9980f7-5596-4002-b087-04cb53c6a78a-operator-scripts\") pod \"nova-cell0-db-create-482qc\" (UID: \"9f9980f7-5596-4002-b087-04cb53c6a78a\") " pod="openstack/nova-cell0-db-create-482qc" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.673863 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0a786f1-b124-492e-80a6-6b7df2ad7bd3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.673909 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a786f1-b124-492e-80a6-6b7df2ad7bd3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.674024 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvvj5\" (UniqueName: \"kubernetes.io/projected/9f9980f7-5596-4002-b087-04cb53c6a78a-kube-api-access-pvvj5\") pod \"nova-cell0-db-create-482qc\" (UID: \"9f9980f7-5596-4002-b087-04cb53c6a78a\") " pod="openstack/nova-cell0-db-create-482qc" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.674066 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0a786f1-b124-492e-80a6-6b7df2ad7bd3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.674092 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a786f1-b124-492e-80a6-6b7df2ad7bd3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.674145 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.674172 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqj8z\" (UniqueName: \"kubernetes.io/projected/c0a786f1-b124-492e-80a6-6b7df2ad7bd3-kube-api-access-gqj8z\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.674249 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnr2b\" (UniqueName: \"kubernetes.io/projected/10d1db8e-6619-436b-b998-cea606b30b5c-kube-api-access-vnr2b\") pod \"nova-api-f075-account-create-update-w4d9x\" (UID: \"10d1db8e-6619-436b-b998-cea606b30b5c\") " pod="openstack/nova-api-f075-account-create-update-w4d9x" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.674299 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c0a786f1-b124-492e-80a6-6b7df2ad7bd3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.674490 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10d1db8e-6619-436b-b998-cea606b30b5c-operator-scripts\") pod \"nova-api-f075-account-create-update-w4d9x\" (UID: \"10d1db8e-6619-436b-b998-cea606b30b5c\") " pod="openstack/nova-api-f075-account-create-update-w4d9x" Mar 13 
09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.679039 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10d1db8e-6619-436b-b998-cea606b30b5c-operator-scripts\") pod \"nova-api-f075-account-create-update-w4d9x\" (UID: \"10d1db8e-6619-436b-b998-cea606b30b5c\") " pod="openstack/nova-api-f075-account-create-update-w4d9x" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.679530 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f9980f7-5596-4002-b087-04cb53c6a78a-operator-scripts\") pod \"nova-cell0-db-create-482qc\" (UID: \"9f9980f7-5596-4002-b087-04cb53c6a78a\") " pod="openstack/nova-cell0-db-create-482qc" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.679574 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-vx4xd"] Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.700810 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnr2b\" (UniqueName: \"kubernetes.io/projected/10d1db8e-6619-436b-b998-cea606b30b5c-kube-api-access-vnr2b\") pod \"nova-api-f075-account-create-update-w4d9x\" (UID: \"10d1db8e-6619-436b-b998-cea606b30b5c\") " pod="openstack/nova-api-f075-account-create-update-w4d9x" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.701665 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvvj5\" (UniqueName: \"kubernetes.io/projected/9f9980f7-5596-4002-b087-04cb53c6a78a-kube-api-access-pvvj5\") pod \"nova-cell0-db-create-482qc\" (UID: \"9f9980f7-5596-4002-b087-04cb53c6a78a\") " pod="openstack/nova-cell0-db-create-482qc" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.710727 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-482qc" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.737251 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-e604-account-create-update-cf2sr"] Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.737880 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f075-account-create-update-w4d9x" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.738924 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e604-account-create-update-cf2sr" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.741940 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.758630 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e604-account-create-update-cf2sr"] Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.776497 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a786f1-b124-492e-80a6-6b7df2ad7bd3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.777303 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf0add86-8160-48dd-a9b5-bef3edcf4810-operator-scripts\") pod \"nova-cell0-d577-account-create-update-pj2d2\" (UID: \"bf0add86-8160-48dd-a9b5-bef3edcf4810\") " pod="openstack/nova-cell0-d577-account-create-update-pj2d2" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.777351 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c0a786f1-b124-492e-80a6-6b7df2ad7bd3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.777370 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a786f1-b124-492e-80a6-6b7df2ad7bd3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.777392 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm9ck\" (UniqueName: \"kubernetes.io/projected/10eed5d3-9fc4-4322-b9ef-c299a32a94bb-kube-api-access-vm9ck\") pod \"nova-cell1-db-create-vx4xd\" (UID: \"10eed5d3-9fc4-4322-b9ef-c299a32a94bb\") " pod="openstack/nova-cell1-db-create-vx4xd" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.777415 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10eed5d3-9fc4-4322-b9ef-c299a32a94bb-operator-scripts\") pod \"nova-cell1-db-create-vx4xd\" (UID: \"10eed5d3-9fc4-4322-b9ef-c299a32a94bb\") " pod="openstack/nova-cell1-db-create-vx4xd" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.777438 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.777456 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqj8z\" (UniqueName: 
\"kubernetes.io/projected/c0a786f1-b124-492e-80a6-6b7df2ad7bd3-kube-api-access-gqj8z\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.777497 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c0a786f1-b124-492e-80a6-6b7df2ad7bd3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.777575 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxr49\" (UniqueName: \"kubernetes.io/projected/bf0add86-8160-48dd-a9b5-bef3edcf4810-kube-api-access-nxr49\") pod \"nova-cell0-d577-account-create-update-pj2d2\" (UID: \"bf0add86-8160-48dd-a9b5-bef3edcf4810\") " pod="openstack/nova-cell0-d577-account-create-update-pj2d2" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.777603 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0a786f1-b124-492e-80a6-6b7df2ad7bd3-logs\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.777658 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0a786f1-b124-492e-80a6-6b7df2ad7bd3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.777992 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.786247 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c0a786f1-b124-492e-80a6-6b7df2ad7bd3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.786726 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0a786f1-b124-492e-80a6-6b7df2ad7bd3-logs\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.788074 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0a786f1-b124-492e-80a6-6b7df2ad7bd3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.792777 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0a786f1-b124-492e-80a6-6b7df2ad7bd3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.794624 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a786f1-b124-492e-80a6-6b7df2ad7bd3-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.806862 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a786f1-b124-492e-80a6-6b7df2ad7bd3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.808792 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.818329 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqj8z\" (UniqueName: \"kubernetes.io/projected/c0a786f1-b124-492e-80a6-6b7df2ad7bd3-kube-api-access-gqj8z\") pod \"glance-default-internal-api-0\" (UID: \"c0a786f1-b124-492e-80a6-6b7df2ad7bd3\") " pod="openstack/glance-default-internal-api-0" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.879513 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxr49\" (UniqueName: \"kubernetes.io/projected/bf0add86-8160-48dd-a9b5-bef3edcf4810-kube-api-access-nxr49\") pod \"nova-cell0-d577-account-create-update-pj2d2\" (UID: \"bf0add86-8160-48dd-a9b5-bef3edcf4810\") " pod="openstack/nova-cell0-d577-account-create-update-pj2d2" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.879611 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2blhn\" (UniqueName: \"kubernetes.io/projected/8ea1b844-20ae-4d19-8df2-3d472391080a-kube-api-access-2blhn\") 
pod \"nova-cell1-e604-account-create-update-cf2sr\" (UID: \"8ea1b844-20ae-4d19-8df2-3d472391080a\") " pod="openstack/nova-cell1-e604-account-create-update-cf2sr" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.879648 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf0add86-8160-48dd-a9b5-bef3edcf4810-operator-scripts\") pod \"nova-cell0-d577-account-create-update-pj2d2\" (UID: \"bf0add86-8160-48dd-a9b5-bef3edcf4810\") " pod="openstack/nova-cell0-d577-account-create-update-pj2d2" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.879681 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm9ck\" (UniqueName: \"kubernetes.io/projected/10eed5d3-9fc4-4322-b9ef-c299a32a94bb-kube-api-access-vm9ck\") pod \"nova-cell1-db-create-vx4xd\" (UID: \"10eed5d3-9fc4-4322-b9ef-c299a32a94bb\") " pod="openstack/nova-cell1-db-create-vx4xd" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.879790 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10eed5d3-9fc4-4322-b9ef-c299a32a94bb-operator-scripts\") pod \"nova-cell1-db-create-vx4xd\" (UID: \"10eed5d3-9fc4-4322-b9ef-c299a32a94bb\") " pod="openstack/nova-cell1-db-create-vx4xd" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.879844 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ea1b844-20ae-4d19-8df2-3d472391080a-operator-scripts\") pod \"nova-cell1-e604-account-create-update-cf2sr\" (UID: \"8ea1b844-20ae-4d19-8df2-3d472391080a\") " pod="openstack/nova-cell1-e604-account-create-update-cf2sr" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.881493 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/bf0add86-8160-48dd-a9b5-bef3edcf4810-operator-scripts\") pod \"nova-cell0-d577-account-create-update-pj2d2\" (UID: \"bf0add86-8160-48dd-a9b5-bef3edcf4810\") " pod="openstack/nova-cell0-d577-account-create-update-pj2d2" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.882228 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10eed5d3-9fc4-4322-b9ef-c299a32a94bb-operator-scripts\") pod \"nova-cell1-db-create-vx4xd\" (UID: \"10eed5d3-9fc4-4322-b9ef-c299a32a94bb\") " pod="openstack/nova-cell1-db-create-vx4xd" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.901587 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxr49\" (UniqueName: \"kubernetes.io/projected/bf0add86-8160-48dd-a9b5-bef3edcf4810-kube-api-access-nxr49\") pod \"nova-cell0-d577-account-create-update-pj2d2\" (UID: \"bf0add86-8160-48dd-a9b5-bef3edcf4810\") " pod="openstack/nova-cell0-d577-account-create-update-pj2d2" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.906858 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm9ck\" (UniqueName: \"kubernetes.io/projected/10eed5d3-9fc4-4322-b9ef-c299a32a94bb-kube-api-access-vm9ck\") pod \"nova-cell1-db-create-vx4xd\" (UID: \"10eed5d3-9fc4-4322-b9ef-c299a32a94bb\") " pod="openstack/nova-cell1-db-create-vx4xd" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.939219 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d577-account-create-update-pj2d2" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.981862 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-vx4xd" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.982477 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ea1b844-20ae-4d19-8df2-3d472391080a-operator-scripts\") pod \"nova-cell1-e604-account-create-update-cf2sr\" (UID: \"8ea1b844-20ae-4d19-8df2-3d472391080a\") " pod="openstack/nova-cell1-e604-account-create-update-cf2sr" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.982631 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2blhn\" (UniqueName: \"kubernetes.io/projected/8ea1b844-20ae-4d19-8df2-3d472391080a-kube-api-access-2blhn\") pod \"nova-cell1-e604-account-create-update-cf2sr\" (UID: \"8ea1b844-20ae-4d19-8df2-3d472391080a\") " pod="openstack/nova-cell1-e604-account-create-update-cf2sr" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.983609 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ea1b844-20ae-4d19-8df2-3d472391080a-operator-scripts\") pod \"nova-cell1-e604-account-create-update-cf2sr\" (UID: \"8ea1b844-20ae-4d19-8df2-3d472391080a\") " pod="openstack/nova-cell1-e604-account-create-update-cf2sr" Mar 13 09:33:01 crc kubenswrapper[4841]: I0313 09:33:01.998654 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2blhn\" (UniqueName: \"kubernetes.io/projected/8ea1b844-20ae-4d19-8df2-3d472391080a-kube-api-access-2blhn\") pod \"nova-cell1-e604-account-create-update-cf2sr\" (UID: \"8ea1b844-20ae-4d19-8df2-3d472391080a\") " pod="openstack/nova-cell1-e604-account-create-update-cf2sr" Mar 13 09:33:02 crc kubenswrapper[4841]: I0313 09:33:02.007746 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="300a1817-5e08-4edc-afec-829b69b0e7e9" 
path="/var/lib/kubelet/pods/300a1817-5e08-4edc-afec-829b69b0e7e9/volumes" Mar 13 09:33:02 crc kubenswrapper[4841]: I0313 09:33:02.008636 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf5e75f4-273c-49c5-afda-9483dfcf1ff3" path="/var/lib/kubelet/pods/cf5e75f4-273c-49c5-afda-9483dfcf1ff3/volumes" Mar 13 09:33:02 crc kubenswrapper[4841]: I0313 09:33:02.109834 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 09:33:02 crc kubenswrapper[4841]: I0313 09:33:02.117097 4841 scope.go:117] "RemoveContainer" containerID="b78aaed2b25c0e26612cffec796d2a0cae65866581308edcbd34a0430d96a11a" Mar 13 09:33:02 crc kubenswrapper[4841]: E0313 09:33:02.117649 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-54454c48fd-qvmtx_openstack(fc104487-8dce-43e7-8673-1e9d6a8f5704)\"" pod="openstack/heat-cfnapi-54454c48fd-qvmtx" podUID="fc104487-8dce-43e7-8673-1e9d6a8f5704" Mar 13 09:33:02 crc kubenswrapper[4841]: I0313 09:33:02.169956 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xh558"] Mar 13 09:33:02 crc kubenswrapper[4841]: I0313 09:33:02.183487 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e604-account-create-update-cf2sr" Mar 13 09:33:02 crc kubenswrapper[4841]: I0313 09:33:02.328015 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f075-account-create-update-w4d9x"] Mar 13 09:33:02 crc kubenswrapper[4841]: I0313 09:33:02.462037 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-482qc"] Mar 13 09:33:02 crc kubenswrapper[4841]: W0313 09:33:02.471076 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f9980f7_5596_4002_b087_04cb53c6a78a.slice/crio-b4fbb9552ff2e373041fc9d08ff0dc2424d9576b8f2ec2d1363bb5ab4ce23b18 WatchSource:0}: Error finding container b4fbb9552ff2e373041fc9d08ff0dc2424d9576b8f2ec2d1363bb5ab4ce23b18: Status 404 returned error can't find the container with id b4fbb9552ff2e373041fc9d08ff0dc2424d9576b8f2ec2d1363bb5ab4ce23b18 Mar 13 09:33:02 crc kubenswrapper[4841]: I0313 09:33:02.570014 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d577-account-create-update-pj2d2"] Mar 13 09:33:02 crc kubenswrapper[4841]: I0313 09:33:02.588994 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e604-account-create-update-cf2sr"] Mar 13 09:33:02 crc kubenswrapper[4841]: W0313 09:33:02.595288 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf0add86_8160_48dd_a9b5_bef3edcf4810.slice/crio-e09643f514c6a33e79cd2ecfe9539d8e841e385fd8d958b0d0c0a855a6837cb3 WatchSource:0}: Error finding container e09643f514c6a33e79cd2ecfe9539d8e841e385fd8d958b0d0c0a855a6837cb3: Status 404 returned error can't find the container with id e09643f514c6a33e79cd2ecfe9539d8e841e385fd8d958b0d0c0a855a6837cb3 Mar 13 09:33:02 crc kubenswrapper[4841]: I0313 09:33:02.641009 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-db-create-vx4xd"] Mar 13 09:33:02 crc kubenswrapper[4841]: I0313 09:33:02.800966 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 09:33:02 crc kubenswrapper[4841]: W0313 09:33:02.851952 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0a786f1_b124_492e_80a6_6b7df2ad7bd3.slice/crio-0072fb6721a04c329965929db23b39a6949bb0cc650b16cdb0fda07b97e4b3f6 WatchSource:0}: Error finding container 0072fb6721a04c329965929db23b39a6949bb0cc650b16cdb0fda07b97e4b3f6: Status 404 returned error can't find the container with id 0072fb6721a04c329965929db23b39a6949bb0cc650b16cdb0fda07b97e4b3f6 Mar 13 09:33:03 crc kubenswrapper[4841]: I0313 09:33:03.158689 4841 generic.go:334] "Generic (PLEG): container finished" podID="69e96c8f-e077-499b-9b31-3984ad159364" containerID="eed7858a246be159248e8da86ec28e4493239c346e54d6ef68c4c5f664fc5c27" exitCode=0 Mar 13 09:33:03 crc kubenswrapper[4841]: I0313 09:33:03.158981 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xh558" event={"ID":"69e96c8f-e077-499b-9b31-3984ad159364","Type":"ContainerDied","Data":"eed7858a246be159248e8da86ec28e4493239c346e54d6ef68c4c5f664fc5c27"} Mar 13 09:33:03 crc kubenswrapper[4841]: I0313 09:33:03.159014 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xh558" event={"ID":"69e96c8f-e077-499b-9b31-3984ad159364","Type":"ContainerStarted","Data":"6a29ab650a29402e759d794a9dcc262e237d057b4d9d7c4c3a4b9673032dd9df"} Mar 13 09:33:03 crc kubenswrapper[4841]: I0313 09:33:03.165668 4841 generic.go:334] "Generic (PLEG): container finished" podID="9f9980f7-5596-4002-b087-04cb53c6a78a" containerID="9a31840c42b7a0d49723495cb2ed939b71f3ba083a5e80d407235eae8c3edd48" exitCode=0 Mar 13 09:33:03 crc kubenswrapper[4841]: I0313 09:33:03.165720 4841 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell0-db-create-482qc" event={"ID":"9f9980f7-5596-4002-b087-04cb53c6a78a","Type":"ContainerDied","Data":"9a31840c42b7a0d49723495cb2ed939b71f3ba083a5e80d407235eae8c3edd48"} Mar 13 09:33:03 crc kubenswrapper[4841]: I0313 09:33:03.165744 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-482qc" event={"ID":"9f9980f7-5596-4002-b087-04cb53c6a78a","Type":"ContainerStarted","Data":"b4fbb9552ff2e373041fc9d08ff0dc2424d9576b8f2ec2d1363bb5ab4ce23b18"} Mar 13 09:33:03 crc kubenswrapper[4841]: I0313 09:33:03.168320 4841 generic.go:334] "Generic (PLEG): container finished" podID="10d1db8e-6619-436b-b998-cea606b30b5c" containerID="1964ccdabfc15bc0c412899a3f0358104247c79e05260c18dba1ff0a5ebf5aed" exitCode=0 Mar 13 09:33:03 crc kubenswrapper[4841]: I0313 09:33:03.168355 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f075-account-create-update-w4d9x" event={"ID":"10d1db8e-6619-436b-b998-cea606b30b5c","Type":"ContainerDied","Data":"1964ccdabfc15bc0c412899a3f0358104247c79e05260c18dba1ff0a5ebf5aed"} Mar 13 09:33:03 crc kubenswrapper[4841]: I0313 09:33:03.168370 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f075-account-create-update-w4d9x" event={"ID":"10d1db8e-6619-436b-b998-cea606b30b5c","Type":"ContainerStarted","Data":"4309d8b69881d17a759561e23b4df9e64aee65300e7c8a8b1a18cf5e9ddc9750"} Mar 13 09:33:03 crc kubenswrapper[4841]: I0313 09:33:03.169234 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c0a786f1-b124-492e-80a6-6b7df2ad7bd3","Type":"ContainerStarted","Data":"0072fb6721a04c329965929db23b39a6949bb0cc650b16cdb0fda07b97e4b3f6"} Mar 13 09:33:03 crc kubenswrapper[4841]: I0313 09:33:03.173385 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d577-account-create-update-pj2d2" 
event={"ID":"bf0add86-8160-48dd-a9b5-bef3edcf4810","Type":"ContainerStarted","Data":"e09643f514c6a33e79cd2ecfe9539d8e841e385fd8d958b0d0c0a855a6837cb3"} Mar 13 09:33:03 crc kubenswrapper[4841]: I0313 09:33:03.174677 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vx4xd" event={"ID":"10eed5d3-9fc4-4322-b9ef-c299a32a94bb","Type":"ContainerStarted","Data":"8a904f26d2411e9fdd79621ea71d5e20c036bc465f7de33eaed36ce8704e1e0b"} Mar 13 09:33:03 crc kubenswrapper[4841]: I0313 09:33:03.198046 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e604-account-create-update-cf2sr" event={"ID":"8ea1b844-20ae-4d19-8df2-3d472391080a","Type":"ContainerStarted","Data":"0dea0fe2e0e2b7b2c13dd010902765df3dd900a5af18bb2588cd7af6df9ca229"} Mar 13 09:33:03 crc kubenswrapper[4841]: I0313 09:33:03.248859 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"094f4da7-e278-4c06-aad4-b10985b42c76","Type":"ContainerStarted","Data":"45370872ad33ca0871dcec57bc318277fc03a75bef895a1c17202a1cd11caf5d"} Mar 13 09:33:03 crc kubenswrapper[4841]: I0313 09:33:03.249025 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="094f4da7-e278-4c06-aad4-b10985b42c76" containerName="ceilometer-central-agent" containerID="cri-o://3870f0d0fbd4da486d2eca09f97ce8e15e2c5081bc7472f9433f445b6a4b9136" gracePeriod=30 Mar 13 09:33:03 crc kubenswrapper[4841]: I0313 09:33:03.249124 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 09:33:03 crc kubenswrapper[4841]: I0313 09:33:03.249463 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="094f4da7-e278-4c06-aad4-b10985b42c76" containerName="proxy-httpd" containerID="cri-o://45370872ad33ca0871dcec57bc318277fc03a75bef895a1c17202a1cd11caf5d" gracePeriod=30 Mar 13 09:33:03 crc kubenswrapper[4841]: 
I0313 09:33:03.249523 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="094f4da7-e278-4c06-aad4-b10985b42c76" containerName="sg-core" containerID="cri-o://64a1276e570103a751e45265ef7aef31f82d447254bed0a15bae954729ff33a1" gracePeriod=30 Mar 13 09:33:03 crc kubenswrapper[4841]: I0313 09:33:03.249556 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="094f4da7-e278-4c06-aad4-b10985b42c76" containerName="ceilometer-notification-agent" containerID="cri-o://ca0d3b0eda8cd3579598eb8da552e2b0a5b77b4e2557c72bef7cb07b7f0cea19" gracePeriod=30 Mar 13 09:33:03 crc kubenswrapper[4841]: I0313 09:33:03.295692 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.271494874 podStartE2EDuration="7.29566771s" podCreationTimestamp="2026-03-13 09:32:56 +0000 UTC" firstStartedPulling="2026-03-13 09:32:58.266717215 +0000 UTC m=+1260.996617406" lastFinishedPulling="2026-03-13 09:33:02.290890051 +0000 UTC m=+1265.020790242" observedRunningTime="2026-03-13 09:33:03.277160733 +0000 UTC m=+1266.007060924" watchObservedRunningTime="2026-03-13 09:33:03.29566771 +0000 UTC m=+1266.025567901" Mar 13 09:33:04 crc kubenswrapper[4841]: I0313 09:33:04.264611 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c0a786f1-b124-492e-80a6-6b7df2ad7bd3","Type":"ContainerStarted","Data":"1c3d5ae63a739b9d51ab59d81cda0892890cd957807dd1d4275aaccdda5f40d2"} Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:04.272691 4841 generic.go:334] "Generic (PLEG): container finished" podID="bf0add86-8160-48dd-a9b5-bef3edcf4810" containerID="d7633df3a24c174a8b7ec366df6fadbaeddd2340b7c92b61d63d601e06900a9b" exitCode=0 Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:04.272765 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-d577-account-create-update-pj2d2" event={"ID":"bf0add86-8160-48dd-a9b5-bef3edcf4810","Type":"ContainerDied","Data":"d7633df3a24c174a8b7ec366df6fadbaeddd2340b7c92b61d63d601e06900a9b"} Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:04.277103 4841 generic.go:334] "Generic (PLEG): container finished" podID="10eed5d3-9fc4-4322-b9ef-c299a32a94bb" containerID="6f4e65adb70377c7c6401dfcaf04b226125f961b030ced1ca13c1b600a93f210" exitCode=0 Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:04.277174 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vx4xd" event={"ID":"10eed5d3-9fc4-4322-b9ef-c299a32a94bb","Type":"ContainerDied","Data":"6f4e65adb70377c7c6401dfcaf04b226125f961b030ced1ca13c1b600a93f210"} Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:04.285766 4841 generic.go:334] "Generic (PLEG): container finished" podID="8ea1b844-20ae-4d19-8df2-3d472391080a" containerID="a915ba9cf1f9f73c7697a1203419f37d5428a480b96c6b754065724dc4c5e0d7" exitCode=0 Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:04.285870 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e604-account-create-update-cf2sr" event={"ID":"8ea1b844-20ae-4d19-8df2-3d472391080a","Type":"ContainerDied","Data":"a915ba9cf1f9f73c7697a1203419f37d5428a480b96c6b754065724dc4c5e0d7"} Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:04.292565 4841 generic.go:334] "Generic (PLEG): container finished" podID="094f4da7-e278-4c06-aad4-b10985b42c76" containerID="45370872ad33ca0871dcec57bc318277fc03a75bef895a1c17202a1cd11caf5d" exitCode=0 Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:04.292859 4841 generic.go:334] "Generic (PLEG): container finished" podID="094f4da7-e278-4c06-aad4-b10985b42c76" containerID="64a1276e570103a751e45265ef7aef31f82d447254bed0a15bae954729ff33a1" exitCode=2 Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:04.292867 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="094f4da7-e278-4c06-aad4-b10985b42c76" containerID="ca0d3b0eda8cd3579598eb8da552e2b0a5b77b4e2557c72bef7cb07b7f0cea19" exitCode=0 Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:04.293013 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"094f4da7-e278-4c06-aad4-b10985b42c76","Type":"ContainerDied","Data":"45370872ad33ca0871dcec57bc318277fc03a75bef895a1c17202a1cd11caf5d"} Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:04.293032 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"094f4da7-e278-4c06-aad4-b10985b42c76","Type":"ContainerDied","Data":"64a1276e570103a751e45265ef7aef31f82d447254bed0a15bae954729ff33a1"} Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:04.293042 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"094f4da7-e278-4c06-aad4-b10985b42c76","Type":"ContainerDied","Data":"ca0d3b0eda8cd3579598eb8da552e2b0a5b77b4e2557c72bef7cb07b7f0cea19"} Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.158601 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-794cb978db-w646s" Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.297919 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-5ff95dc669-rzhtr" Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.318218 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c0a786f1-b124-492e-80a6-6b7df2ad7bd3","Type":"ContainerStarted","Data":"1748c5ce36bbca47e348ef540529dcbf58006959cb57c71f0cf0a92fbfa4b78c"} Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.426816 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-d9dc96f5d-k4b4n"] Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.433856 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.433835713 podStartE2EDuration="4.433835713s" podCreationTimestamp="2026-03-13 09:33:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:33:05.374089089 +0000 UTC m=+1268.103989290" watchObservedRunningTime="2026-03-13 09:33:05.433835713 +0000 UTC m=+1268.163735904" Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.515775 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-482qc" Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.526520 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-76777bf9d9-sc2jp" Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.575363 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-fb97f87fc-9tb45" Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.576848 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvvj5\" (UniqueName: \"kubernetes.io/projected/9f9980f7-5596-4002-b087-04cb53c6a78a-kube-api-access-pvvj5\") pod \"9f9980f7-5596-4002-b087-04cb53c6a78a\" (UID: \"9f9980f7-5596-4002-b087-04cb53c6a78a\") " Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.577602 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f9980f7-5596-4002-b087-04cb53c6a78a-operator-scripts\") pod \"9f9980f7-5596-4002-b087-04cb53c6a78a\" (UID: \"9f9980f7-5596-4002-b087-04cb53c6a78a\") " Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.580378 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f9980f7-5596-4002-b087-04cb53c6a78a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"9f9980f7-5596-4002-b087-04cb53c6a78a" (UID: "9f9980f7-5596-4002-b087-04cb53c6a78a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.593418 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f9980f7-5596-4002-b087-04cb53c6a78a-kube-api-access-pvvj5" (OuterVolumeSpecName: "kube-api-access-pvvj5") pod "9f9980f7-5596-4002-b087-04cb53c6a78a" (UID: "9f9980f7-5596-4002-b087-04cb53c6a78a"). InnerVolumeSpecName "kube-api-access-pvvj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.680389 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvvj5\" (UniqueName: \"kubernetes.io/projected/9f9980f7-5596-4002-b087-04cb53c6a78a-kube-api-access-pvvj5\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.680426 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f9980f7-5596-4002-b087-04cb53c6a78a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.699412 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-54454c48fd-qvmtx"] Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.731986 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xh558" Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.757596 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f075-account-create-update-w4d9x" Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.782668 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmn9b\" (UniqueName: \"kubernetes.io/projected/69e96c8f-e077-499b-9b31-3984ad159364-kube-api-access-fmn9b\") pod \"69e96c8f-e077-499b-9b31-3984ad159364\" (UID: \"69e96c8f-e077-499b-9b31-3984ad159364\") " Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.782770 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnr2b\" (UniqueName: \"kubernetes.io/projected/10d1db8e-6619-436b-b998-cea606b30b5c-kube-api-access-vnr2b\") pod \"10d1db8e-6619-436b-b998-cea606b30b5c\" (UID: \"10d1db8e-6619-436b-b998-cea606b30b5c\") " Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.782970 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69e96c8f-e077-499b-9b31-3984ad159364-operator-scripts\") pod \"69e96c8f-e077-499b-9b31-3984ad159364\" (UID: \"69e96c8f-e077-499b-9b31-3984ad159364\") " Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.782997 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10d1db8e-6619-436b-b998-cea606b30b5c-operator-scripts\") pod \"10d1db8e-6619-436b-b998-cea606b30b5c\" (UID: \"10d1db8e-6619-436b-b998-cea606b30b5c\") " Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.785621 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10d1db8e-6619-436b-b998-cea606b30b5c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "10d1db8e-6619-436b-b998-cea606b30b5c" (UID: "10d1db8e-6619-436b-b998-cea606b30b5c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.785677 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69e96c8f-e077-499b-9b31-3984ad159364-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "69e96c8f-e077-499b-9b31-3984ad159364" (UID: "69e96c8f-e077-499b-9b31-3984ad159364"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.793715 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10d1db8e-6619-436b-b998-cea606b30b5c-kube-api-access-vnr2b" (OuterVolumeSpecName: "kube-api-access-vnr2b") pod "10d1db8e-6619-436b-b998-cea606b30b5c" (UID: "10d1db8e-6619-436b-b998-cea606b30b5c"). InnerVolumeSpecName "kube-api-access-vnr2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.813607 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69e96c8f-e077-499b-9b31-3984ad159364-kube-api-access-fmn9b" (OuterVolumeSpecName: "kube-api-access-fmn9b") pod "69e96c8f-e077-499b-9b31-3984ad159364" (UID: "69e96c8f-e077-499b-9b31-3984ad159364"). InnerVolumeSpecName "kube-api-access-fmn9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.885461 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnr2b\" (UniqueName: \"kubernetes.io/projected/10d1db8e-6619-436b-b998-cea606b30b5c-kube-api-access-vnr2b\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.885502 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69e96c8f-e077-499b-9b31-3984ad159364-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.885518 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10d1db8e-6619-436b-b998-cea606b30b5c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:05 crc kubenswrapper[4841]: I0313 09:33:05.885530 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmn9b\" (UniqueName: \"kubernetes.io/projected/69e96c8f-e077-499b-9b31-3984ad159364-kube-api-access-fmn9b\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.112339 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d577-account-create-update-pj2d2" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.158155 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vx4xd" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.190304 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e604-account-create-update-cf2sr" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.194445 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm9ck\" (UniqueName: \"kubernetes.io/projected/10eed5d3-9fc4-4322-b9ef-c299a32a94bb-kube-api-access-vm9ck\") pod \"10eed5d3-9fc4-4322-b9ef-c299a32a94bb\" (UID: \"10eed5d3-9fc4-4322-b9ef-c299a32a94bb\") " Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.194613 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf0add86-8160-48dd-a9b5-bef3edcf4810-operator-scripts\") pod \"bf0add86-8160-48dd-a9b5-bef3edcf4810\" (UID: \"bf0add86-8160-48dd-a9b5-bef3edcf4810\") " Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.194640 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxr49\" (UniqueName: \"kubernetes.io/projected/bf0add86-8160-48dd-a9b5-bef3edcf4810-kube-api-access-nxr49\") pod \"bf0add86-8160-48dd-a9b5-bef3edcf4810\" (UID: \"bf0add86-8160-48dd-a9b5-bef3edcf4810\") " Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.194677 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10eed5d3-9fc4-4322-b9ef-c299a32a94bb-operator-scripts\") pod \"10eed5d3-9fc4-4322-b9ef-c299a32a94bb\" (UID: \"10eed5d3-9fc4-4322-b9ef-c299a32a94bb\") " Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.195286 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf0add86-8160-48dd-a9b5-bef3edcf4810-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf0add86-8160-48dd-a9b5-bef3edcf4810" (UID: "bf0add86-8160-48dd-a9b5-bef3edcf4810"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.195649 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10eed5d3-9fc4-4322-b9ef-c299a32a94bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "10eed5d3-9fc4-4322-b9ef-c299a32a94bb" (UID: "10eed5d3-9fc4-4322-b9ef-c299a32a94bb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.201689 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10eed5d3-9fc4-4322-b9ef-c299a32a94bb-kube-api-access-vm9ck" (OuterVolumeSpecName: "kube-api-access-vm9ck") pod "10eed5d3-9fc4-4322-b9ef-c299a32a94bb" (UID: "10eed5d3-9fc4-4322-b9ef-c299a32a94bb"). InnerVolumeSpecName "kube-api-access-vm9ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.201839 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-d9dc96f5d-k4b4n" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.201939 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf0add86-8160-48dd-a9b5-bef3edcf4810-kube-api-access-nxr49" (OuterVolumeSpecName: "kube-api-access-nxr49") pod "bf0add86-8160-48dd-a9b5-bef3edcf4810" (UID: "bf0add86-8160-48dd-a9b5-bef3edcf4810"). InnerVolumeSpecName "kube-api-access-nxr49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.296045 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ea1b844-20ae-4d19-8df2-3d472391080a-operator-scripts\") pod \"8ea1b844-20ae-4d19-8df2-3d472391080a\" (UID: \"8ea1b844-20ae-4d19-8df2-3d472391080a\") " Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.296389 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-config-data-custom\") pod \"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6\" (UID: \"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6\") " Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.296464 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-combined-ca-bundle\") pod \"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6\" (UID: \"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6\") " Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.296504 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv848\" (UniqueName: \"kubernetes.io/projected/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-kube-api-access-zv848\") pod \"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6\" (UID: \"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6\") " Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.296505 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ea1b844-20ae-4d19-8df2-3d472391080a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8ea1b844-20ae-4d19-8df2-3d472391080a" (UID: "8ea1b844-20ae-4d19-8df2-3d472391080a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.296649 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2blhn\" (UniqueName: \"kubernetes.io/projected/8ea1b844-20ae-4d19-8df2-3d472391080a-kube-api-access-2blhn\") pod \"8ea1b844-20ae-4d19-8df2-3d472391080a\" (UID: \"8ea1b844-20ae-4d19-8df2-3d472391080a\") " Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.296720 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-config-data\") pod \"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6\" (UID: \"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6\") " Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.297082 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf0add86-8160-48dd-a9b5-bef3edcf4810-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.297096 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxr49\" (UniqueName: \"kubernetes.io/projected/bf0add86-8160-48dd-a9b5-bef3edcf4810-kube-api-access-nxr49\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.297106 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10eed5d3-9fc4-4322-b9ef-c299a32a94bb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.297113 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ea1b844-20ae-4d19-8df2-3d472391080a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.297122 4841 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-vm9ck\" (UniqueName: \"kubernetes.io/projected/10eed5d3-9fc4-4322-b9ef-c299a32a94bb-kube-api-access-vm9ck\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.305172 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-kube-api-access-zv848" (OuterVolumeSpecName: "kube-api-access-zv848") pod "795b0aeb-4a18-40aa-88b9-9e8b9130e0a6" (UID: "795b0aeb-4a18-40aa-88b9-9e8b9130e0a6"). InnerVolumeSpecName "kube-api-access-zv848". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.305882 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "795b0aeb-4a18-40aa-88b9-9e8b9130e0a6" (UID: "795b0aeb-4a18-40aa-88b9-9e8b9130e0a6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.308373 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ea1b844-20ae-4d19-8df2-3d472391080a-kube-api-access-2blhn" (OuterVolumeSpecName: "kube-api-access-2blhn") pod "8ea1b844-20ae-4d19-8df2-3d472391080a" (UID: "8ea1b844-20ae-4d19-8df2-3d472391080a"). InnerVolumeSpecName "kube-api-access-2blhn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.335245 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vx4xd" event={"ID":"10eed5d3-9fc4-4322-b9ef-c299a32a94bb","Type":"ContainerDied","Data":"8a904f26d2411e9fdd79621ea71d5e20c036bc465f7de33eaed36ce8704e1e0b"} Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.335312 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a904f26d2411e9fdd79621ea71d5e20c036bc465f7de33eaed36ce8704e1e0b" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.335364 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vx4xd" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.337136 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e604-account-create-update-cf2sr" event={"ID":"8ea1b844-20ae-4d19-8df2-3d472391080a","Type":"ContainerDied","Data":"0dea0fe2e0e2b7b2c13dd010902765df3dd900a5af18bb2588cd7af6df9ca229"} Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.337175 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dea0fe2e0e2b7b2c13dd010902765df3dd900a5af18bb2588cd7af6df9ca229" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.337246 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e604-account-create-update-cf2sr" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.341236 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-d9dc96f5d-k4b4n" event={"ID":"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6","Type":"ContainerDied","Data":"67628017301f38316639299b0511d893836cf175b4abf06440d30215a81670e6"} Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.341470 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-d9dc96f5d-k4b4n" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.341505 4841 scope.go:117] "RemoveContainer" containerID="b7e132b771d73d8f0e7f4f9a6f01ef0759a215125e018ca2025171dab3509a5d" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.344965 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xh558" event={"ID":"69e96c8f-e077-499b-9b31-3984ad159364","Type":"ContainerDied","Data":"6a29ab650a29402e759d794a9dcc262e237d057b4d9d7c4c3a4b9673032dd9df"} Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.345051 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a29ab650a29402e759d794a9dcc262e237d057b4d9d7c4c3a4b9673032dd9df" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.345152 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xh558" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.348854 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-54454c48fd-qvmtx" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.357989 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "795b0aeb-4a18-40aa-88b9-9e8b9130e0a6" (UID: "795b0aeb-4a18-40aa-88b9-9e8b9130e0a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.358217 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-482qc" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.358209 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-482qc" event={"ID":"9f9980f7-5596-4002-b087-04cb53c6a78a","Type":"ContainerDied","Data":"b4fbb9552ff2e373041fc9d08ff0dc2424d9576b8f2ec2d1363bb5ab4ce23b18"} Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.358390 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4fbb9552ff2e373041fc9d08ff0dc2424d9576b8f2ec2d1363bb5ab4ce23b18" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.368553 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f075-account-create-update-w4d9x" event={"ID":"10d1db8e-6619-436b-b998-cea606b30b5c","Type":"ContainerDied","Data":"4309d8b69881d17a759561e23b4df9e64aee65300e7c8a8b1a18cf5e9ddc9750"} Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.368594 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4309d8b69881d17a759561e23b4df9e64aee65300e7c8a8b1a18cf5e9ddc9750" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.368687 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f075-account-create-update-w4d9x" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.370915 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d577-account-create-update-pj2d2" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.370954 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d577-account-create-update-pj2d2" event={"ID":"bf0add86-8160-48dd-a9b5-bef3edcf4810","Type":"ContainerDied","Data":"e09643f514c6a33e79cd2ecfe9539d8e841e385fd8d958b0d0c0a855a6837cb3"} Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.370974 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e09643f514c6a33e79cd2ecfe9539d8e841e385fd8d958b0d0c0a855a6837cb3" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.404749 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.404785 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.404799 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv848\" (UniqueName: \"kubernetes.io/projected/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-kube-api-access-zv848\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.404812 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2blhn\" (UniqueName: \"kubernetes.io/projected/8ea1b844-20ae-4d19-8df2-3d472391080a-kube-api-access-2blhn\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.407571 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-config-data" (OuterVolumeSpecName: "config-data") pod 
"795b0aeb-4a18-40aa-88b9-9e8b9130e0a6" (UID: "795b0aeb-4a18-40aa-88b9-9e8b9130e0a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.491305 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-794cb978db-w646s" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.506816 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc104487-8dce-43e7-8673-1e9d6a8f5704-config-data\") pod \"fc104487-8dce-43e7-8673-1e9d6a8f5704\" (UID: \"fc104487-8dce-43e7-8673-1e9d6a8f5704\") " Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.506886 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8rkq\" (UniqueName: \"kubernetes.io/projected/fc104487-8dce-43e7-8673-1e9d6a8f5704-kube-api-access-j8rkq\") pod \"fc104487-8dce-43e7-8673-1e9d6a8f5704\" (UID: \"fc104487-8dce-43e7-8673-1e9d6a8f5704\") " Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.506921 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc104487-8dce-43e7-8673-1e9d6a8f5704-config-data-custom\") pod \"fc104487-8dce-43e7-8673-1e9d6a8f5704\" (UID: \"fc104487-8dce-43e7-8673-1e9d6a8f5704\") " Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.506942 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc104487-8dce-43e7-8673-1e9d6a8f5704-combined-ca-bundle\") pod \"fc104487-8dce-43e7-8673-1e9d6a8f5704\" (UID: \"fc104487-8dce-43e7-8673-1e9d6a8f5704\") " Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.508181 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.512181 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc104487-8dce-43e7-8673-1e9d6a8f5704-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fc104487-8dce-43e7-8673-1e9d6a8f5704" (UID: "fc104487-8dce-43e7-8673-1e9d6a8f5704"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.517397 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc104487-8dce-43e7-8673-1e9d6a8f5704-kube-api-access-j8rkq" (OuterVolumeSpecName: "kube-api-access-j8rkq") pod "fc104487-8dce-43e7-8673-1e9d6a8f5704" (UID: "fc104487-8dce-43e7-8673-1e9d6a8f5704"). InnerVolumeSpecName "kube-api-access-j8rkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.609799 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8rkq\" (UniqueName: \"kubernetes.io/projected/fc104487-8dce-43e7-8673-1e9d6a8f5704-kube-api-access-j8rkq\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.609833 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc104487-8dce-43e7-8673-1e9d6a8f5704-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.610335 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc104487-8dce-43e7-8673-1e9d6a8f5704-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc104487-8dce-43e7-8673-1e9d6a8f5704" (UID: "fc104487-8dce-43e7-8673-1e9d6a8f5704"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.613032 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6969d7c4d8-xrfbc"] Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.613338 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6969d7c4d8-xrfbc" podUID="eb74819f-9ae9-498b-88b5-f0fcaf598409" containerName="placement-log" containerID="cri-o://a6090fccb088678bc444a2049865222652ab3da7d036cb10796ce5e518630144" gracePeriod=30 Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.614198 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6969d7c4d8-xrfbc" podUID="eb74819f-9ae9-498b-88b5-f0fcaf598409" containerName="placement-api" containerID="cri-o://7b7d1f6bad0b269af33da1c7ffa7a0ede51c42245e8ecce135496e72107d0e3e" gracePeriod=30 Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.644059 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc104487-8dce-43e7-8673-1e9d6a8f5704-config-data" (OuterVolumeSpecName: "config-data") pod "fc104487-8dce-43e7-8673-1e9d6a8f5704" (UID: "fc104487-8dce-43e7-8673-1e9d6a8f5704"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.704901 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-d9dc96f5d-k4b4n"] Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.711122 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc104487-8dce-43e7-8673-1e9d6a8f5704-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.711146 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc104487-8dce-43e7-8673-1e9d6a8f5704-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:06 crc kubenswrapper[4841]: I0313 09:33:06.715963 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-d9dc96f5d-k4b4n"] Mar 13 09:33:07 crc kubenswrapper[4841]: I0313 09:33:07.384224 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-54454c48fd-qvmtx" Mar 13 09:33:07 crc kubenswrapper[4841]: I0313 09:33:07.384180 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-54454c48fd-qvmtx" event={"ID":"fc104487-8dce-43e7-8673-1e9d6a8f5704","Type":"ContainerDied","Data":"37fce69edc5a33f214d2584791725111da1a26ac7688a59cdf7820122244d2e3"} Mar 13 09:33:07 crc kubenswrapper[4841]: I0313 09:33:07.384306 4841 scope.go:117] "RemoveContainer" containerID="b78aaed2b25c0e26612cffec796d2a0cae65866581308edcbd34a0430d96a11a" Mar 13 09:33:07 crc kubenswrapper[4841]: I0313 09:33:07.390378 4841 generic.go:334] "Generic (PLEG): container finished" podID="eb74819f-9ae9-498b-88b5-f0fcaf598409" containerID="a6090fccb088678bc444a2049865222652ab3da7d036cb10796ce5e518630144" exitCode=143 Mar 13 09:33:07 crc kubenswrapper[4841]: I0313 09:33:07.390432 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6969d7c4d8-xrfbc" event={"ID":"eb74819f-9ae9-498b-88b5-f0fcaf598409","Type":"ContainerDied","Data":"a6090fccb088678bc444a2049865222652ab3da7d036cb10796ce5e518630144"} Mar 13 09:33:07 crc kubenswrapper[4841]: I0313 09:33:07.406552 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 09:33:07 crc kubenswrapper[4841]: I0313 09:33:07.407528 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 09:33:07 crc kubenswrapper[4841]: I0313 09:33:07.452891 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 09:33:07 crc kubenswrapper[4841]: I0313 09:33:07.462826 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 09:33:07 crc kubenswrapper[4841]: I0313 09:33:07.467154 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/heat-cfnapi-54454c48fd-qvmtx"] Mar 13 09:33:07 crc kubenswrapper[4841]: I0313 09:33:07.475700 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-54454c48fd-qvmtx"] Mar 13 09:33:08 crc kubenswrapper[4841]: I0313 09:33:08.005153 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="795b0aeb-4a18-40aa-88b9-9e8b9130e0a6" path="/var/lib/kubelet/pods/795b0aeb-4a18-40aa-88b9-9e8b9130e0a6/volumes" Mar 13 09:33:08 crc kubenswrapper[4841]: I0313 09:33:08.005994 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc104487-8dce-43e7-8673-1e9d6a8f5704" path="/var/lib/kubelet/pods/fc104487-8dce-43e7-8673-1e9d6a8f5704/volumes" Mar 13 09:33:08 crc kubenswrapper[4841]: I0313 09:33:08.411330 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 09:33:08 crc kubenswrapper[4841]: I0313 09:33:08.411402 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.289222 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6969d7c4d8-xrfbc" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.397001 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb74819f-9ae9-498b-88b5-f0fcaf598409-logs\") pod \"eb74819f-9ae9-498b-88b5-f0fcaf598409\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.397140 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b97q7\" (UniqueName: \"kubernetes.io/projected/eb74819f-9ae9-498b-88b5-f0fcaf598409-kube-api-access-b97q7\") pod \"eb74819f-9ae9-498b-88b5-f0fcaf598409\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.397190 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-scripts\") pod \"eb74819f-9ae9-498b-88b5-f0fcaf598409\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.397364 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-internal-tls-certs\") pod \"eb74819f-9ae9-498b-88b5-f0fcaf598409\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.397444 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-config-data\") pod \"eb74819f-9ae9-498b-88b5-f0fcaf598409\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.397480 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-combined-ca-bundle\") pod \"eb74819f-9ae9-498b-88b5-f0fcaf598409\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.397528 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-public-tls-certs\") pod \"eb74819f-9ae9-498b-88b5-f0fcaf598409\" (UID: \"eb74819f-9ae9-498b-88b5-f0fcaf598409\") " Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.398307 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb74819f-9ae9-498b-88b5-f0fcaf598409-logs" (OuterVolumeSpecName: "logs") pod "eb74819f-9ae9-498b-88b5-f0fcaf598409" (UID: "eb74819f-9ae9-498b-88b5-f0fcaf598409"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.404580 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb74819f-9ae9-498b-88b5-f0fcaf598409-kube-api-access-b97q7" (OuterVolumeSpecName: "kube-api-access-b97q7") pod "eb74819f-9ae9-498b-88b5-f0fcaf598409" (UID: "eb74819f-9ae9-498b-88b5-f0fcaf598409"). InnerVolumeSpecName "kube-api-access-b97q7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.404613 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-scripts" (OuterVolumeSpecName: "scripts") pod "eb74819f-9ae9-498b-88b5-f0fcaf598409" (UID: "eb74819f-9ae9-498b-88b5-f0fcaf598409"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.465234 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6969d7c4d8-xrfbc" event={"ID":"eb74819f-9ae9-498b-88b5-f0fcaf598409","Type":"ContainerDied","Data":"7b7d1f6bad0b269af33da1c7ffa7a0ede51c42245e8ecce135496e72107d0e3e"} Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.465611 4841 scope.go:117] "RemoveContainer" containerID="7b7d1f6bad0b269af33da1c7ffa7a0ede51c42245e8ecce135496e72107d0e3e" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.465846 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6969d7c4d8-xrfbc" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.466291 4841 generic.go:334] "Generic (PLEG): container finished" podID="eb74819f-9ae9-498b-88b5-f0fcaf598409" containerID="7b7d1f6bad0b269af33da1c7ffa7a0ede51c42245e8ecce135496e72107d0e3e" exitCode=0 Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.466327 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6969d7c4d8-xrfbc" event={"ID":"eb74819f-9ae9-498b-88b5-f0fcaf598409","Type":"ContainerDied","Data":"4d30927176cd1ac75a6dcb29675247f84ec9e1c6bcd222054d8bcce8eef7482e"} Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.495240 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb74819f-9ae9-498b-88b5-f0fcaf598409" (UID: "eb74819f-9ae9-498b-88b5-f0fcaf598409"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.497661 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-config-data" (OuterVolumeSpecName: "config-data") pod "eb74819f-9ae9-498b-88b5-f0fcaf598409" (UID: "eb74819f-9ae9-498b-88b5-f0fcaf598409"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.500489 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb74819f-9ae9-498b-88b5-f0fcaf598409-logs\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.500515 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b97q7\" (UniqueName: \"kubernetes.io/projected/eb74819f-9ae9-498b-88b5-f0fcaf598409-kube-api-access-b97q7\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.500526 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.500535 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.500546 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.535231 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eb74819f-9ae9-498b-88b5-f0fcaf598409" (UID: "eb74819f-9ae9-498b-88b5-f0fcaf598409"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.556037 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eb74819f-9ae9-498b-88b5-f0fcaf598409" (UID: "eb74819f-9ae9-498b-88b5-f0fcaf598409"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.597333 4841 scope.go:117] "RemoveContainer" containerID="a6090fccb088678bc444a2049865222652ab3da7d036cb10796ce5e518630144" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.602101 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.602138 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb74819f-9ae9-498b-88b5-f0fcaf598409-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.625672 4841 scope.go:117] "RemoveContainer" containerID="7b7d1f6bad0b269af33da1c7ffa7a0ede51c42245e8ecce135496e72107d0e3e" Mar 13 09:33:10 crc kubenswrapper[4841]: E0313 09:33:10.626025 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b7d1f6bad0b269af33da1c7ffa7a0ede51c42245e8ecce135496e72107d0e3e\": container with ID starting with 
7b7d1f6bad0b269af33da1c7ffa7a0ede51c42245e8ecce135496e72107d0e3e not found: ID does not exist" containerID="7b7d1f6bad0b269af33da1c7ffa7a0ede51c42245e8ecce135496e72107d0e3e" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.626056 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b7d1f6bad0b269af33da1c7ffa7a0ede51c42245e8ecce135496e72107d0e3e"} err="failed to get container status \"7b7d1f6bad0b269af33da1c7ffa7a0ede51c42245e8ecce135496e72107d0e3e\": rpc error: code = NotFound desc = could not find container \"7b7d1f6bad0b269af33da1c7ffa7a0ede51c42245e8ecce135496e72107d0e3e\": container with ID starting with 7b7d1f6bad0b269af33da1c7ffa7a0ede51c42245e8ecce135496e72107d0e3e not found: ID does not exist" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.626082 4841 scope.go:117] "RemoveContainer" containerID="a6090fccb088678bc444a2049865222652ab3da7d036cb10796ce5e518630144" Mar 13 09:33:10 crc kubenswrapper[4841]: E0313 09:33:10.626521 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6090fccb088678bc444a2049865222652ab3da7d036cb10796ce5e518630144\": container with ID starting with a6090fccb088678bc444a2049865222652ab3da7d036cb10796ce5e518630144 not found: ID does not exist" containerID="a6090fccb088678bc444a2049865222652ab3da7d036cb10796ce5e518630144" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.626553 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6090fccb088678bc444a2049865222652ab3da7d036cb10796ce5e518630144"} err="failed to get container status \"a6090fccb088678bc444a2049865222652ab3da7d036cb10796ce5e518630144\": rpc error: code = NotFound desc = could not find container \"a6090fccb088678bc444a2049865222652ab3da7d036cb10796ce5e518630144\": container with ID starting with a6090fccb088678bc444a2049865222652ab3da7d036cb10796ce5e518630144 not found: ID does not 
exist" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.809368 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6969d7c4d8-xrfbc"] Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.817754 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6969d7c4d8-xrfbc"] Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.846734 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.846917 4841 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 09:33:10 crc kubenswrapper[4841]: I0313 09:33:10.897132 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.493959 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-74f4d87c9f-bw7dr" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.545086 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-76777bf9d9-sc2jp"] Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.545328 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-76777bf9d9-sc2jp" podUID="9d53e19b-a3e7-40de-a06b-d6e1c0def922" containerName="heat-engine" containerID="cri-o://ba488eb98af2da69fb67e49f29400ebf29148da2dc0062b02f61f3a0040bf29f" gracePeriod=60 Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.949228 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-24ghd"] Mar 13 09:33:11 crc kubenswrapper[4841]: E0313 09:33:11.949590 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="795b0aeb-4a18-40aa-88b9-9e8b9130e0a6" containerName="heat-api" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.949608 4841 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="795b0aeb-4a18-40aa-88b9-9e8b9130e0a6" containerName="heat-api" Mar 13 09:33:11 crc kubenswrapper[4841]: E0313 09:33:11.949617 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb74819f-9ae9-498b-88b5-f0fcaf598409" containerName="placement-log" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.949624 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb74819f-9ae9-498b-88b5-f0fcaf598409" containerName="placement-log" Mar 13 09:33:11 crc kubenswrapper[4841]: E0313 09:33:11.949635 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf0add86-8160-48dd-a9b5-bef3edcf4810" containerName="mariadb-account-create-update" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.949641 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf0add86-8160-48dd-a9b5-bef3edcf4810" containerName="mariadb-account-create-update" Mar 13 09:33:11 crc kubenswrapper[4841]: E0313 09:33:11.949656 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc104487-8dce-43e7-8673-1e9d6a8f5704" containerName="heat-cfnapi" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.949661 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc104487-8dce-43e7-8673-1e9d6a8f5704" containerName="heat-cfnapi" Mar 13 09:33:11 crc kubenswrapper[4841]: E0313 09:33:11.949668 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc104487-8dce-43e7-8673-1e9d6a8f5704" containerName="heat-cfnapi" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.949674 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc104487-8dce-43e7-8673-1e9d6a8f5704" containerName="heat-cfnapi" Mar 13 09:33:11 crc kubenswrapper[4841]: E0313 09:33:11.949682 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e96c8f-e077-499b-9b31-3984ad159364" containerName="mariadb-database-create" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.949688 4841 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="69e96c8f-e077-499b-9b31-3984ad159364" containerName="mariadb-database-create" Mar 13 09:33:11 crc kubenswrapper[4841]: E0313 09:33:11.949706 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10eed5d3-9fc4-4322-b9ef-c299a32a94bb" containerName="mariadb-database-create" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.949713 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="10eed5d3-9fc4-4322-b9ef-c299a32a94bb" containerName="mariadb-database-create" Mar 13 09:33:11 crc kubenswrapper[4841]: E0313 09:33:11.949731 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ea1b844-20ae-4d19-8df2-3d472391080a" containerName="mariadb-account-create-update" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.949737 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ea1b844-20ae-4d19-8df2-3d472391080a" containerName="mariadb-account-create-update" Mar 13 09:33:11 crc kubenswrapper[4841]: E0313 09:33:11.949746 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f9980f7-5596-4002-b087-04cb53c6a78a" containerName="mariadb-database-create" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.949751 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9980f7-5596-4002-b087-04cb53c6a78a" containerName="mariadb-database-create" Mar 13 09:33:11 crc kubenswrapper[4841]: E0313 09:33:11.949761 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb74819f-9ae9-498b-88b5-f0fcaf598409" containerName="placement-api" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.949767 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb74819f-9ae9-498b-88b5-f0fcaf598409" containerName="placement-api" Mar 13 09:33:11 crc kubenswrapper[4841]: E0313 09:33:11.949784 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d1db8e-6619-436b-b998-cea606b30b5c" containerName="mariadb-account-create-update" Mar 13 09:33:11 crc 
kubenswrapper[4841]: I0313 09:33:11.949790 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d1db8e-6619-436b-b998-cea606b30b5c" containerName="mariadb-account-create-update" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.949948 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc104487-8dce-43e7-8673-1e9d6a8f5704" containerName="heat-cfnapi" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.949957 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="795b0aeb-4a18-40aa-88b9-9e8b9130e0a6" containerName="heat-api" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.949966 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf0add86-8160-48dd-a9b5-bef3edcf4810" containerName="mariadb-account-create-update" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.949980 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="10eed5d3-9fc4-4322-b9ef-c299a32a94bb" containerName="mariadb-database-create" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.949990 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb74819f-9ae9-498b-88b5-f0fcaf598409" containerName="placement-log" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.950002 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="10d1db8e-6619-436b-b998-cea606b30b5c" containerName="mariadb-account-create-update" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.950009 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e96c8f-e077-499b-9b31-3984ad159364" containerName="mariadb-database-create" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.950019 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f9980f7-5596-4002-b087-04cb53c6a78a" containerName="mariadb-database-create" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.950033 4841 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8ea1b844-20ae-4d19-8df2-3d472391080a" containerName="mariadb-account-create-update" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.950045 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb74819f-9ae9-498b-88b5-f0fcaf598409" containerName="placement-api" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.950597 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-24ghd" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.952549 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2zgvk" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.952558 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.952583 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 13 09:33:11 crc kubenswrapper[4841]: I0313 09:33:11.962686 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-24ghd"] Mar 13 09:33:12 crc kubenswrapper[4841]: I0313 09:33:12.005755 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb74819f-9ae9-498b-88b5-f0fcaf598409" path="/var/lib/kubelet/pods/eb74819f-9ae9-498b-88b5-f0fcaf598409/volumes" Mar 13 09:33:12 crc kubenswrapper[4841]: I0313 09:33:12.036286 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7def500c-116f-47bb-b58d-23e07d7a0771-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-24ghd\" (UID: \"7def500c-116f-47bb-b58d-23e07d7a0771\") " pod="openstack/nova-cell0-conductor-db-sync-24ghd" Mar 13 09:33:12 crc kubenswrapper[4841]: I0313 09:33:12.036464 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7def500c-116f-47bb-b58d-23e07d7a0771-config-data\") pod \"nova-cell0-conductor-db-sync-24ghd\" (UID: \"7def500c-116f-47bb-b58d-23e07d7a0771\") " pod="openstack/nova-cell0-conductor-db-sync-24ghd" Mar 13 09:33:12 crc kubenswrapper[4841]: I0313 09:33:12.036496 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkb7g\" (UniqueName: \"kubernetes.io/projected/7def500c-116f-47bb-b58d-23e07d7a0771-kube-api-access-rkb7g\") pod \"nova-cell0-conductor-db-sync-24ghd\" (UID: \"7def500c-116f-47bb-b58d-23e07d7a0771\") " pod="openstack/nova-cell0-conductor-db-sync-24ghd" Mar 13 09:33:12 crc kubenswrapper[4841]: I0313 09:33:12.036627 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7def500c-116f-47bb-b58d-23e07d7a0771-scripts\") pod \"nova-cell0-conductor-db-sync-24ghd\" (UID: \"7def500c-116f-47bb-b58d-23e07d7a0771\") " pod="openstack/nova-cell0-conductor-db-sync-24ghd" Mar 13 09:33:12 crc kubenswrapper[4841]: I0313 09:33:12.111443 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 09:33:12 crc kubenswrapper[4841]: I0313 09:33:12.111803 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 09:33:12 crc kubenswrapper[4841]: I0313 09:33:12.138554 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7def500c-116f-47bb-b58d-23e07d7a0771-config-data\") pod \"nova-cell0-conductor-db-sync-24ghd\" (UID: \"7def500c-116f-47bb-b58d-23e07d7a0771\") " pod="openstack/nova-cell0-conductor-db-sync-24ghd" Mar 13 09:33:12 crc kubenswrapper[4841]: I0313 09:33:12.138599 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-rkb7g\" (UniqueName: \"kubernetes.io/projected/7def500c-116f-47bb-b58d-23e07d7a0771-kube-api-access-rkb7g\") pod \"nova-cell0-conductor-db-sync-24ghd\" (UID: \"7def500c-116f-47bb-b58d-23e07d7a0771\") " pod="openstack/nova-cell0-conductor-db-sync-24ghd" Mar 13 09:33:12 crc kubenswrapper[4841]: I0313 09:33:12.138681 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7def500c-116f-47bb-b58d-23e07d7a0771-scripts\") pod \"nova-cell0-conductor-db-sync-24ghd\" (UID: \"7def500c-116f-47bb-b58d-23e07d7a0771\") " pod="openstack/nova-cell0-conductor-db-sync-24ghd" Mar 13 09:33:12 crc kubenswrapper[4841]: I0313 09:33:12.138727 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7def500c-116f-47bb-b58d-23e07d7a0771-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-24ghd\" (UID: \"7def500c-116f-47bb-b58d-23e07d7a0771\") " pod="openstack/nova-cell0-conductor-db-sync-24ghd" Mar 13 09:33:12 crc kubenswrapper[4841]: I0313 09:33:12.145893 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7def500c-116f-47bb-b58d-23e07d7a0771-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-24ghd\" (UID: \"7def500c-116f-47bb-b58d-23e07d7a0771\") " pod="openstack/nova-cell0-conductor-db-sync-24ghd" Mar 13 09:33:12 crc kubenswrapper[4841]: I0313 09:33:12.148625 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7def500c-116f-47bb-b58d-23e07d7a0771-scripts\") pod \"nova-cell0-conductor-db-sync-24ghd\" (UID: \"7def500c-116f-47bb-b58d-23e07d7a0771\") " pod="openstack/nova-cell0-conductor-db-sync-24ghd" Mar 13 09:33:12 crc kubenswrapper[4841]: I0313 09:33:12.151695 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Mar 13 09:33:12 crc kubenswrapper[4841]: I0313 09:33:12.159225 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7def500c-116f-47bb-b58d-23e07d7a0771-config-data\") pod \"nova-cell0-conductor-db-sync-24ghd\" (UID: \"7def500c-116f-47bb-b58d-23e07d7a0771\") " pod="openstack/nova-cell0-conductor-db-sync-24ghd" Mar 13 09:33:12 crc kubenswrapper[4841]: I0313 09:33:12.162303 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkb7g\" (UniqueName: \"kubernetes.io/projected/7def500c-116f-47bb-b58d-23e07d7a0771-kube-api-access-rkb7g\") pod \"nova-cell0-conductor-db-sync-24ghd\" (UID: \"7def500c-116f-47bb-b58d-23e07d7a0771\") " pod="openstack/nova-cell0-conductor-db-sync-24ghd" Mar 13 09:33:12 crc kubenswrapper[4841]: I0313 09:33:12.186145 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 09:33:12 crc kubenswrapper[4841]: I0313 09:33:12.270298 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-24ghd" Mar 13 09:33:12 crc kubenswrapper[4841]: I0313 09:33:12.496962 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 09:33:12 crc kubenswrapper[4841]: I0313 09:33:12.497439 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 09:33:12 crc kubenswrapper[4841]: I0313 09:33:12.804858 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-24ghd"] Mar 13 09:33:13 crc kubenswrapper[4841]: I0313 09:33:13.526441 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-24ghd" event={"ID":"7def500c-116f-47bb-b58d-23e07d7a0771","Type":"ContainerStarted","Data":"3332fcf3631eebb4f0af87e9101e51d14c913bcd03c0415ed555d3633b61a6e5"} Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.546034 4841 generic.go:334] "Generic (PLEG): container finished" podID="094f4da7-e278-4c06-aad4-b10985b42c76" containerID="3870f0d0fbd4da486d2eca09f97ce8e15e2c5081bc7472f9433f445b6a4b9136" exitCode=0 Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.546643 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"094f4da7-e278-4c06-aad4-b10985b42c76","Type":"ContainerDied","Data":"3870f0d0fbd4da486d2eca09f97ce8e15e2c5081bc7472f9433f445b6a4b9136"} Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.687401 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.717517 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.717617 4841 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.720625 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.804984 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-combined-ca-bundle\") pod \"094f4da7-e278-4c06-aad4-b10985b42c76\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.805117 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/094f4da7-e278-4c06-aad4-b10985b42c76-log-httpd\") pod \"094f4da7-e278-4c06-aad4-b10985b42c76\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.805182 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-sg-core-conf-yaml\") pod \"094f4da7-e278-4c06-aad4-b10985b42c76\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.805282 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k5z7\" (UniqueName: \"kubernetes.io/projected/094f4da7-e278-4c06-aad4-b10985b42c76-kube-api-access-2k5z7\") pod \"094f4da7-e278-4c06-aad4-b10985b42c76\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " Mar 13 
09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.805329 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-config-data\") pod \"094f4da7-e278-4c06-aad4-b10985b42c76\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.805403 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-scripts\") pod \"094f4da7-e278-4c06-aad4-b10985b42c76\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.805451 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/094f4da7-e278-4c06-aad4-b10985b42c76-run-httpd\") pod \"094f4da7-e278-4c06-aad4-b10985b42c76\" (UID: \"094f4da7-e278-4c06-aad4-b10985b42c76\") " Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.811806 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/094f4da7-e278-4c06-aad4-b10985b42c76-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "094f4da7-e278-4c06-aad4-b10985b42c76" (UID: "094f4da7-e278-4c06-aad4-b10985b42c76"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.814477 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-scripts" (OuterVolumeSpecName: "scripts") pod "094f4da7-e278-4c06-aad4-b10985b42c76" (UID: "094f4da7-e278-4c06-aad4-b10985b42c76"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.815653 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/094f4da7-e278-4c06-aad4-b10985b42c76-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "094f4da7-e278-4c06-aad4-b10985b42c76" (UID: "094f4da7-e278-4c06-aad4-b10985b42c76"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.852419 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/094f4da7-e278-4c06-aad4-b10985b42c76-kube-api-access-2k5z7" (OuterVolumeSpecName: "kube-api-access-2k5z7") pod "094f4da7-e278-4c06-aad4-b10985b42c76" (UID: "094f4da7-e278-4c06-aad4-b10985b42c76"). InnerVolumeSpecName "kube-api-access-2k5z7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.860229 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "094f4da7-e278-4c06-aad4-b10985b42c76" (UID: "094f4da7-e278-4c06-aad4-b10985b42c76"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.908398 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/094f4da7-e278-4c06-aad4-b10985b42c76-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.908426 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/094f4da7-e278-4c06-aad4-b10985b42c76-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.908435 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.908446 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k5z7\" (UniqueName: \"kubernetes.io/projected/094f4da7-e278-4c06-aad4-b10985b42c76-kube-api-access-2k5z7\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.908454 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.949286 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-config-data" (OuterVolumeSpecName: "config-data") pod "094f4da7-e278-4c06-aad4-b10985b42c76" (UID: "094f4da7-e278-4c06-aad4-b10985b42c76"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:14 crc kubenswrapper[4841]: I0313 09:33:14.951437 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "094f4da7-e278-4c06-aad4-b10985b42c76" (UID: "094f4da7-e278-4c06-aad4-b10985b42c76"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.010239 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.010297 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/094f4da7-e278-4c06-aad4-b10985b42c76-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:15 crc kubenswrapper[4841]: E0313 09:33:15.390275 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba488eb98af2da69fb67e49f29400ebf29148da2dc0062b02f61f3a0040bf29f" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 09:33:15 crc kubenswrapper[4841]: E0313 09:33:15.400454 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba488eb98af2da69fb67e49f29400ebf29148da2dc0062b02f61f3a0040bf29f" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 09:33:15 crc kubenswrapper[4841]: E0313 09:33:15.402822 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot 
register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba488eb98af2da69fb67e49f29400ebf29148da2dc0062b02f61f3a0040bf29f" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 09:33:15 crc kubenswrapper[4841]: E0313 09:33:15.402888 4841 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-76777bf9d9-sc2jp" podUID="9d53e19b-a3e7-40de-a06b-d6e1c0def922" containerName="heat-engine" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.564326 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"094f4da7-e278-4c06-aad4-b10985b42c76","Type":"ContainerDied","Data":"3b7bf691a0c944ecd6b7a2df72cc597f1b9e22c15d1e947644de7c80aab7f1e8"} Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.564397 4841 scope.go:117] "RemoveContainer" containerID="45370872ad33ca0871dcec57bc318277fc03a75bef895a1c17202a1cd11caf5d" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.564403 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.606059 4841 scope.go:117] "RemoveContainer" containerID="64a1276e570103a751e45265ef7aef31f82d447254bed0a15bae954729ff33a1" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.628402 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.640028 4841 scope.go:117] "RemoveContainer" containerID="ca0d3b0eda8cd3579598eb8da552e2b0a5b77b4e2557c72bef7cb07b7f0cea19" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.654384 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.696189 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:33:15 crc kubenswrapper[4841]: E0313 09:33:15.701788 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="094f4da7-e278-4c06-aad4-b10985b42c76" containerName="ceilometer-central-agent" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.701844 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="094f4da7-e278-4c06-aad4-b10985b42c76" containerName="ceilometer-central-agent" Mar 13 09:33:15 crc kubenswrapper[4841]: E0313 09:33:15.701873 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="094f4da7-e278-4c06-aad4-b10985b42c76" containerName="proxy-httpd" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.701886 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="094f4da7-e278-4c06-aad4-b10985b42c76" containerName="proxy-httpd" Mar 13 09:33:15 crc kubenswrapper[4841]: E0313 09:33:15.701906 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="094f4da7-e278-4c06-aad4-b10985b42c76" containerName="ceilometer-notification-agent" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.701914 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="094f4da7-e278-4c06-aad4-b10985b42c76" containerName="ceilometer-notification-agent" Mar 13 09:33:15 crc kubenswrapper[4841]: E0313 09:33:15.701955 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="795b0aeb-4a18-40aa-88b9-9e8b9130e0a6" containerName="heat-api" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.701963 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="795b0aeb-4a18-40aa-88b9-9e8b9130e0a6" containerName="heat-api" Mar 13 09:33:15 crc kubenswrapper[4841]: E0313 09:33:15.701980 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="094f4da7-e278-4c06-aad4-b10985b42c76" containerName="sg-core" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.701991 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="094f4da7-e278-4c06-aad4-b10985b42c76" containerName="sg-core" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.702648 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="094f4da7-e278-4c06-aad4-b10985b42c76" containerName="ceilometer-notification-agent" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.702669 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="094f4da7-e278-4c06-aad4-b10985b42c76" containerName="ceilometer-central-agent" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.702688 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc104487-8dce-43e7-8673-1e9d6a8f5704" containerName="heat-cfnapi" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.702705 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="795b0aeb-4a18-40aa-88b9-9e8b9130e0a6" containerName="heat-api" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.702735 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="094f4da7-e278-4c06-aad4-b10985b42c76" containerName="proxy-httpd" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.702745 4841 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="094f4da7-e278-4c06-aad4-b10985b42c76" containerName="sg-core" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.714485 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.717450 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.717676 4841 scope.go:117] "RemoveContainer" containerID="3870f0d0fbd4da486d2eca09f97ce8e15e2c5081bc7472f9433f445b6a4b9136" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.720117 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.760943 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.830723 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-run-httpd\") pod \"ceilometer-0\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " pod="openstack/ceilometer-0" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.830842 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-log-httpd\") pod \"ceilometer-0\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " pod="openstack/ceilometer-0" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.830883 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " 
pod="openstack/ceilometer-0" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.830908 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-config-data\") pod \"ceilometer-0\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " pod="openstack/ceilometer-0" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.831051 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " pod="openstack/ceilometer-0" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.831151 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnvbn\" (UniqueName: \"kubernetes.io/projected/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-kube-api-access-cnvbn\") pod \"ceilometer-0\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " pod="openstack/ceilometer-0" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.831219 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-scripts\") pod \"ceilometer-0\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " pod="openstack/ceilometer-0" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.933225 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-run-httpd\") pod \"ceilometer-0\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " pod="openstack/ceilometer-0" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.933352 4841 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-log-httpd\") pod \"ceilometer-0\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " pod="openstack/ceilometer-0" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.933391 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " pod="openstack/ceilometer-0" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.933412 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-config-data\") pod \"ceilometer-0\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " pod="openstack/ceilometer-0" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.933450 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " pod="openstack/ceilometer-0" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.933487 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnvbn\" (UniqueName: \"kubernetes.io/projected/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-kube-api-access-cnvbn\") pod \"ceilometer-0\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " pod="openstack/ceilometer-0" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.933521 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-scripts\") pod \"ceilometer-0\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " 
pod="openstack/ceilometer-0" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.933867 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-run-httpd\") pod \"ceilometer-0\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " pod="openstack/ceilometer-0" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.935185 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-log-httpd\") pod \"ceilometer-0\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " pod="openstack/ceilometer-0" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.938736 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " pod="openstack/ceilometer-0" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.939439 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-config-data\") pod \"ceilometer-0\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " pod="openstack/ceilometer-0" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.946094 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-scripts\") pod \"ceilometer-0\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " pod="openstack/ceilometer-0" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.961052 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " pod="openstack/ceilometer-0" Mar 13 09:33:15 crc kubenswrapper[4841]: I0313 09:33:15.962298 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnvbn\" (UniqueName: \"kubernetes.io/projected/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-kube-api-access-cnvbn\") pod \"ceilometer-0\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " pod="openstack/ceilometer-0" Mar 13 09:33:16 crc kubenswrapper[4841]: I0313 09:33:16.011184 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="094f4da7-e278-4c06-aad4-b10985b42c76" path="/var/lib/kubelet/pods/094f4da7-e278-4c06-aad4-b10985b42c76/volumes" Mar 13 09:33:16 crc kubenswrapper[4841]: I0313 09:33:16.059779 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:33:16 crc kubenswrapper[4841]: I0313 09:33:16.598812 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:33:17 crc kubenswrapper[4841]: I0313 09:33:17.593665 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13","Type":"ContainerStarted","Data":"03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce"} Mar 13 09:33:17 crc kubenswrapper[4841]: I0313 09:33:17.594366 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13","Type":"ContainerStarted","Data":"8d117ab69e76f1a0ce536638b9f1b2bf65e306b4df3d3ce7f2c26db38edd92a8"} Mar 13 09:33:23 crc kubenswrapper[4841]: I0313 09:33:23.681406 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13","Type":"ContainerStarted","Data":"95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66"} Mar 13 09:33:23 crc kubenswrapper[4841]: I0313 09:33:23.681852 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13","Type":"ContainerStarted","Data":"427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3"} Mar 13 09:33:23 crc kubenswrapper[4841]: I0313 09:33:23.683473 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-24ghd" event={"ID":"7def500c-116f-47bb-b58d-23e07d7a0771","Type":"ContainerStarted","Data":"5eda587d3f4512efa865576c43920f269a3499d1015651042bb6fbd1ac0e9406"} Mar 13 09:33:23 crc kubenswrapper[4841]: I0313 09:33:23.703192 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-24ghd" podStartSLOduration=2.887106127 podStartE2EDuration="12.703168483s" podCreationTimestamp="2026-03-13 09:33:11 +0000 UTC" firstStartedPulling="2026-03-13 09:33:12.8109 +0000 UTC m=+1275.540800191" lastFinishedPulling="2026-03-13 09:33:22.626962356 +0000 UTC m=+1285.356862547" observedRunningTime="2026-03-13 09:33:23.697403053 +0000 UTC m=+1286.427303244" watchObservedRunningTime="2026-03-13 09:33:23.703168483 +0000 UTC m=+1286.433068674" Mar 13 09:33:25 crc kubenswrapper[4841]: E0313 09:33:25.395802 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba488eb98af2da69fb67e49f29400ebf29148da2dc0062b02f61f3a0040bf29f" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 09:33:25 crc kubenswrapper[4841]: E0313 09:33:25.397868 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba488eb98af2da69fb67e49f29400ebf29148da2dc0062b02f61f3a0040bf29f" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 09:33:25 crc kubenswrapper[4841]: 
E0313 09:33:25.403317 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba488eb98af2da69fb67e49f29400ebf29148da2dc0062b02f61f3a0040bf29f" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 09:33:25 crc kubenswrapper[4841]: E0313 09:33:25.403347 4841 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-76777bf9d9-sc2jp" podUID="9d53e19b-a3e7-40de-a06b-d6e1c0def922" containerName="heat-engine" Mar 13 09:33:25 crc kubenswrapper[4841]: I0313 09:33:25.703290 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13","Type":"ContainerStarted","Data":"edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472"} Mar 13 09:33:25 crc kubenswrapper[4841]: I0313 09:33:25.703537 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 09:33:25 crc kubenswrapper[4841]: I0313 09:33:25.730147 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.200903202 podStartE2EDuration="10.730123235s" podCreationTimestamp="2026-03-13 09:33:15 +0000 UTC" firstStartedPulling="2026-03-13 09:33:16.612141062 +0000 UTC m=+1279.342041253" lastFinishedPulling="2026-03-13 09:33:25.141361105 +0000 UTC m=+1287.871261286" observedRunningTime="2026-03-13 09:33:25.725598443 +0000 UTC m=+1288.455498644" watchObservedRunningTime="2026-03-13 09:33:25.730123235 +0000 UTC m=+1288.460023426" Mar 13 09:33:26 crc kubenswrapper[4841]: E0313 09:33:26.025480 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d53e19b_a3e7_40de_a06b_d6e1c0def922.slice/crio-conmon-ba488eb98af2da69fb67e49f29400ebf29148da2dc0062b02f61f3a0040bf29f.scope\": RecentStats: unable to find data in memory cache]" Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.050419 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.448208 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-76777bf9d9-sc2jp" Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.553970 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5s6g\" (UniqueName: \"kubernetes.io/projected/9d53e19b-a3e7-40de-a06b-d6e1c0def922-kube-api-access-l5s6g\") pod \"9d53e19b-a3e7-40de-a06b-d6e1c0def922\" (UID: \"9d53e19b-a3e7-40de-a06b-d6e1c0def922\") " Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.554040 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d53e19b-a3e7-40de-a06b-d6e1c0def922-combined-ca-bundle\") pod \"9d53e19b-a3e7-40de-a06b-d6e1c0def922\" (UID: \"9d53e19b-a3e7-40de-a06b-d6e1c0def922\") " Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.554128 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d53e19b-a3e7-40de-a06b-d6e1c0def922-config-data-custom\") pod \"9d53e19b-a3e7-40de-a06b-d6e1c0def922\" (UID: \"9d53e19b-a3e7-40de-a06b-d6e1c0def922\") " Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.554286 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d53e19b-a3e7-40de-a06b-d6e1c0def922-config-data\") pod \"9d53e19b-a3e7-40de-a06b-d6e1c0def922\" (UID: 
\"9d53e19b-a3e7-40de-a06b-d6e1c0def922\") " Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.559945 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d53e19b-a3e7-40de-a06b-d6e1c0def922-kube-api-access-l5s6g" (OuterVolumeSpecName: "kube-api-access-l5s6g") pod "9d53e19b-a3e7-40de-a06b-d6e1c0def922" (UID: "9d53e19b-a3e7-40de-a06b-d6e1c0def922"). InnerVolumeSpecName "kube-api-access-l5s6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.575495 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d53e19b-a3e7-40de-a06b-d6e1c0def922-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9d53e19b-a3e7-40de-a06b-d6e1c0def922" (UID: "9d53e19b-a3e7-40de-a06b-d6e1c0def922"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.597918 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d53e19b-a3e7-40de-a06b-d6e1c0def922-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d53e19b-a3e7-40de-a06b-d6e1c0def922" (UID: "9d53e19b-a3e7-40de-a06b-d6e1c0def922"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.623436 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d53e19b-a3e7-40de-a06b-d6e1c0def922-config-data" (OuterVolumeSpecName: "config-data") pod "9d53e19b-a3e7-40de-a06b-d6e1c0def922" (UID: "9d53e19b-a3e7-40de-a06b-d6e1c0def922"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.656335 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d53e19b-a3e7-40de-a06b-d6e1c0def922-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.656371 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5s6g\" (UniqueName: \"kubernetes.io/projected/9d53e19b-a3e7-40de-a06b-d6e1c0def922-kube-api-access-l5s6g\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.656392 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d53e19b-a3e7-40de-a06b-d6e1c0def922-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.656403 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d53e19b-a3e7-40de-a06b-d6e1c0def922-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.713771 4841 generic.go:334] "Generic (PLEG): container finished" podID="9d53e19b-a3e7-40de-a06b-d6e1c0def922" containerID="ba488eb98af2da69fb67e49f29400ebf29148da2dc0062b02f61f3a0040bf29f" exitCode=0 Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.714614 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-76777bf9d9-sc2jp" Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.718431 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-76777bf9d9-sc2jp" event={"ID":"9d53e19b-a3e7-40de-a06b-d6e1c0def922","Type":"ContainerDied","Data":"ba488eb98af2da69fb67e49f29400ebf29148da2dc0062b02f61f3a0040bf29f"} Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.718508 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-76777bf9d9-sc2jp" event={"ID":"9d53e19b-a3e7-40de-a06b-d6e1c0def922","Type":"ContainerDied","Data":"c6c9bca2ea71951b782faa8ba13bed7310745f9d9517743884778cf8816624f4"} Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.718535 4841 scope.go:117] "RemoveContainer" containerID="ba488eb98af2da69fb67e49f29400ebf29148da2dc0062b02f61f3a0040bf29f" Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.749652 4841 scope.go:117] "RemoveContainer" containerID="ba488eb98af2da69fb67e49f29400ebf29148da2dc0062b02f61f3a0040bf29f" Mar 13 09:33:26 crc kubenswrapper[4841]: E0313 09:33:26.749998 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba488eb98af2da69fb67e49f29400ebf29148da2dc0062b02f61f3a0040bf29f\": container with ID starting with ba488eb98af2da69fb67e49f29400ebf29148da2dc0062b02f61f3a0040bf29f not found: ID does not exist" containerID="ba488eb98af2da69fb67e49f29400ebf29148da2dc0062b02f61f3a0040bf29f" Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.750036 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba488eb98af2da69fb67e49f29400ebf29148da2dc0062b02f61f3a0040bf29f"} err="failed to get container status \"ba488eb98af2da69fb67e49f29400ebf29148da2dc0062b02f61f3a0040bf29f\": rpc error: code = NotFound desc = could not find container \"ba488eb98af2da69fb67e49f29400ebf29148da2dc0062b02f61f3a0040bf29f\": container with 
ID starting with ba488eb98af2da69fb67e49f29400ebf29148da2dc0062b02f61f3a0040bf29f not found: ID does not exist" Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.756428 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-76777bf9d9-sc2jp"] Mar 13 09:33:26 crc kubenswrapper[4841]: I0313 09:33:26.767742 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-76777bf9d9-sc2jp"] Mar 13 09:33:27 crc kubenswrapper[4841]: I0313 09:33:27.728218 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" containerName="ceilometer-central-agent" containerID="cri-o://03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce" gracePeriod=30 Mar 13 09:33:27 crc kubenswrapper[4841]: I0313 09:33:27.728324 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" containerName="ceilometer-notification-agent" containerID="cri-o://427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3" gracePeriod=30 Mar 13 09:33:27 crc kubenswrapper[4841]: I0313 09:33:27.728648 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" containerName="proxy-httpd" containerID="cri-o://edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472" gracePeriod=30 Mar 13 09:33:27 crc kubenswrapper[4841]: I0313 09:33:27.731105 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" containerName="sg-core" containerID="cri-o://95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66" gracePeriod=30 Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.008941 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9d53e19b-a3e7-40de-a06b-d6e1c0def922" path="/var/lib/kubelet/pods/9d53e19b-a3e7-40de-a06b-d6e1c0def922/volumes" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.547142 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.589479 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-log-httpd\") pod \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.589865 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-config-data\") pod \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.589965 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" (UID: "e6b9186d-c2c7-47f7-93f2-fbc943e2fe13"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.589988 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-scripts\") pod \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.590215 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-run-httpd\") pod \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.590381 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-combined-ca-bundle\") pod \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.590481 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" (UID: "e6b9186d-c2c7-47f7-93f2-fbc943e2fe13"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.590585 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnvbn\" (UniqueName: \"kubernetes.io/projected/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-kube-api-access-cnvbn\") pod \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.590743 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-sg-core-conf-yaml\") pod \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\" (UID: \"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13\") " Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.591312 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.591419 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.594858 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-scripts" (OuterVolumeSpecName: "scripts") pod "e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" (UID: "e6b9186d-c2c7-47f7-93f2-fbc943e2fe13"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.598492 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-kube-api-access-cnvbn" (OuterVolumeSpecName: "kube-api-access-cnvbn") pod "e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" (UID: "e6b9186d-c2c7-47f7-93f2-fbc943e2fe13"). InnerVolumeSpecName "kube-api-access-cnvbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.628931 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" (UID: "e6b9186d-c2c7-47f7-93f2-fbc943e2fe13"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.682450 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" (UID: "e6b9186d-c2c7-47f7-93f2-fbc943e2fe13"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.693047 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.693077 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnvbn\" (UniqueName: \"kubernetes.io/projected/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-kube-api-access-cnvbn\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.693087 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.693097 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.707197 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-config-data" (OuterVolumeSpecName: "config-data") pod "e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" (UID: "e6b9186d-c2c7-47f7-93f2-fbc943e2fe13"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.741418 4841 generic.go:334] "Generic (PLEG): container finished" podID="e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" containerID="edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472" exitCode=0 Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.741449 4841 generic.go:334] "Generic (PLEG): container finished" podID="e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" containerID="95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66" exitCode=2 Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.741460 4841 generic.go:334] "Generic (PLEG): container finished" podID="e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" containerID="427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3" exitCode=0 Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.741469 4841 generic.go:334] "Generic (PLEG): container finished" podID="e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" containerID="03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce" exitCode=0 Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.741477 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13","Type":"ContainerDied","Data":"edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472"} Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.741541 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13","Type":"ContainerDied","Data":"95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66"} Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.741552 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13","Type":"ContainerDied","Data":"427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3"} Mar 13 09:33:28 crc 
kubenswrapper[4841]: I0313 09:33:28.741562 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13","Type":"ContainerDied","Data":"03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce"} Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.741573 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e6b9186d-c2c7-47f7-93f2-fbc943e2fe13","Type":"ContainerDied","Data":"8d117ab69e76f1a0ce536638b9f1b2bf65e306b4df3d3ce7f2c26db38edd92a8"} Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.741589 4841 scope.go:117] "RemoveContainer" containerID="edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.742649 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.761008 4841 scope.go:117] "RemoveContainer" containerID="95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.778751 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.787258 4841 scope.go:117] "RemoveContainer" containerID="427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.787806 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.798594 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.804840 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:33:28 
crc kubenswrapper[4841]: E0313 09:33:28.805164 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d53e19b-a3e7-40de-a06b-d6e1c0def922" containerName="heat-engine" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.805178 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d53e19b-a3e7-40de-a06b-d6e1c0def922" containerName="heat-engine" Mar 13 09:33:28 crc kubenswrapper[4841]: E0313 09:33:28.805202 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" containerName="proxy-httpd" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.805209 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" containerName="proxy-httpd" Mar 13 09:33:28 crc kubenswrapper[4841]: E0313 09:33:28.805221 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" containerName="sg-core" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.805227 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" containerName="sg-core" Mar 13 09:33:28 crc kubenswrapper[4841]: E0313 09:33:28.805240 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" containerName="ceilometer-central-agent" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.805246 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" containerName="ceilometer-central-agent" Mar 13 09:33:28 crc kubenswrapper[4841]: E0313 09:33:28.805291 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" containerName="ceilometer-notification-agent" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.805298 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" containerName="ceilometer-notification-agent" Mar 13 09:33:28 
crc kubenswrapper[4841]: I0313 09:33:28.805459 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" containerName="ceilometer-notification-agent" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.805476 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" containerName="proxy-httpd" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.805487 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" containerName="ceilometer-central-agent" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.805498 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d53e19b-a3e7-40de-a06b-d6e1c0def922" containerName="heat-engine" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.805509 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" containerName="sg-core" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.807005 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.809120 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.809329 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.813938 4841 scope.go:117] "RemoveContainer" containerID="03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.823202 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.858127 4841 scope.go:117] "RemoveContainer" containerID="edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472" Mar 13 09:33:28 crc kubenswrapper[4841]: E0313 09:33:28.858775 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472\": container with ID starting with edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472 not found: ID does not exist" containerID="edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.858812 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472"} err="failed to get container status \"edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472\": rpc error: code = NotFound desc = could not find container \"edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472\": container with ID starting with edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472 not found: ID does not exist" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 
09:33:28.858857 4841 scope.go:117] "RemoveContainer" containerID="95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66" Mar 13 09:33:28 crc kubenswrapper[4841]: E0313 09:33:28.859140 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66\": container with ID starting with 95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66 not found: ID does not exist" containerID="95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.859189 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66"} err="failed to get container status \"95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66\": rpc error: code = NotFound desc = could not find container \"95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66\": container with ID starting with 95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66 not found: ID does not exist" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.859207 4841 scope.go:117] "RemoveContainer" containerID="427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3" Mar 13 09:33:28 crc kubenswrapper[4841]: E0313 09:33:28.859602 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3\": container with ID starting with 427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3 not found: ID does not exist" containerID="427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.859632 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3"} err="failed to get container status \"427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3\": rpc error: code = NotFound desc = could not find container \"427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3\": container with ID starting with 427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3 not found: ID does not exist" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.859672 4841 scope.go:117] "RemoveContainer" containerID="03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce" Mar 13 09:33:28 crc kubenswrapper[4841]: E0313 09:33:28.860593 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce\": container with ID starting with 03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce not found: ID does not exist" containerID="03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.860643 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce"} err="failed to get container status \"03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce\": rpc error: code = NotFound desc = could not find container \"03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce\": container with ID starting with 03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce not found: ID does not exist" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.860692 4841 scope.go:117] "RemoveContainer" containerID="edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.860947 4841 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472"} err="failed to get container status \"edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472\": rpc error: code = NotFound desc = could not find container \"edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472\": container with ID starting with edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472 not found: ID does not exist" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.860968 4841 scope.go:117] "RemoveContainer" containerID="95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.861289 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66"} err="failed to get container status \"95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66\": rpc error: code = NotFound desc = could not find container \"95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66\": container with ID starting with 95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66 not found: ID does not exist" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.861310 4841 scope.go:117] "RemoveContainer" containerID="427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3" Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.861580 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3"} err="failed to get container status \"427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3\": rpc error: code = NotFound desc = could not find container \"427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3\": container with ID starting with 427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3 not 
found: ID does not exist"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.861613 4841 scope.go:117] "RemoveContainer" containerID="03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.861830 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce"} err="failed to get container status \"03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce\": rpc error: code = NotFound desc = could not find container \"03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce\": container with ID starting with 03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce not found: ID does not exist"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.861871 4841 scope.go:117] "RemoveContainer" containerID="edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.862094 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472"} err="failed to get container status \"edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472\": rpc error: code = NotFound desc = could not find container \"edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472\": container with ID starting with edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472 not found: ID does not exist"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.862131 4841 scope.go:117] "RemoveContainer" containerID="95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.862334 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66"} err="failed to get container status \"95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66\": rpc error: code = NotFound desc = could not find container \"95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66\": container with ID starting with 95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66 not found: ID does not exist"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.862371 4841 scope.go:117] "RemoveContainer" containerID="427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.862834 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3"} err="failed to get container status \"427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3\": rpc error: code = NotFound desc = could not find container \"427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3\": container with ID starting with 427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3 not found: ID does not exist"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.862874 4841 scope.go:117] "RemoveContainer" containerID="03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.863106 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce"} err="failed to get container status \"03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce\": rpc error: code = NotFound desc = could not find container \"03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce\": container with ID starting with 03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce not found: ID does not exist"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.863130 4841 scope.go:117] "RemoveContainer" containerID="edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.863659 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472"} err="failed to get container status \"edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472\": rpc error: code = NotFound desc = could not find container \"edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472\": container with ID starting with edb93b81d4f500c86d9e74ca2bc01e86be65bd53e0a12027d467a4d4cbaa5472 not found: ID does not exist"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.863680 4841 scope.go:117] "RemoveContainer" containerID="95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.863993 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66"} err="failed to get container status \"95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66\": rpc error: code = NotFound desc = could not find container \"95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66\": container with ID starting with 95b64439db9a0f10641ec643bd4435217aa04e7ec294fac52daf5887ddc02b66 not found: ID does not exist"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.864013 4841 scope.go:117] "RemoveContainer" containerID="427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.864325 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3"} err="failed to get container status \"427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3\": rpc error: code = NotFound desc = could not find container \"427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3\": container with ID starting with 427bd7572464c89939d653d992208bb7a3d80f729a8fc0f3ba01083567e481c3 not found: ID does not exist"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.864359 4841 scope.go:117] "RemoveContainer" containerID="03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.864553 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce"} err="failed to get container status \"03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce\": rpc error: code = NotFound desc = could not find container \"03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce\": container with ID starting with 03fcf7b8009e148aafa38993c2502b1082e5f89f044d51ed52aa2bbc60584fce not found: ID does not exist"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.900557 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpjlr\" (UniqueName: \"kubernetes.io/projected/69841b17-abb2-40af-9923-d61b4818c475-kube-api-access-rpjlr\") pod \"ceilometer-0\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") " pod="openstack/ceilometer-0"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.900614 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") " pod="openstack/ceilometer-0"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.900702 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-config-data\") pod \"ceilometer-0\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") " pod="openstack/ceilometer-0"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.900730 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") " pod="openstack/ceilometer-0"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.900812 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69841b17-abb2-40af-9923-d61b4818c475-run-httpd\") pod \"ceilometer-0\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") " pod="openstack/ceilometer-0"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.900844 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-scripts\") pod \"ceilometer-0\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") " pod="openstack/ceilometer-0"
Mar 13 09:33:28 crc kubenswrapper[4841]: I0313 09:33:28.900870 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69841b17-abb2-40af-9923-d61b4818c475-log-httpd\") pod \"ceilometer-0\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") " pod="openstack/ceilometer-0"
Mar 13 09:33:29 crc kubenswrapper[4841]: I0313 09:33:29.002621 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpjlr\" (UniqueName: \"kubernetes.io/projected/69841b17-abb2-40af-9923-d61b4818c475-kube-api-access-rpjlr\") pod \"ceilometer-0\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") " pod="openstack/ceilometer-0"
Mar 13 09:33:29 crc kubenswrapper[4841]: I0313 09:33:29.002681 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") " pod="openstack/ceilometer-0"
Mar 13 09:33:29 crc kubenswrapper[4841]: I0313 09:33:29.002801 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-config-data\") pod \"ceilometer-0\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") " pod="openstack/ceilometer-0"
Mar 13 09:33:29 crc kubenswrapper[4841]: I0313 09:33:29.002838 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") " pod="openstack/ceilometer-0"
Mar 13 09:33:29 crc kubenswrapper[4841]: I0313 09:33:29.002937 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69841b17-abb2-40af-9923-d61b4818c475-run-httpd\") pod \"ceilometer-0\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") " pod="openstack/ceilometer-0"
Mar 13 09:33:29 crc kubenswrapper[4841]: I0313 09:33:29.002973 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-scripts\") pod \"ceilometer-0\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") " pod="openstack/ceilometer-0"
Mar 13 09:33:29 crc kubenswrapper[4841]: I0313 09:33:29.003002 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69841b17-abb2-40af-9923-d61b4818c475-log-httpd\") pod \"ceilometer-0\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") " pod="openstack/ceilometer-0"
Mar 13 09:33:29 crc kubenswrapper[4841]: I0313 09:33:29.003622 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69841b17-abb2-40af-9923-d61b4818c475-log-httpd\") pod \"ceilometer-0\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") " pod="openstack/ceilometer-0"
Mar 13 09:33:29 crc kubenswrapper[4841]: I0313 09:33:29.003856 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69841b17-abb2-40af-9923-d61b4818c475-run-httpd\") pod \"ceilometer-0\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") " pod="openstack/ceilometer-0"
Mar 13 09:33:29 crc kubenswrapper[4841]: I0313 09:33:29.011208 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") " pod="openstack/ceilometer-0"
Mar 13 09:33:29 crc kubenswrapper[4841]: I0313 09:33:29.011803 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-config-data\") pod \"ceilometer-0\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") " pod="openstack/ceilometer-0"
Mar 13 09:33:29 crc kubenswrapper[4841]: I0313 09:33:29.012425 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-scripts\") pod \"ceilometer-0\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") " pod="openstack/ceilometer-0"
Mar 13 09:33:29 crc kubenswrapper[4841]: I0313 09:33:29.014000 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") " pod="openstack/ceilometer-0"
Mar 13 09:33:29 crc kubenswrapper[4841]: I0313 09:33:29.031193 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpjlr\" (UniqueName: \"kubernetes.io/projected/69841b17-abb2-40af-9923-d61b4818c475-kube-api-access-rpjlr\") pod \"ceilometer-0\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") " pod="openstack/ceilometer-0"
Mar 13 09:33:29 crc kubenswrapper[4841]: I0313 09:33:29.132506 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 09:33:29 crc kubenswrapper[4841]: I0313 09:33:29.642318 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 09:33:29 crc kubenswrapper[4841]: I0313 09:33:29.752197 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69841b17-abb2-40af-9923-d61b4818c475","Type":"ContainerStarted","Data":"fe4ac005638103613f45ecd2f87d483ba9c8073eb91de4b7ea01f362b13a0de2"}
Mar 13 09:33:30 crc kubenswrapper[4841]: I0313 09:33:30.006993 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b9186d-c2c7-47f7-93f2-fbc943e2fe13" path="/var/lib/kubelet/pods/e6b9186d-c2c7-47f7-93f2-fbc943e2fe13/volumes"
Mar 13 09:33:30 crc kubenswrapper[4841]: I0313 09:33:30.762076 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69841b17-abb2-40af-9923-d61b4818c475","Type":"ContainerStarted","Data":"2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4"}
Mar 13 09:33:31 crc kubenswrapper[4841]: I0313 09:33:31.775767 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69841b17-abb2-40af-9923-d61b4818c475","Type":"ContainerStarted","Data":"f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4"}
Mar 13 09:33:32 crc kubenswrapper[4841]: I0313 09:33:32.796597 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69841b17-abb2-40af-9923-d61b4818c475","Type":"ContainerStarted","Data":"98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b"}
Mar 13 09:33:34 crc kubenswrapper[4841]: I0313 09:33:34.815249 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69841b17-abb2-40af-9923-d61b4818c475","Type":"ContainerStarted","Data":"9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311"}
Mar 13 09:33:34 crc kubenswrapper[4841]: I0313 09:33:34.815705 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 13 09:33:34 crc kubenswrapper[4841]: I0313 09:33:34.840484 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.112632063 podStartE2EDuration="6.840458963s" podCreationTimestamp="2026-03-13 09:33:28 +0000 UTC" firstStartedPulling="2026-03-13 09:33:29.640881533 +0000 UTC m=+1292.370781724" lastFinishedPulling="2026-03-13 09:33:34.368708433 +0000 UTC m=+1297.098608624" observedRunningTime="2026-03-13 09:33:34.833569307 +0000 UTC m=+1297.563469508" watchObservedRunningTime="2026-03-13 09:33:34.840458963 +0000 UTC m=+1297.570359164"
Mar 13 09:33:35 crc kubenswrapper[4841]: I0313 09:33:35.494797 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 09:33:35 crc kubenswrapper[4841]: I0313 09:33:35.831816 4841 generic.go:334] "Generic (PLEG): container finished" podID="7def500c-116f-47bb-b58d-23e07d7a0771" containerID="5eda587d3f4512efa865576c43920f269a3499d1015651042bb6fbd1ac0e9406" exitCode=0
Mar 13 09:33:35 crc kubenswrapper[4841]: I0313 09:33:35.833014 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-24ghd" event={"ID":"7def500c-116f-47bb-b58d-23e07d7a0771","Type":"ContainerDied","Data":"5eda587d3f4512efa865576c43920f269a3499d1015651042bb6fbd1ac0e9406"}
Mar 13 09:33:36 crc kubenswrapper[4841]: I0313 09:33:36.855228 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69841b17-abb2-40af-9923-d61b4818c475" containerName="ceilometer-central-agent" containerID="cri-o://2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4" gracePeriod=30
Mar 13 09:33:36 crc kubenswrapper[4841]: I0313 09:33:36.856047 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69841b17-abb2-40af-9923-d61b4818c475" containerName="proxy-httpd" containerID="cri-o://9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311" gracePeriod=30
Mar 13 09:33:36 crc kubenswrapper[4841]: I0313 09:33:36.856348 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69841b17-abb2-40af-9923-d61b4818c475" containerName="ceilometer-notification-agent" containerID="cri-o://f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4" gracePeriod=30
Mar 13 09:33:36 crc kubenswrapper[4841]: I0313 09:33:36.856492 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69841b17-abb2-40af-9923-d61b4818c475" containerName="sg-core" containerID="cri-o://98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b" gracePeriod=30
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.323457 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-24ghd"
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.485691 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7def500c-116f-47bb-b58d-23e07d7a0771-combined-ca-bundle\") pod \"7def500c-116f-47bb-b58d-23e07d7a0771\" (UID: \"7def500c-116f-47bb-b58d-23e07d7a0771\") "
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.485876 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7def500c-116f-47bb-b58d-23e07d7a0771-config-data\") pod \"7def500c-116f-47bb-b58d-23e07d7a0771\" (UID: \"7def500c-116f-47bb-b58d-23e07d7a0771\") "
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.485903 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkb7g\" (UniqueName: \"kubernetes.io/projected/7def500c-116f-47bb-b58d-23e07d7a0771-kube-api-access-rkb7g\") pod \"7def500c-116f-47bb-b58d-23e07d7a0771\" (UID: \"7def500c-116f-47bb-b58d-23e07d7a0771\") "
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.485928 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7def500c-116f-47bb-b58d-23e07d7a0771-scripts\") pod \"7def500c-116f-47bb-b58d-23e07d7a0771\" (UID: \"7def500c-116f-47bb-b58d-23e07d7a0771\") "
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.492195 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7def500c-116f-47bb-b58d-23e07d7a0771-kube-api-access-rkb7g" (OuterVolumeSpecName: "kube-api-access-rkb7g") pod "7def500c-116f-47bb-b58d-23e07d7a0771" (UID: "7def500c-116f-47bb-b58d-23e07d7a0771"). InnerVolumeSpecName "kube-api-access-rkb7g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.493234 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7def500c-116f-47bb-b58d-23e07d7a0771-scripts" (OuterVolumeSpecName: "scripts") pod "7def500c-116f-47bb-b58d-23e07d7a0771" (UID: "7def500c-116f-47bb-b58d-23e07d7a0771"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.519799 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.525408 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7def500c-116f-47bb-b58d-23e07d7a0771-config-data" (OuterVolumeSpecName: "config-data") pod "7def500c-116f-47bb-b58d-23e07d7a0771" (UID: "7def500c-116f-47bb-b58d-23e07d7a0771"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.532797 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7def500c-116f-47bb-b58d-23e07d7a0771-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7def500c-116f-47bb-b58d-23e07d7a0771" (UID: "7def500c-116f-47bb-b58d-23e07d7a0771"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.587425 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-combined-ca-bundle\") pod \"69841b17-abb2-40af-9923-d61b4818c475\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") "
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.587475 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-config-data\") pod \"69841b17-abb2-40af-9923-d61b4818c475\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") "
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.587515 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpjlr\" (UniqueName: \"kubernetes.io/projected/69841b17-abb2-40af-9923-d61b4818c475-kube-api-access-rpjlr\") pod \"69841b17-abb2-40af-9923-d61b4818c475\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") "
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.587532 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-sg-core-conf-yaml\") pod \"69841b17-abb2-40af-9923-d61b4818c475\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") "
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.587598 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69841b17-abb2-40af-9923-d61b4818c475-log-httpd\") pod \"69841b17-abb2-40af-9923-d61b4818c475\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") "
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.587698 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69841b17-abb2-40af-9923-d61b4818c475-run-httpd\") pod \"69841b17-abb2-40af-9923-d61b4818c475\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") "
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.587807 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-scripts\") pod \"69841b17-abb2-40af-9923-d61b4818c475\" (UID: \"69841b17-abb2-40af-9923-d61b4818c475\") "
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.588200 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7def500c-116f-47bb-b58d-23e07d7a0771-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.588217 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkb7g\" (UniqueName: \"kubernetes.io/projected/7def500c-116f-47bb-b58d-23e07d7a0771-kube-api-access-rkb7g\") on node \"crc\" DevicePath \"\""
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.588229 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7def500c-116f-47bb-b58d-23e07d7a0771-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.588239 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7def500c-116f-47bb-b58d-23e07d7a0771-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.588562 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69841b17-abb2-40af-9923-d61b4818c475-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "69841b17-abb2-40af-9923-d61b4818c475" (UID: "69841b17-abb2-40af-9923-d61b4818c475"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.588720 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69841b17-abb2-40af-9923-d61b4818c475-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "69841b17-abb2-40af-9923-d61b4818c475" (UID: "69841b17-abb2-40af-9923-d61b4818c475"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.590662 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-scripts" (OuterVolumeSpecName: "scripts") pod "69841b17-abb2-40af-9923-d61b4818c475" (UID: "69841b17-abb2-40af-9923-d61b4818c475"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.593188 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69841b17-abb2-40af-9923-d61b4818c475-kube-api-access-rpjlr" (OuterVolumeSpecName: "kube-api-access-rpjlr") pod "69841b17-abb2-40af-9923-d61b4818c475" (UID: "69841b17-abb2-40af-9923-d61b4818c475"). InnerVolumeSpecName "kube-api-access-rpjlr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.611684 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "69841b17-abb2-40af-9923-d61b4818c475" (UID: "69841b17-abb2-40af-9923-d61b4818c475"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.672093 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69841b17-abb2-40af-9923-d61b4818c475" (UID: "69841b17-abb2-40af-9923-d61b4818c475"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.691783 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69841b17-abb2-40af-9923-d61b4818c475-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.691813 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.691822 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.691833 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpjlr\" (UniqueName: \"kubernetes.io/projected/69841b17-abb2-40af-9923-d61b4818c475-kube-api-access-rpjlr\") on node \"crc\" DevicePath \"\""
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.691842 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.691850 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69841b17-abb2-40af-9923-d61b4818c475-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.699453 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-config-data" (OuterVolumeSpecName: "config-data") pod "69841b17-abb2-40af-9923-d61b4818c475" (UID: "69841b17-abb2-40af-9923-d61b4818c475"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.793641 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69841b17-abb2-40af-9923-d61b4818c475-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.867049 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-24ghd"
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.867142 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-24ghd" event={"ID":"7def500c-116f-47bb-b58d-23e07d7a0771","Type":"ContainerDied","Data":"3332fcf3631eebb4f0af87e9101e51d14c913bcd03c0415ed555d3633b61a6e5"}
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.867209 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3332fcf3631eebb4f0af87e9101e51d14c913bcd03c0415ed555d3633b61a6e5"
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.870975 4841 generic.go:334] "Generic (PLEG): container finished" podID="69841b17-abb2-40af-9923-d61b4818c475" containerID="9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311" exitCode=0
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.871018 4841 generic.go:334] "Generic (PLEG): container finished" podID="69841b17-abb2-40af-9923-d61b4818c475" containerID="98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b" exitCode=2
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.871034 4841 generic.go:334] "Generic (PLEG): container finished" podID="69841b17-abb2-40af-9923-d61b4818c475" containerID="f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4" exitCode=0
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.871048 4841 generic.go:334] "Generic (PLEG): container finished" podID="69841b17-abb2-40af-9923-d61b4818c475" containerID="2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4" exitCode=0
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.871078 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69841b17-abb2-40af-9923-d61b4818c475","Type":"ContainerDied","Data":"9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311"}
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.871091 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.871157 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69841b17-abb2-40af-9923-d61b4818c475","Type":"ContainerDied","Data":"98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b"}
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.871203 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69841b17-abb2-40af-9923-d61b4818c475","Type":"ContainerDied","Data":"f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4"}
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.871213 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69841b17-abb2-40af-9923-d61b4818c475","Type":"ContainerDied","Data":"2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4"}
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.871223 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69841b17-abb2-40af-9923-d61b4818c475","Type":"ContainerDied","Data":"fe4ac005638103613f45ecd2f87d483ba9c8073eb91de4b7ea01f362b13a0de2"}
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.871242 4841 scope.go:117] "RemoveContainer" containerID="9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311"
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.892680 4841 scope.go:117] "RemoveContainer" containerID="98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b"
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.923768 4841 scope.go:117] "RemoveContainer" containerID="f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4"
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.927656 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.942046 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.964336 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 13 09:33:37 crc kubenswrapper[4841]: E0313 09:33:37.965155 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7def500c-116f-47bb-b58d-23e07d7a0771" containerName="nova-cell0-conductor-db-sync"
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.965292 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7def500c-116f-47bb-b58d-23e07d7a0771" containerName="nova-cell0-conductor-db-sync"
Mar 13 09:33:37 crc kubenswrapper[4841]: E0313 09:33:37.965378 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69841b17-abb2-40af-9923-d61b4818c475" containerName="proxy-httpd"
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.965469 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="69841b17-abb2-40af-9923-d61b4818c475" containerName="proxy-httpd"
Mar 13 09:33:37 crc kubenswrapper[4841]: E0313 09:33:37.965580 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69841b17-abb2-40af-9923-d61b4818c475" containerName="sg-core"
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.965660 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="69841b17-abb2-40af-9923-d61b4818c475" containerName="sg-core"
Mar 13 09:33:37 crc kubenswrapper[4841]: E0313 09:33:37.965741 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69841b17-abb2-40af-9923-d61b4818c475" containerName="ceilometer-central-agent"
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.965821 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="69841b17-abb2-40af-9923-d61b4818c475" containerName="ceilometer-central-agent"
Mar 13 09:33:37 crc kubenswrapper[4841]: E0313 09:33:37.965909 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69841b17-abb2-40af-9923-d61b4818c475" containerName="ceilometer-notification-agent"
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.965981 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="69841b17-abb2-40af-9923-d61b4818c475" containerName="ceilometer-notification-agent"
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.966319 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="69841b17-abb2-40af-9923-d61b4818c475" containerName="proxy-httpd"
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.966422 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7def500c-116f-47bb-b58d-23e07d7a0771" containerName="nova-cell0-conductor-db-sync"
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.966505 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="69841b17-abb2-40af-9923-d61b4818c475" containerName="ceilometer-central-agent"
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.966586 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="69841b17-abb2-40af-9923-d61b4818c475" containerName="ceilometer-notification-agent"
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.966681 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="69841b17-abb2-40af-9923-d61b4818c475" containerName="sg-core"
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.968774 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.971692 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.971997 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 13 09:33:37 crc kubenswrapper[4841]: I0313 09:33:37.981737 4841 scope.go:117] "RemoveContainer" containerID="2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4"
Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.015495 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69841b17-abb2-40af-9923-d61b4818c475" path="/var/lib/kubelet/pods/69841b17-abb2-40af-9923-d61b4818c475/volumes"
Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.016213 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.022158 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.023493 4841 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.025687 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2zgvk" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.026088 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.044103 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.056045 4841 scope.go:117] "RemoveContainer" containerID="9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311" Mar 13 09:33:38 crc kubenswrapper[4841]: E0313 09:33:38.056573 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311\": container with ID starting with 9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311 not found: ID does not exist" containerID="9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.056604 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311"} err="failed to get container status \"9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311\": rpc error: code = NotFound desc = could not find container \"9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311\": container with ID starting with 9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311 not found: ID does not exist" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.056624 4841 scope.go:117] "RemoveContainer" containerID="98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b" Mar 13 
09:33:38 crc kubenswrapper[4841]: E0313 09:33:38.056952 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b\": container with ID starting with 98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b not found: ID does not exist" containerID="98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.057060 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b"} err="failed to get container status \"98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b\": rpc error: code = NotFound desc = could not find container \"98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b\": container with ID starting with 98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b not found: ID does not exist" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.057178 4841 scope.go:117] "RemoveContainer" containerID="f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4" Mar 13 09:33:38 crc kubenswrapper[4841]: E0313 09:33:38.057530 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4\": container with ID starting with f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4 not found: ID does not exist" containerID="f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.057560 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4"} err="failed to get container status 
\"f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4\": rpc error: code = NotFound desc = could not find container \"f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4\": container with ID starting with f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4 not found: ID does not exist" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.057578 4841 scope.go:117] "RemoveContainer" containerID="2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4" Mar 13 09:33:38 crc kubenswrapper[4841]: E0313 09:33:38.059239 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4\": container with ID starting with 2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4 not found: ID does not exist" containerID="2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.059360 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4"} err="failed to get container status \"2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4\": rpc error: code = NotFound desc = could not find container \"2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4\": container with ID starting with 2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4 not found: ID does not exist" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.059437 4841 scope.go:117] "RemoveContainer" containerID="9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.059986 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311"} err="failed to get 
container status \"9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311\": rpc error: code = NotFound desc = could not find container \"9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311\": container with ID starting with 9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311 not found: ID does not exist" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.060121 4841 scope.go:117] "RemoveContainer" containerID="98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.060455 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b"} err="failed to get container status \"98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b\": rpc error: code = NotFound desc = could not find container \"98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b\": container with ID starting with 98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b not found: ID does not exist" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.060542 4841 scope.go:117] "RemoveContainer" containerID="f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.060913 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4"} err="failed to get container status \"f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4\": rpc error: code = NotFound desc = could not find container \"f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4\": container with ID starting with f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4 not found: ID does not exist" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.061007 4841 scope.go:117] "RemoveContainer" 
containerID="2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.061422 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4"} err="failed to get container status \"2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4\": rpc error: code = NotFound desc = could not find container \"2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4\": container with ID starting with 2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4 not found: ID does not exist" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.061478 4841 scope.go:117] "RemoveContainer" containerID="9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.062057 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311"} err="failed to get container status \"9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311\": rpc error: code = NotFound desc = could not find container \"9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311\": container with ID starting with 9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311 not found: ID does not exist" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.062085 4841 scope.go:117] "RemoveContainer" containerID="98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.062333 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b"} err="failed to get container status \"98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b\": rpc error: code = NotFound desc = could 
not find container \"98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b\": container with ID starting with 98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b not found: ID does not exist" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.062349 4841 scope.go:117] "RemoveContainer" containerID="f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.062612 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4"} err="failed to get container status \"f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4\": rpc error: code = NotFound desc = could not find container \"f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4\": container with ID starting with f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4 not found: ID does not exist" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.062723 4841 scope.go:117] "RemoveContainer" containerID="2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.063013 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4"} err="failed to get container status \"2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4\": rpc error: code = NotFound desc = could not find container \"2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4\": container with ID starting with 2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4 not found: ID does not exist" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.063114 4841 scope.go:117] "RemoveContainer" containerID="9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 
09:33:38.063437 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311"} err="failed to get container status \"9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311\": rpc error: code = NotFound desc = could not find container \"9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311\": container with ID starting with 9cab404dc5a4c1df85c124e2e533b5b3a925906bacf9e898e08c28af7d8af311 not found: ID does not exist" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.063563 4841 scope.go:117] "RemoveContainer" containerID="98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.063878 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b"} err="failed to get container status \"98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b\": rpc error: code = NotFound desc = could not find container \"98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b\": container with ID starting with 98745a13fc1eab0553f1402f4392652d770e1f4a10e56cab3f76bfbb96f4d74b not found: ID does not exist" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.064007 4841 scope.go:117] "RemoveContainer" containerID="f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.064358 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4"} err="failed to get container status \"f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4\": rpc error: code = NotFound desc = could not find container \"f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4\": container with ID starting with 
f95cc289b8527f72d79018191539dfa333c6805ca527050c845a0b0123031cc4 not found: ID does not exist" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.064384 4841 scope.go:117] "RemoveContainer" containerID="2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.064620 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4"} err="failed to get container status \"2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4\": rpc error: code = NotFound desc = could not find container \"2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4\": container with ID starting with 2b100d73697c0fb4e53b8869802ca2cf92002e17498b84357f9aa0c26fb4b6f4 not found: ID does not exist" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.102802 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb595f57-6b23-4ac9-b25a-3d63159349ad-run-httpd\") pod \"ceilometer-0\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") " pod="openstack/ceilometer-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.102906 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96d3458e-2994-4e4a-97b9-738366b67d8e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"96d3458e-2994-4e4a-97b9-738366b67d8e\") " pod="openstack/nova-cell0-conductor-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.102981 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96d3458e-2994-4e4a-97b9-738366b67d8e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"96d3458e-2994-4e4a-97b9-738366b67d8e\") " 
pod="openstack/nova-cell0-conductor-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.103014 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") " pod="openstack/ceilometer-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.103033 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6pj8\" (UniqueName: \"kubernetes.io/projected/fb595f57-6b23-4ac9-b25a-3d63159349ad-kube-api-access-b6pj8\") pod \"ceilometer-0\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") " pod="openstack/ceilometer-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.103064 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-config-data\") pod \"ceilometer-0\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") " pod="openstack/ceilometer-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.103112 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-scripts\") pod \"ceilometer-0\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") " pod="openstack/ceilometer-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.103140 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwr96\" (UniqueName: \"kubernetes.io/projected/96d3458e-2994-4e4a-97b9-738366b67d8e-kube-api-access-lwr96\") pod \"nova-cell0-conductor-0\" (UID: \"96d3458e-2994-4e4a-97b9-738366b67d8e\") " pod="openstack/nova-cell0-conductor-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 
09:33:38.103219 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") " pod="openstack/ceilometer-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.103256 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb595f57-6b23-4ac9-b25a-3d63159349ad-log-httpd\") pod \"ceilometer-0\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") " pod="openstack/ceilometer-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.204931 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") " pod="openstack/ceilometer-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.204988 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb595f57-6b23-4ac9-b25a-3d63159349ad-log-httpd\") pod \"ceilometer-0\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") " pod="openstack/ceilometer-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.205019 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb595f57-6b23-4ac9-b25a-3d63159349ad-run-httpd\") pod \"ceilometer-0\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") " pod="openstack/ceilometer-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.205038 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/96d3458e-2994-4e4a-97b9-738366b67d8e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"96d3458e-2994-4e4a-97b9-738366b67d8e\") " pod="openstack/nova-cell0-conductor-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.205072 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96d3458e-2994-4e4a-97b9-738366b67d8e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"96d3458e-2994-4e4a-97b9-738366b67d8e\") " pod="openstack/nova-cell0-conductor-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.205097 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6pj8\" (UniqueName: \"kubernetes.io/projected/fb595f57-6b23-4ac9-b25a-3d63159349ad-kube-api-access-b6pj8\") pod \"ceilometer-0\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") " pod="openstack/ceilometer-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.205112 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") " pod="openstack/ceilometer-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.205134 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-config-data\") pod \"ceilometer-0\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") " pod="openstack/ceilometer-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.205166 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-scripts\") pod \"ceilometer-0\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") " pod="openstack/ceilometer-0" Mar 13 
09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.205190 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwr96\" (UniqueName: \"kubernetes.io/projected/96d3458e-2994-4e4a-97b9-738366b67d8e-kube-api-access-lwr96\") pod \"nova-cell0-conductor-0\" (UID: \"96d3458e-2994-4e4a-97b9-738366b67d8e\") " pod="openstack/nova-cell0-conductor-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.206565 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb595f57-6b23-4ac9-b25a-3d63159349ad-log-httpd\") pod \"ceilometer-0\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") " pod="openstack/ceilometer-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.206702 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb595f57-6b23-4ac9-b25a-3d63159349ad-run-httpd\") pod \"ceilometer-0\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") " pod="openstack/ceilometer-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.211003 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-scripts\") pod \"ceilometer-0\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") " pod="openstack/ceilometer-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.211118 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96d3458e-2994-4e4a-97b9-738366b67d8e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"96d3458e-2994-4e4a-97b9-738366b67d8e\") " pod="openstack/nova-cell0-conductor-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.211438 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") " pod="openstack/ceilometer-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.213033 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-config-data\") pod \"ceilometer-0\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") " pod="openstack/ceilometer-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.215798 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96d3458e-2994-4e4a-97b9-738366b67d8e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"96d3458e-2994-4e4a-97b9-738366b67d8e\") " pod="openstack/nova-cell0-conductor-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.220870 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") " pod="openstack/ceilometer-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.221768 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6pj8\" (UniqueName: \"kubernetes.io/projected/fb595f57-6b23-4ac9-b25a-3d63159349ad-kube-api-access-b6pj8\") pod \"ceilometer-0\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") " pod="openstack/ceilometer-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.231988 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwr96\" (UniqueName: \"kubernetes.io/projected/96d3458e-2994-4e4a-97b9-738366b67d8e-kube-api-access-lwr96\") pod \"nova-cell0-conductor-0\" (UID: \"96d3458e-2994-4e4a-97b9-738366b67d8e\") " pod="openstack/nova-cell0-conductor-0" Mar 
13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.364131 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.372473 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 09:33:38 crc kubenswrapper[4841]: W0313 09:33:38.826669 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb595f57_6b23_4ac9_b25a_3d63159349ad.slice/crio-74f52791398ce0a76bb374f4247b414c40dd8fe699543141c3002bdfe23920ac WatchSource:0}: Error finding container 74f52791398ce0a76bb374f4247b414c40dd8fe699543141c3002bdfe23920ac: Status 404 returned error can't find the container with id 74f52791398ce0a76bb374f4247b414c40dd8fe699543141c3002bdfe23920ac Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.831401 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.883233 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb595f57-6b23-4ac9-b25a-3d63159349ad","Type":"ContainerStarted","Data":"74f52791398ce0a76bb374f4247b414c40dd8fe699543141c3002bdfe23920ac"} Mar 13 09:33:38 crc kubenswrapper[4841]: I0313 09:33:38.909620 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 09:33:38 crc kubenswrapper[4841]: W0313 09:33:38.910150 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96d3458e_2994_4e4a_97b9_738366b67d8e.slice/crio-6c923c96628cb0ee7705b85a1aa448107d1280c7dfe4c9b9130ed08f7752434c WatchSource:0}: Error finding container 6c923c96628cb0ee7705b85a1aa448107d1280c7dfe4c9b9130ed08f7752434c: Status 404 returned error can't find the container with id 
6c923c96628cb0ee7705b85a1aa448107d1280c7dfe4c9b9130ed08f7752434c Mar 13 09:33:39 crc kubenswrapper[4841]: I0313 09:33:39.891789 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"96d3458e-2994-4e4a-97b9-738366b67d8e","Type":"ContainerStarted","Data":"b87b84a419f026cba8c8e92ce6b7cc666ddb8409204178c933e49a8707f86d31"} Mar 13 09:33:39 crc kubenswrapper[4841]: I0313 09:33:39.892108 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"96d3458e-2994-4e4a-97b9-738366b67d8e","Type":"ContainerStarted","Data":"6c923c96628cb0ee7705b85a1aa448107d1280c7dfe4c9b9130ed08f7752434c"} Mar 13 09:33:39 crc kubenswrapper[4841]: I0313 09:33:39.892158 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 13 09:33:39 crc kubenswrapper[4841]: I0313 09:33:39.894122 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb595f57-6b23-4ac9-b25a-3d63159349ad","Type":"ContainerStarted","Data":"9bce5e1595158aa284fda9a35039119d9d367974e7f318e860bd9bbc3d94c1b0"} Mar 13 09:33:39 crc kubenswrapper[4841]: I0313 09:33:39.914867 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.914849116 podStartE2EDuration="2.914849116s" podCreationTimestamp="2026-03-13 09:33:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:33:39.909163888 +0000 UTC m=+1302.639064089" watchObservedRunningTime="2026-03-13 09:33:39.914849116 +0000 UTC m=+1302.644749307" Mar 13 09:33:40 crc kubenswrapper[4841]: I0313 09:33:40.907727 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fb595f57-6b23-4ac9-b25a-3d63159349ad","Type":"ContainerStarted","Data":"928d5bedbb474d3290d23c97d2a184507b4b71f55edf64936973c86202a9d5a8"} Mar 13 09:33:40 crc kubenswrapper[4841]: I0313 09:33:40.908335 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb595f57-6b23-4ac9-b25a-3d63159349ad","Type":"ContainerStarted","Data":"32cb874977afd54dfe0c446521e716f3e45a3cd182c579fdb9d7ed0696e369f6"} Mar 13 09:33:42 crc kubenswrapper[4841]: I0313 09:33:42.930833 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb595f57-6b23-4ac9-b25a-3d63159349ad","Type":"ContainerStarted","Data":"e9ec10c9fb528c6843eca05c975eccb635bba35d0ea2b36da5197939ca9d0f98"} Mar 13 09:33:42 crc kubenswrapper[4841]: I0313 09:33:42.931533 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 09:33:48 crc kubenswrapper[4841]: I0313 09:33:48.424286 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 13 09:33:48 crc kubenswrapper[4841]: I0313 09:33:48.455929 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=7.947825497 podStartE2EDuration="11.455898841s" podCreationTimestamp="2026-03-13 09:33:37 +0000 UTC" firstStartedPulling="2026-03-13 09:33:38.829053328 +0000 UTC m=+1301.558953519" lastFinishedPulling="2026-03-13 09:33:42.337126672 +0000 UTC m=+1305.067026863" observedRunningTime="2026-03-13 09:33:42.962119452 +0000 UTC m=+1305.692019673" watchObservedRunningTime="2026-03-13 09:33:48.455898841 +0000 UTC m=+1311.185799022" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.035626 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-rc4xv"] Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.037049 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rc4xv" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.039312 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.041188 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.044489 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-rc4xv"] Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.060987 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b05d61d-3de0-4314-b5f3-07a447bc3465-config-data\") pod \"nova-cell0-cell-mapping-rc4xv\" (UID: \"1b05d61d-3de0-4314-b5f3-07a447bc3465\") " pod="openstack/nova-cell0-cell-mapping-rc4xv" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.061041 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b05d61d-3de0-4314-b5f3-07a447bc3465-scripts\") pod \"nova-cell0-cell-mapping-rc4xv\" (UID: \"1b05d61d-3de0-4314-b5f3-07a447bc3465\") " pod="openstack/nova-cell0-cell-mapping-rc4xv" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.061086 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b05d61d-3de0-4314-b5f3-07a447bc3465-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rc4xv\" (UID: \"1b05d61d-3de0-4314-b5f3-07a447bc3465\") " pod="openstack/nova-cell0-cell-mapping-rc4xv" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.061129 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n8f2\" (UniqueName: 
\"kubernetes.io/projected/1b05d61d-3de0-4314-b5f3-07a447bc3465-kube-api-access-7n8f2\") pod \"nova-cell0-cell-mapping-rc4xv\" (UID: \"1b05d61d-3de0-4314-b5f3-07a447bc3465\") " pod="openstack/nova-cell0-cell-mapping-rc4xv" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.163576 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b05d61d-3de0-4314-b5f3-07a447bc3465-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rc4xv\" (UID: \"1b05d61d-3de0-4314-b5f3-07a447bc3465\") " pod="openstack/nova-cell0-cell-mapping-rc4xv" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.163764 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n8f2\" (UniqueName: \"kubernetes.io/projected/1b05d61d-3de0-4314-b5f3-07a447bc3465-kube-api-access-7n8f2\") pod \"nova-cell0-cell-mapping-rc4xv\" (UID: \"1b05d61d-3de0-4314-b5f3-07a447bc3465\") " pod="openstack/nova-cell0-cell-mapping-rc4xv" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.164104 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b05d61d-3de0-4314-b5f3-07a447bc3465-config-data\") pod \"nova-cell0-cell-mapping-rc4xv\" (UID: \"1b05d61d-3de0-4314-b5f3-07a447bc3465\") " pod="openstack/nova-cell0-cell-mapping-rc4xv" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.164220 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b05d61d-3de0-4314-b5f3-07a447bc3465-scripts\") pod \"nova-cell0-cell-mapping-rc4xv\" (UID: \"1b05d61d-3de0-4314-b5f3-07a447bc3465\") " pod="openstack/nova-cell0-cell-mapping-rc4xv" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.169302 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1b05d61d-3de0-4314-b5f3-07a447bc3465-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rc4xv\" (UID: \"1b05d61d-3de0-4314-b5f3-07a447bc3465\") " pod="openstack/nova-cell0-cell-mapping-rc4xv" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.189829 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.190978 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.194557 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b05d61d-3de0-4314-b5f3-07a447bc3465-config-data\") pod \"nova-cell0-cell-mapping-rc4xv\" (UID: \"1b05d61d-3de0-4314-b5f3-07a447bc3465\") " pod="openstack/nova-cell0-cell-mapping-rc4xv" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.194663 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b05d61d-3de0-4314-b5f3-07a447bc3465-scripts\") pod \"nova-cell0-cell-mapping-rc4xv\" (UID: \"1b05d61d-3de0-4314-b5f3-07a447bc3465\") " pod="openstack/nova-cell0-cell-mapping-rc4xv" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.194869 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.200052 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n8f2\" (UniqueName: \"kubernetes.io/projected/1b05d61d-3de0-4314-b5f3-07a447bc3465-kube-api-access-7n8f2\") pod \"nova-cell0-cell-mapping-rc4xv\" (UID: \"1b05d61d-3de0-4314-b5f3-07a447bc3465\") " pod="openstack/nova-cell0-cell-mapping-rc4xv" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.231748 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] 
Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.266752 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/867ddec1-6a02-4749-8cf0-259eff17fbd5-config-data\") pod \"nova-scheduler-0\" (UID: \"867ddec1-6a02-4749-8cf0-259eff17fbd5\") " pod="openstack/nova-scheduler-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.266801 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867ddec1-6a02-4749-8cf0-259eff17fbd5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"867ddec1-6a02-4749-8cf0-259eff17fbd5\") " pod="openstack/nova-scheduler-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.266904 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlf4t\" (UniqueName: \"kubernetes.io/projected/867ddec1-6a02-4749-8cf0-259eff17fbd5-kube-api-access-nlf4t\") pod \"nova-scheduler-0\" (UID: \"867ddec1-6a02-4749-8cf0-259eff17fbd5\") " pod="openstack/nova-scheduler-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.295954 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.297347 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.302438 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.321125 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.357491 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.359050 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.362725 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rc4xv" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.365694 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.367677 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2wlk\" (UniqueName: \"kubernetes.io/projected/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-kube-api-access-p2wlk\") pod \"nova-api-0\" (UID: \"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6\") " pod="openstack/nova-api-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.367726 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-logs\") pod \"nova-api-0\" (UID: \"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6\") " pod="openstack/nova-api-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.367782 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.367850 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlf4t\" (UniqueName: \"kubernetes.io/projected/867ddec1-6a02-4749-8cf0-259eff17fbd5-kube-api-access-nlf4t\") pod \"nova-scheduler-0\" (UID: \"867ddec1-6a02-4749-8cf0-259eff17fbd5\") " pod="openstack/nova-scheduler-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.367907 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-config-data\") pod \"nova-api-0\" (UID: \"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6\") " pod="openstack/nova-api-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.367944 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.367978 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/867ddec1-6a02-4749-8cf0-259eff17fbd5-config-data\") pod \"nova-scheduler-0\" (UID: \"867ddec1-6a02-4749-8cf0-259eff17fbd5\") " pod="openstack/nova-scheduler-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.368005 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867ddec1-6a02-4749-8cf0-259eff17fbd5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"867ddec1-6a02-4749-8cf0-259eff17fbd5\") " pod="openstack/nova-scheduler-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.368026 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6\") " pod="openstack/nova-api-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.368043 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkgsf\" (UniqueName: \"kubernetes.io/projected/d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5-kube-api-access-bkgsf\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.372121 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867ddec1-6a02-4749-8cf0-259eff17fbd5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"867ddec1-6a02-4749-8cf0-259eff17fbd5\") " pod="openstack/nova-scheduler-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.382134 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.386828 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/867ddec1-6a02-4749-8cf0-259eff17fbd5-config-data\") pod \"nova-scheduler-0\" (UID: \"867ddec1-6a02-4749-8cf0-259eff17fbd5\") " pod="openstack/nova-scheduler-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.435068 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlf4t\" (UniqueName: \"kubernetes.io/projected/867ddec1-6a02-4749-8cf0-259eff17fbd5-kube-api-access-nlf4t\") 
pod \"nova-scheduler-0\" (UID: \"867ddec1-6a02-4749-8cf0-259eff17fbd5\") " pod="openstack/nova-scheduler-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.456287 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.462584 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.469588 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6\") " pod="openstack/nova-api-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.469631 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkgsf\" (UniqueName: \"kubernetes.io/projected/d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5-kube-api-access-bkgsf\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.469658 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2wlk\" (UniqueName: \"kubernetes.io/projected/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-kube-api-access-p2wlk\") pod \"nova-api-0\" (UID: \"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6\") " pod="openstack/nova-api-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.469689 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67aede02-5199-4ec9-82a1-e6af129221e8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"67aede02-5199-4ec9-82a1-e6af129221e8\") " pod="openstack/nova-metadata-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.469741 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-logs\") pod \"nova-api-0\" (UID: \"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6\") " pod="openstack/nova-api-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.469765 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67aede02-5199-4ec9-82a1-e6af129221e8-config-data\") pod \"nova-metadata-0\" (UID: \"67aede02-5199-4ec9-82a1-e6af129221e8\") " pod="openstack/nova-metadata-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.469812 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.469861 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67aede02-5199-4ec9-82a1-e6af129221e8-logs\") pod \"nova-metadata-0\" (UID: \"67aede02-5199-4ec9-82a1-e6af129221e8\") " pod="openstack/nova-metadata-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.469923 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v678f\" (UniqueName: \"kubernetes.io/projected/67aede02-5199-4ec9-82a1-e6af129221e8-kube-api-access-v678f\") pod \"nova-metadata-0\" (UID: \"67aede02-5199-4ec9-82a1-e6af129221e8\") " pod="openstack/nova-metadata-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.469965 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-config-data\") pod \"nova-api-0\" (UID: \"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6\") " pod="openstack/nova-api-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.469995 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.471601 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-logs\") pod \"nova-api-0\" (UID: \"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6\") " pod="openstack/nova-api-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.483079 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.486772 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.493606 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6\") " pod="openstack/nova-api-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.494258 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.497693 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-config-data\") pod \"nova-api-0\" (UID: \"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6\") " pod="openstack/nova-api-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.511304 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2wlk\" (UniqueName: \"kubernetes.io/projected/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-kube-api-access-p2wlk\") pod \"nova-api-0\" (UID: \"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6\") " pod="openstack/nova-api-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.514766 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkgsf\" (UniqueName: \"kubernetes.io/projected/d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5-kube-api-access-bkgsf\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.572643 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67aede02-5199-4ec9-82a1-e6af129221e8-logs\") pod \"nova-metadata-0\" (UID: \"67aede02-5199-4ec9-82a1-e6af129221e8\") " pod="openstack/nova-metadata-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.572708 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v678f\" (UniqueName: \"kubernetes.io/projected/67aede02-5199-4ec9-82a1-e6af129221e8-kube-api-access-v678f\") pod \"nova-metadata-0\" (UID: \"67aede02-5199-4ec9-82a1-e6af129221e8\") " pod="openstack/nova-metadata-0" 
Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.572797 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67aede02-5199-4ec9-82a1-e6af129221e8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"67aede02-5199-4ec9-82a1-e6af129221e8\") " pod="openstack/nova-metadata-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.572832 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67aede02-5199-4ec9-82a1-e6af129221e8-config-data\") pod \"nova-metadata-0\" (UID: \"67aede02-5199-4ec9-82a1-e6af129221e8\") " pod="openstack/nova-metadata-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.576661 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.586855 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.596508 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67aede02-5199-4ec9-82a1-e6af129221e8-logs\") pod \"nova-metadata-0\" (UID: \"67aede02-5199-4ec9-82a1-e6af129221e8\") " pod="openstack/nova-metadata-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.597189 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67aede02-5199-4ec9-82a1-e6af129221e8-config-data\") pod \"nova-metadata-0\" (UID: \"67aede02-5199-4ec9-82a1-e6af129221e8\") " pod="openstack/nova-metadata-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.599081 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67aede02-5199-4ec9-82a1-e6af129221e8-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"67aede02-5199-4ec9-82a1-e6af129221e8\") " pod="openstack/nova-metadata-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.625283 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.656146 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-zjcgr"] Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.666479 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v678f\" (UniqueName: \"kubernetes.io/projected/67aede02-5199-4ec9-82a1-e6af129221e8-kube-api-access-v678f\") pod \"nova-metadata-0\" (UID: \"67aede02-5199-4ec9-82a1-e6af129221e8\") " pod="openstack/nova-metadata-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.705991 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.787108 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-zjcgr"] Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.822980 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.898088 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.903591 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-config\") pod \"dnsmasq-dns-9b86998b5-zjcgr\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.903680 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-zjcgr\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.903715 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-zjcgr\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.903732 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qq8f\" (UniqueName: \"kubernetes.io/projected/670ff361-affa-44f7-b872-303ba17bb4f4-kube-api-access-2qq8f\") pod \"dnsmasq-dns-9b86998b5-zjcgr\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.903751 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-ovsdbserver-sb\") pod 
\"dnsmasq-dns-9b86998b5-zjcgr\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:33:49 crc kubenswrapper[4841]: I0313 09:33:49.903801 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-dns-svc\") pod \"dnsmasq-dns-9b86998b5-zjcgr\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.004987 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-zjcgr\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.005053 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qq8f\" (UniqueName: \"kubernetes.io/projected/670ff361-affa-44f7-b872-303ba17bb4f4-kube-api-access-2qq8f\") pod \"dnsmasq-dns-9b86998b5-zjcgr\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.005079 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-zjcgr\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.005109 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-zjcgr\" 
(UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.006484 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-zjcgr\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.007249 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-dns-svc\") pod \"dnsmasq-dns-9b86998b5-zjcgr\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.007351 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-config\") pod \"dnsmasq-dns-9b86998b5-zjcgr\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.007583 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-zjcgr\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.008322 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-dns-svc\") pod \"dnsmasq-dns-9b86998b5-zjcgr\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:33:50 crc 
kubenswrapper[4841]: I0313 09:33:50.008878 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-zjcgr\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.010700 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-config\") pod \"dnsmasq-dns-9b86998b5-zjcgr\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.029540 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qq8f\" (UniqueName: \"kubernetes.io/projected/670ff361-affa-44f7-b872-303ba17bb4f4-kube-api-access-2qq8f\") pod \"dnsmasq-dns-9b86998b5-zjcgr\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.137793 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.143684 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-rc4xv"] Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.264419 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.371386 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.396819 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.439434 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xrtxl"] Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.440619 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xrtxl" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.445196 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.445408 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.459134 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xrtxl"] Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.539067 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.625318 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-combined-ca-bundle\") 
pod \"nova-cell1-conductor-db-sync-xrtxl\" (UID: \"e9f4bb2d-0844-4e66-8bc3-623168e07b9d\") " pod="openstack/nova-cell1-conductor-db-sync-xrtxl" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.625701 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6ktf\" (UniqueName: \"kubernetes.io/projected/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-kube-api-access-g6ktf\") pod \"nova-cell1-conductor-db-sync-xrtxl\" (UID: \"e9f4bb2d-0844-4e66-8bc3-623168e07b9d\") " pod="openstack/nova-cell1-conductor-db-sync-xrtxl" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.625958 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-config-data\") pod \"nova-cell1-conductor-db-sync-xrtxl\" (UID: \"e9f4bb2d-0844-4e66-8bc3-623168e07b9d\") " pod="openstack/nova-cell1-conductor-db-sync-xrtxl" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.626001 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-scripts\") pod \"nova-cell1-conductor-db-sync-xrtxl\" (UID: \"e9f4bb2d-0844-4e66-8bc3-623168e07b9d\") " pod="openstack/nova-cell1-conductor-db-sync-xrtxl" Mar 13 09:33:50 crc kubenswrapper[4841]: W0313 09:33:50.641056 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod670ff361_affa_44f7_b872_303ba17bb4f4.slice/crio-b2414db49ba4594e6e684557f078ce6a7a1c6e380aa4654491ecaf54d454a6f7 WatchSource:0}: Error finding container b2414db49ba4594e6e684557f078ce6a7a1c6e380aa4654491ecaf54d454a6f7: Status 404 returned error can't find the container with id b2414db49ba4594e6e684557f078ce6a7a1c6e380aa4654491ecaf54d454a6f7 Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.646242 
4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-zjcgr"] Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.728143 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xrtxl\" (UID: \"e9f4bb2d-0844-4e66-8bc3-623168e07b9d\") " pod="openstack/nova-cell1-conductor-db-sync-xrtxl" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.728207 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6ktf\" (UniqueName: \"kubernetes.io/projected/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-kube-api-access-g6ktf\") pod \"nova-cell1-conductor-db-sync-xrtxl\" (UID: \"e9f4bb2d-0844-4e66-8bc3-623168e07b9d\") " pod="openstack/nova-cell1-conductor-db-sync-xrtxl" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.728344 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-config-data\") pod \"nova-cell1-conductor-db-sync-xrtxl\" (UID: \"e9f4bb2d-0844-4e66-8bc3-623168e07b9d\") " pod="openstack/nova-cell1-conductor-db-sync-xrtxl" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.728375 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-scripts\") pod \"nova-cell1-conductor-db-sync-xrtxl\" (UID: \"e9f4bb2d-0844-4e66-8bc3-623168e07b9d\") " pod="openstack/nova-cell1-conductor-db-sync-xrtxl" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.734827 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xrtxl\" (UID: 
\"e9f4bb2d-0844-4e66-8bc3-623168e07b9d\") " pod="openstack/nova-cell1-conductor-db-sync-xrtxl" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.734845 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-config-data\") pod \"nova-cell1-conductor-db-sync-xrtxl\" (UID: \"e9f4bb2d-0844-4e66-8bc3-623168e07b9d\") " pod="openstack/nova-cell1-conductor-db-sync-xrtxl" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.735605 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-scripts\") pod \"nova-cell1-conductor-db-sync-xrtxl\" (UID: \"e9f4bb2d-0844-4e66-8bc3-623168e07b9d\") " pod="openstack/nova-cell1-conductor-db-sync-xrtxl" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.753887 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6ktf\" (UniqueName: \"kubernetes.io/projected/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-kube-api-access-g6ktf\") pod \"nova-cell1-conductor-db-sync-xrtxl\" (UID: \"e9f4bb2d-0844-4e66-8bc3-623168e07b9d\") " pod="openstack/nova-cell1-conductor-db-sync-xrtxl" Mar 13 09:33:50 crc kubenswrapper[4841]: I0313 09:33:50.764834 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xrtxl" Mar 13 09:33:51 crc kubenswrapper[4841]: I0313 09:33:51.026277 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6","Type":"ContainerStarted","Data":"ac496c414a94831c32345ae628dd0af1278b63529b13a9ee62120a32e396e75f"} Mar 13 09:33:51 crc kubenswrapper[4841]: I0313 09:33:51.030531 4841 generic.go:334] "Generic (PLEG): container finished" podID="670ff361-affa-44f7-b872-303ba17bb4f4" containerID="2ade612b8fb47bd7d9268a94de2da473e20ea71789bfe440788b6e77ec7c883c" exitCode=0 Mar 13 09:33:51 crc kubenswrapper[4841]: I0313 09:33:51.030588 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" event={"ID":"670ff361-affa-44f7-b872-303ba17bb4f4","Type":"ContainerDied","Data":"2ade612b8fb47bd7d9268a94de2da473e20ea71789bfe440788b6e77ec7c883c"} Mar 13 09:33:51 crc kubenswrapper[4841]: I0313 09:33:51.030608 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" event={"ID":"670ff361-affa-44f7-b872-303ba17bb4f4","Type":"ContainerStarted","Data":"b2414db49ba4594e6e684557f078ce6a7a1c6e380aa4654491ecaf54d454a6f7"} Mar 13 09:33:51 crc kubenswrapper[4841]: I0313 09:33:51.040848 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rc4xv" event={"ID":"1b05d61d-3de0-4314-b5f3-07a447bc3465","Type":"ContainerStarted","Data":"35f357a42ea488cde71740138e6febbe553c221a7357e73b3001c8c27032231e"} Mar 13 09:33:51 crc kubenswrapper[4841]: I0313 09:33:51.040929 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rc4xv" event={"ID":"1b05d61d-3de0-4314-b5f3-07a447bc3465","Type":"ContainerStarted","Data":"0e0ba135ff1ef5ad1aa737bd5b92357956e3c34207f596c39b9db966ca1867e5"} Mar 13 09:33:51 crc kubenswrapper[4841]: I0313 09:33:51.044107 4841 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-metadata-0" event={"ID":"67aede02-5199-4ec9-82a1-e6af129221e8","Type":"ContainerStarted","Data":"5e217f2a3986678acecd06c44e8ba9f98385fe32bcbaf74ed34f9b69e5ca7414"} Mar 13 09:33:51 crc kubenswrapper[4841]: I0313 09:33:51.045844 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"867ddec1-6a02-4749-8cf0-259eff17fbd5","Type":"ContainerStarted","Data":"df8f769d093c2d3b3078340cd11be2c218b12da6827d73b78940454c0ca0c52d"} Mar 13 09:33:51 crc kubenswrapper[4841]: I0313 09:33:51.055363 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5","Type":"ContainerStarted","Data":"ac1c142759006713d3b3f9f8410508cb5772666398f5948394a1e48fa17e43fb"} Mar 13 09:33:51 crc kubenswrapper[4841]: I0313 09:33:51.104714 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-rc4xv" podStartSLOduration=2.104689534 podStartE2EDuration="2.104689534s" podCreationTimestamp="2026-03-13 09:33:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:33:51.065608485 +0000 UTC m=+1313.795508686" watchObservedRunningTime="2026-03-13 09:33:51.104689534 +0000 UTC m=+1313.834589725" Mar 13 09:33:51 crc kubenswrapper[4841]: I0313 09:33:51.289848 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xrtxl"] Mar 13 09:33:52 crc kubenswrapper[4841]: I0313 09:33:52.071826 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" event={"ID":"670ff361-affa-44f7-b872-303ba17bb4f4","Type":"ContainerStarted","Data":"0bbf134e0b93cea1efe344431835ec6357d0c9c7cfe326ef5858d3182d6c239f"} Mar 13 09:33:52 crc kubenswrapper[4841]: I0313 09:33:52.072187 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:33:52 crc kubenswrapper[4841]: I0313 09:33:52.088347 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xrtxl" event={"ID":"e9f4bb2d-0844-4e66-8bc3-623168e07b9d","Type":"ContainerStarted","Data":"fa082b44c04a30b03f11db32a7fe268cae8b649f8398efae03159a38e74fbbc2"} Mar 13 09:33:52 crc kubenswrapper[4841]: I0313 09:33:52.088413 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xrtxl" event={"ID":"e9f4bb2d-0844-4e66-8bc3-623168e07b9d","Type":"ContainerStarted","Data":"852c8a72fdeb31e758d79426438b175342827c9382bed4e7ad41fe51589f97a4"} Mar 13 09:33:52 crc kubenswrapper[4841]: I0313 09:33:52.103638 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" podStartSLOduration=3.103619001 podStartE2EDuration="3.103619001s" podCreationTimestamp="2026-03-13 09:33:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:33:52.093800465 +0000 UTC m=+1314.823700676" watchObservedRunningTime="2026-03-13 09:33:52.103619001 +0000 UTC m=+1314.833519192" Mar 13 09:33:52 crc kubenswrapper[4841]: I0313 09:33:52.115416 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-xrtxl" podStartSLOduration=2.115400199 podStartE2EDuration="2.115400199s" podCreationTimestamp="2026-03-13 09:33:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:33:52.105113938 +0000 UTC m=+1314.835014129" watchObservedRunningTime="2026-03-13 09:33:52.115400199 +0000 UTC m=+1314.845300390" Mar 13 09:33:53 crc kubenswrapper[4841]: I0313 09:33:53.006904 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 
09:33:53 crc kubenswrapper[4841]: I0313 09:33:53.034472 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 09:33:54 crc kubenswrapper[4841]: I0313 09:33:54.126040 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6","Type":"ContainerStarted","Data":"2a8b0283ff3e9725aef806b4c4fc6c97d933e5c4784cc189b6f3711683df2547"} Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.140894 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"867ddec1-6a02-4749-8cf0-259eff17fbd5","Type":"ContainerStarted","Data":"9f996c7530cc13eb53c38dc0da8ea520c6125b627b48a211c6a5151a8d2123e8"} Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.143036 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5","Type":"ContainerStarted","Data":"9de0d5f74df1552a5305df383535bca63ffd61287636b5abf7bbc907ef6f4ff4"} Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.143160 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9de0d5f74df1552a5305df383535bca63ffd61287636b5abf7bbc907ef6f4ff4" gracePeriod=30 Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.146900 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6","Type":"ContainerStarted","Data":"f705fdde438a54c3dacdf66858c95bdc1688755e39f1bf22626f1c3e8fce7515"} Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.150370 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"67aede02-5199-4ec9-82a1-e6af129221e8","Type":"ContainerStarted","Data":"8c6c924ce9dc6b28d008df6a7182813c4d0857c708000d2b5b355f99bbf443ef"} Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.150611 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"67aede02-5199-4ec9-82a1-e6af129221e8","Type":"ContainerStarted","Data":"62352a48da89ccc6cfd8c19cb5cf8a889c5db51bdd9feb57940f59c69b9679ad"} Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.150652 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="67aede02-5199-4ec9-82a1-e6af129221e8" containerName="nova-metadata-log" containerID="cri-o://8c6c924ce9dc6b28d008df6a7182813c4d0857c708000d2b5b355f99bbf443ef" gracePeriod=30 Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.150853 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="67aede02-5199-4ec9-82a1-e6af129221e8" containerName="nova-metadata-metadata" containerID="cri-o://62352a48da89ccc6cfd8c19cb5cf8a889c5db51bdd9feb57940f59c69b9679ad" gracePeriod=30 Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.169631 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.7086207780000002 podStartE2EDuration="6.169610692s" podCreationTimestamp="2026-03-13 09:33:49 +0000 UTC" firstStartedPulling="2026-03-13 09:33:50.270731025 +0000 UTC m=+1313.000631216" lastFinishedPulling="2026-03-13 09:33:53.731720939 +0000 UTC m=+1316.461621130" observedRunningTime="2026-03-13 09:33:55.15610808 +0000 UTC m=+1317.886008311" watchObservedRunningTime="2026-03-13 09:33:55.169610692 +0000 UTC m=+1317.899510893" Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.187219 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.835742094 
podStartE2EDuration="6.187202311s" podCreationTimestamp="2026-03-13 09:33:49 +0000 UTC" firstStartedPulling="2026-03-13 09:33:50.380454288 +0000 UTC m=+1313.110354479" lastFinishedPulling="2026-03-13 09:33:53.731914505 +0000 UTC m=+1316.461814696" observedRunningTime="2026-03-13 09:33:55.174846175 +0000 UTC m=+1317.904746366" watchObservedRunningTime="2026-03-13 09:33:55.187202311 +0000 UTC m=+1317.917102502" Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.212764 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.024663909 podStartE2EDuration="6.212743308s" podCreationTimestamp="2026-03-13 09:33:49 +0000 UTC" firstStartedPulling="2026-03-13 09:33:50.544069753 +0000 UTC m=+1313.273969944" lastFinishedPulling="2026-03-13 09:33:53.732149152 +0000 UTC m=+1316.462049343" observedRunningTime="2026-03-13 09:33:55.205883743 +0000 UTC m=+1317.935783974" watchObservedRunningTime="2026-03-13 09:33:55.212743308 +0000 UTC m=+1317.942643499" Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.226122 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.8681343740000003 podStartE2EDuration="6.226103214s" podCreationTimestamp="2026-03-13 09:33:49 +0000 UTC" firstStartedPulling="2026-03-13 09:33:50.379613752 +0000 UTC m=+1313.109513943" lastFinishedPulling="2026-03-13 09:33:53.737582582 +0000 UTC m=+1316.467482783" observedRunningTime="2026-03-13 09:33:55.224776122 +0000 UTC m=+1317.954676313" watchObservedRunningTime="2026-03-13 09:33:55.226103214 +0000 UTC m=+1317.956003405" Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.726521 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.831678 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v678f\" (UniqueName: \"kubernetes.io/projected/67aede02-5199-4ec9-82a1-e6af129221e8-kube-api-access-v678f\") pod \"67aede02-5199-4ec9-82a1-e6af129221e8\" (UID: \"67aede02-5199-4ec9-82a1-e6af129221e8\") " Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.831979 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67aede02-5199-4ec9-82a1-e6af129221e8-config-data\") pod \"67aede02-5199-4ec9-82a1-e6af129221e8\" (UID: \"67aede02-5199-4ec9-82a1-e6af129221e8\") " Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.832232 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67aede02-5199-4ec9-82a1-e6af129221e8-combined-ca-bundle\") pod \"67aede02-5199-4ec9-82a1-e6af129221e8\" (UID: \"67aede02-5199-4ec9-82a1-e6af129221e8\") " Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.832393 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67aede02-5199-4ec9-82a1-e6af129221e8-logs\") pod \"67aede02-5199-4ec9-82a1-e6af129221e8\" (UID: \"67aede02-5199-4ec9-82a1-e6af129221e8\") " Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.832804 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67aede02-5199-4ec9-82a1-e6af129221e8-logs" (OuterVolumeSpecName: "logs") pod "67aede02-5199-4ec9-82a1-e6af129221e8" (UID: "67aede02-5199-4ec9-82a1-e6af129221e8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.833006 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67aede02-5199-4ec9-82a1-e6af129221e8-logs\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.842955 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67aede02-5199-4ec9-82a1-e6af129221e8-kube-api-access-v678f" (OuterVolumeSpecName: "kube-api-access-v678f") pod "67aede02-5199-4ec9-82a1-e6af129221e8" (UID: "67aede02-5199-4ec9-82a1-e6af129221e8"). InnerVolumeSpecName "kube-api-access-v678f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.891709 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67aede02-5199-4ec9-82a1-e6af129221e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67aede02-5199-4ec9-82a1-e6af129221e8" (UID: "67aede02-5199-4ec9-82a1-e6af129221e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.909998 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67aede02-5199-4ec9-82a1-e6af129221e8-config-data" (OuterVolumeSpecName: "config-data") pod "67aede02-5199-4ec9-82a1-e6af129221e8" (UID: "67aede02-5199-4ec9-82a1-e6af129221e8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.935521 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67aede02-5199-4ec9-82a1-e6af129221e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.935582 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v678f\" (UniqueName: \"kubernetes.io/projected/67aede02-5199-4ec9-82a1-e6af129221e8-kube-api-access-v678f\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:55 crc kubenswrapper[4841]: I0313 09:33:55.935598 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67aede02-5199-4ec9-82a1-e6af129221e8-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.163782 4841 generic.go:334] "Generic (PLEG): container finished" podID="67aede02-5199-4ec9-82a1-e6af129221e8" containerID="62352a48da89ccc6cfd8c19cb5cf8a889c5db51bdd9feb57940f59c69b9679ad" exitCode=0 Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.163823 4841 generic.go:334] "Generic (PLEG): container finished" podID="67aede02-5199-4ec9-82a1-e6af129221e8" containerID="8c6c924ce9dc6b28d008df6a7182813c4d0857c708000d2b5b355f99bbf443ef" exitCode=143 Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.165021 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.165598 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"67aede02-5199-4ec9-82a1-e6af129221e8","Type":"ContainerDied","Data":"62352a48da89ccc6cfd8c19cb5cf8a889c5db51bdd9feb57940f59c69b9679ad"} Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.165632 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"67aede02-5199-4ec9-82a1-e6af129221e8","Type":"ContainerDied","Data":"8c6c924ce9dc6b28d008df6a7182813c4d0857c708000d2b5b355f99bbf443ef"} Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.165646 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"67aede02-5199-4ec9-82a1-e6af129221e8","Type":"ContainerDied","Data":"5e217f2a3986678acecd06c44e8ba9f98385fe32bcbaf74ed34f9b69e5ca7414"} Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.165664 4841 scope.go:117] "RemoveContainer" containerID="62352a48da89ccc6cfd8c19cb5cf8a889c5db51bdd9feb57940f59c69b9679ad" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.215497 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.226474 4841 scope.go:117] "RemoveContainer" containerID="8c6c924ce9dc6b28d008df6a7182813c4d0857c708000d2b5b355f99bbf443ef" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.240415 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.253318 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 09:33:56 crc kubenswrapper[4841]: E0313 09:33:56.253761 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67aede02-5199-4ec9-82a1-e6af129221e8" containerName="nova-metadata-log" Mar 13 09:33:56 crc 
kubenswrapper[4841]: I0313 09:33:56.253781 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="67aede02-5199-4ec9-82a1-e6af129221e8" containerName="nova-metadata-log" Mar 13 09:33:56 crc kubenswrapper[4841]: E0313 09:33:56.253807 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67aede02-5199-4ec9-82a1-e6af129221e8" containerName="nova-metadata-metadata" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.253816 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="67aede02-5199-4ec9-82a1-e6af129221e8" containerName="nova-metadata-metadata" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.254083 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="67aede02-5199-4ec9-82a1-e6af129221e8" containerName="nova-metadata-log" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.254112 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="67aede02-5199-4ec9-82a1-e6af129221e8" containerName="nova-metadata-metadata" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.255235 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.256876 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.257853 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.258667 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.267097 4841 scope.go:117] "RemoveContainer" containerID="62352a48da89ccc6cfd8c19cb5cf8a889c5db51bdd9feb57940f59c69b9679ad" Mar 13 09:33:56 crc kubenswrapper[4841]: E0313 09:33:56.267633 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62352a48da89ccc6cfd8c19cb5cf8a889c5db51bdd9feb57940f59c69b9679ad\": container with ID starting with 62352a48da89ccc6cfd8c19cb5cf8a889c5db51bdd9feb57940f59c69b9679ad not found: ID does not exist" containerID="62352a48da89ccc6cfd8c19cb5cf8a889c5db51bdd9feb57940f59c69b9679ad" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.267672 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62352a48da89ccc6cfd8c19cb5cf8a889c5db51bdd9feb57940f59c69b9679ad"} err="failed to get container status \"62352a48da89ccc6cfd8c19cb5cf8a889c5db51bdd9feb57940f59c69b9679ad\": rpc error: code = NotFound desc = could not find container \"62352a48da89ccc6cfd8c19cb5cf8a889c5db51bdd9feb57940f59c69b9679ad\": container with ID starting with 62352a48da89ccc6cfd8c19cb5cf8a889c5db51bdd9feb57940f59c69b9679ad not found: ID does not exist" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.267700 4841 scope.go:117] "RemoveContainer" containerID="8c6c924ce9dc6b28d008df6a7182813c4d0857c708000d2b5b355f99bbf443ef" Mar 13 09:33:56 crc 
kubenswrapper[4841]: E0313 09:33:56.268013 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c6c924ce9dc6b28d008df6a7182813c4d0857c708000d2b5b355f99bbf443ef\": container with ID starting with 8c6c924ce9dc6b28d008df6a7182813c4d0857c708000d2b5b355f99bbf443ef not found: ID does not exist" containerID="8c6c924ce9dc6b28d008df6a7182813c4d0857c708000d2b5b355f99bbf443ef" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.268057 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6c924ce9dc6b28d008df6a7182813c4d0857c708000d2b5b355f99bbf443ef"} err="failed to get container status \"8c6c924ce9dc6b28d008df6a7182813c4d0857c708000d2b5b355f99bbf443ef\": rpc error: code = NotFound desc = could not find container \"8c6c924ce9dc6b28d008df6a7182813c4d0857c708000d2b5b355f99bbf443ef\": container with ID starting with 8c6c924ce9dc6b28d008df6a7182813c4d0857c708000d2b5b355f99bbf443ef not found: ID does not exist" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.268075 4841 scope.go:117] "RemoveContainer" containerID="62352a48da89ccc6cfd8c19cb5cf8a889c5db51bdd9feb57940f59c69b9679ad" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.268305 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62352a48da89ccc6cfd8c19cb5cf8a889c5db51bdd9feb57940f59c69b9679ad"} err="failed to get container status \"62352a48da89ccc6cfd8c19cb5cf8a889c5db51bdd9feb57940f59c69b9679ad\": rpc error: code = NotFound desc = could not find container \"62352a48da89ccc6cfd8c19cb5cf8a889c5db51bdd9feb57940f59c69b9679ad\": container with ID starting with 62352a48da89ccc6cfd8c19cb5cf8a889c5db51bdd9feb57940f59c69b9679ad not found: ID does not exist" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.268332 4841 scope.go:117] "RemoveContainer" containerID="8c6c924ce9dc6b28d008df6a7182813c4d0857c708000d2b5b355f99bbf443ef" Mar 13 
09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.268568 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6c924ce9dc6b28d008df6a7182813c4d0857c708000d2b5b355f99bbf443ef"} err="failed to get container status \"8c6c924ce9dc6b28d008df6a7182813c4d0857c708000d2b5b355f99bbf443ef\": rpc error: code = NotFound desc = could not find container \"8c6c924ce9dc6b28d008df6a7182813c4d0857c708000d2b5b355f99bbf443ef\": container with ID starting with 8c6c924ce9dc6b28d008df6a7182813c4d0857c708000d2b5b355f99bbf443ef not found: ID does not exist" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.347309 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd08b297-2046-4795-925f-422fa3f6b492-logs\") pod \"nova-metadata-0\" (UID: \"cd08b297-2046-4795-925f-422fa3f6b492\") " pod="openstack/nova-metadata-0" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.347441 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4svd\" (UniqueName: \"kubernetes.io/projected/cd08b297-2046-4795-925f-422fa3f6b492-kube-api-access-n4svd\") pod \"nova-metadata-0\" (UID: \"cd08b297-2046-4795-925f-422fa3f6b492\") " pod="openstack/nova-metadata-0" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.347599 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd08b297-2046-4795-925f-422fa3f6b492-config-data\") pod \"nova-metadata-0\" (UID: \"cd08b297-2046-4795-925f-422fa3f6b492\") " pod="openstack/nova-metadata-0" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.347654 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cd08b297-2046-4795-925f-422fa3f6b492-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cd08b297-2046-4795-925f-422fa3f6b492\") " pod="openstack/nova-metadata-0" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.348384 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd08b297-2046-4795-925f-422fa3f6b492-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cd08b297-2046-4795-925f-422fa3f6b492\") " pod="openstack/nova-metadata-0" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.451127 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd08b297-2046-4795-925f-422fa3f6b492-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cd08b297-2046-4795-925f-422fa3f6b492\") " pod="openstack/nova-metadata-0" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.451680 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd08b297-2046-4795-925f-422fa3f6b492-logs\") pod \"nova-metadata-0\" (UID: \"cd08b297-2046-4795-925f-422fa3f6b492\") " pod="openstack/nova-metadata-0" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.451774 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4svd\" (UniqueName: \"kubernetes.io/projected/cd08b297-2046-4795-925f-422fa3f6b492-kube-api-access-n4svd\") pod \"nova-metadata-0\" (UID: \"cd08b297-2046-4795-925f-422fa3f6b492\") " pod="openstack/nova-metadata-0" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.451857 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd08b297-2046-4795-925f-422fa3f6b492-config-data\") pod \"nova-metadata-0\" (UID: \"cd08b297-2046-4795-925f-422fa3f6b492\") " 
pod="openstack/nova-metadata-0" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.451893 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd08b297-2046-4795-925f-422fa3f6b492-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cd08b297-2046-4795-925f-422fa3f6b492\") " pod="openstack/nova-metadata-0" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.452324 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd08b297-2046-4795-925f-422fa3f6b492-logs\") pod \"nova-metadata-0\" (UID: \"cd08b297-2046-4795-925f-422fa3f6b492\") " pod="openstack/nova-metadata-0" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.457455 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd08b297-2046-4795-925f-422fa3f6b492-config-data\") pod \"nova-metadata-0\" (UID: \"cd08b297-2046-4795-925f-422fa3f6b492\") " pod="openstack/nova-metadata-0" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.460008 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd08b297-2046-4795-925f-422fa3f6b492-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cd08b297-2046-4795-925f-422fa3f6b492\") " pod="openstack/nova-metadata-0" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.462924 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd08b297-2046-4795-925f-422fa3f6b492-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cd08b297-2046-4795-925f-422fa3f6b492\") " pod="openstack/nova-metadata-0" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.482164 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4svd\" (UniqueName: 
\"kubernetes.io/projected/cd08b297-2046-4795-925f-422fa3f6b492-kube-api-access-n4svd\") pod \"nova-metadata-0\" (UID: \"cd08b297-2046-4795-925f-422fa3f6b492\") " pod="openstack/nova-metadata-0" Mar 13 09:33:56 crc kubenswrapper[4841]: I0313 09:33:56.605469 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 09:33:57 crc kubenswrapper[4841]: W0313 09:33:57.090220 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd08b297_2046_4795_925f_422fa3f6b492.slice/crio-cc8f6c1e2df47784ca2ca3d579ee807a566f5a7020ed7ff170b3fe1690faf6b2 WatchSource:0}: Error finding container cc8f6c1e2df47784ca2ca3d579ee807a566f5a7020ed7ff170b3fe1690faf6b2: Status 404 returned error can't find the container with id cc8f6c1e2df47784ca2ca3d579ee807a566f5a7020ed7ff170b3fe1690faf6b2 Mar 13 09:33:57 crc kubenswrapper[4841]: I0313 09:33:57.090484 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 09:33:57 crc kubenswrapper[4841]: I0313 09:33:57.182052 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd08b297-2046-4795-925f-422fa3f6b492","Type":"ContainerStarted","Data":"cc8f6c1e2df47784ca2ca3d579ee807a566f5a7020ed7ff170b3fe1690faf6b2"} Mar 13 09:33:58 crc kubenswrapper[4841]: I0313 09:33:58.042736 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67aede02-5199-4ec9-82a1-e6af129221e8" path="/var/lib/kubelet/pods/67aede02-5199-4ec9-82a1-e6af129221e8/volumes" Mar 13 09:33:58 crc kubenswrapper[4841]: I0313 09:33:58.193559 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd08b297-2046-4795-925f-422fa3f6b492","Type":"ContainerStarted","Data":"0f34a6990ec775414fcc84287fda1133385a48bb803be9347743397a752a3c5d"} Mar 13 09:33:58 crc kubenswrapper[4841]: I0313 09:33:58.193883 4841 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd08b297-2046-4795-925f-422fa3f6b492","Type":"ContainerStarted","Data":"1472c1f347f201c17b522810d02c7684c6aa14e1bec548f5307ab01d67b8bf63"} Mar 13 09:33:58 crc kubenswrapper[4841]: I0313 09:33:58.216332 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.2162979 podStartE2EDuration="2.2162979s" podCreationTimestamp="2026-03-13 09:33:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:33:58.214809273 +0000 UTC m=+1320.944709474" watchObservedRunningTime="2026-03-13 09:33:58.2162979 +0000 UTC m=+1320.946198101" Mar 13 09:33:59 crc kubenswrapper[4841]: I0313 09:33:59.209819 4841 generic.go:334] "Generic (PLEG): container finished" podID="1b05d61d-3de0-4314-b5f3-07a447bc3465" containerID="35f357a42ea488cde71740138e6febbe553c221a7357e73b3001c8c27032231e" exitCode=0 Mar 13 09:33:59 crc kubenswrapper[4841]: I0313 09:33:59.211211 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rc4xv" event={"ID":"1b05d61d-3de0-4314-b5f3-07a447bc3465","Type":"ContainerDied","Data":"35f357a42ea488cde71740138e6febbe553c221a7357e73b3001c8c27032231e"} Mar 13 09:33:59 crc kubenswrapper[4841]: I0313 09:33:59.588163 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 09:33:59 crc kubenswrapper[4841]: I0313 09:33:59.588543 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 09:33:59 crc kubenswrapper[4841]: I0313 09:33:59.626126 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 09:33:59 crc kubenswrapper[4841]: I0313 09:33:59.626217 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-api-0" Mar 13 09:33:59 crc kubenswrapper[4841]: I0313 09:33:59.639352 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 09:33:59 crc kubenswrapper[4841]: I0313 09:33:59.808754 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.133385 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556574-cvgl5"] Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.134834 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556574-cvgl5" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.140466 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.140649 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.140816 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.141508 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.144062 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556574-cvgl5"] Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.224562 4841 generic.go:334] "Generic (PLEG): container finished" podID="e9f4bb2d-0844-4e66-8bc3-623168e07b9d" containerID="fa082b44c04a30b03f11db32a7fe268cae8b649f8398efae03159a38e74fbbc2" exitCode=0 Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.225406 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-xrtxl" event={"ID":"e9f4bb2d-0844-4e66-8bc3-623168e07b9d","Type":"ContainerDied","Data":"fa082b44c04a30b03f11db32a7fe268cae8b649f8398efae03159a38e74fbbc2"} Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.242768 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjx6k\" (UniqueName: \"kubernetes.io/projected/acb9502e-cdc5-4a7a-8faa-fe060876b3f2-kube-api-access-zjx6k\") pod \"auto-csr-approver-29556574-cvgl5\" (UID: \"acb9502e-cdc5-4a7a-8faa-fe060876b3f2\") " pod="openshift-infra/auto-csr-approver-29556574-cvgl5" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.264856 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-lqsbq"] Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.265193 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" podUID="9347a285-b6a7-46ba-9d5d-fd204673894b" containerName="dnsmasq-dns" containerID="cri-o://17189d8602436420a037b24e9cfdb7e231161da5c8d119c5b57e17a7ee984189" gracePeriod=10 Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.292513 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.344905 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjx6k\" (UniqueName: \"kubernetes.io/projected/acb9502e-cdc5-4a7a-8faa-fe060876b3f2-kube-api-access-zjx6k\") pod \"auto-csr-approver-29556574-cvgl5\" (UID: \"acb9502e-cdc5-4a7a-8faa-fe060876b3f2\") " pod="openshift-infra/auto-csr-approver-29556574-cvgl5" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.374393 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjx6k\" (UniqueName: 
\"kubernetes.io/projected/acb9502e-cdc5-4a7a-8faa-fe060876b3f2-kube-api-access-zjx6k\") pod \"auto-csr-approver-29556574-cvgl5\" (UID: \"acb9502e-cdc5-4a7a-8faa-fe060876b3f2\") " pod="openshift-infra/auto-csr-approver-29556574-cvgl5" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.491756 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" podUID="9347a285-b6a7-46ba-9d5d-fd204673894b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.179:5353: connect: connection refused" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.499775 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556574-cvgl5" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.685305 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rc4xv" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.725659 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.726152 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.760902 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.800881 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b05d61d-3de0-4314-b5f3-07a447bc3465-combined-ca-bundle\") pod \"1b05d61d-3de0-4314-b5f3-07a447bc3465\" (UID: \"1b05d61d-3de0-4314-b5f3-07a447bc3465\") " Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.800955 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b05d61d-3de0-4314-b5f3-07a447bc3465-scripts\") pod \"1b05d61d-3de0-4314-b5f3-07a447bc3465\" (UID: \"1b05d61d-3de0-4314-b5f3-07a447bc3465\") " Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.801627 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n8f2\" (UniqueName: \"kubernetes.io/projected/1b05d61d-3de0-4314-b5f3-07a447bc3465-kube-api-access-7n8f2\") pod \"1b05d61d-3de0-4314-b5f3-07a447bc3465\" (UID: \"1b05d61d-3de0-4314-b5f3-07a447bc3465\") " Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.801662 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b05d61d-3de0-4314-b5f3-07a447bc3465-config-data\") pod \"1b05d61d-3de0-4314-b5f3-07a447bc3465\" (UID: \"1b05d61d-3de0-4314-b5f3-07a447bc3465\") " Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.805512 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b05d61d-3de0-4314-b5f3-07a447bc3465-kube-api-access-7n8f2" (OuterVolumeSpecName: "kube-api-access-7n8f2") pod "1b05d61d-3de0-4314-b5f3-07a447bc3465" (UID: "1b05d61d-3de0-4314-b5f3-07a447bc3465"). InnerVolumeSpecName "kube-api-access-7n8f2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.807410 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b05d61d-3de0-4314-b5f3-07a447bc3465-scripts" (OuterVolumeSpecName: "scripts") pod "1b05d61d-3de0-4314-b5f3-07a447bc3465" (UID: "1b05d61d-3de0-4314-b5f3-07a447bc3465"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.831046 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b05d61d-3de0-4314-b5f3-07a447bc3465-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b05d61d-3de0-4314-b5f3-07a447bc3465" (UID: "1b05d61d-3de0-4314-b5f3-07a447bc3465"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.839442 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b05d61d-3de0-4314-b5f3-07a447bc3465-config-data" (OuterVolumeSpecName: "config-data") pod "1b05d61d-3de0-4314-b5f3-07a447bc3465" (UID: "1b05d61d-3de0-4314-b5f3-07a447bc3465"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.903181 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-ovsdbserver-sb\") pod \"9347a285-b6a7-46ba-9d5d-fd204673894b\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.903253 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-ovsdbserver-nb\") pod \"9347a285-b6a7-46ba-9d5d-fd204673894b\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.903329 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-config\") pod \"9347a285-b6a7-46ba-9d5d-fd204673894b\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.903354 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bw96\" (UniqueName: \"kubernetes.io/projected/9347a285-b6a7-46ba-9d5d-fd204673894b-kube-api-access-2bw96\") pod \"9347a285-b6a7-46ba-9d5d-fd204673894b\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.903418 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-dns-svc\") pod \"9347a285-b6a7-46ba-9d5d-fd204673894b\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.903481 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-dns-swift-storage-0\") pod \"9347a285-b6a7-46ba-9d5d-fd204673894b\" (UID: \"9347a285-b6a7-46ba-9d5d-fd204673894b\") " Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.905364 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b05d61d-3de0-4314-b5f3-07a447bc3465-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.905389 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b05d61d-3de0-4314-b5f3-07a447bc3465-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.905405 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b05d61d-3de0-4314-b5f3-07a447bc3465-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.905416 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n8f2\" (UniqueName: \"kubernetes.io/projected/1b05d61d-3de0-4314-b5f3-07a447bc3465-kube-api-access-7n8f2\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.910574 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9347a285-b6a7-46ba-9d5d-fd204673894b-kube-api-access-2bw96" (OuterVolumeSpecName: "kube-api-access-2bw96") pod "9347a285-b6a7-46ba-9d5d-fd204673894b" (UID: "9347a285-b6a7-46ba-9d5d-fd204673894b"). InnerVolumeSpecName "kube-api-access-2bw96". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.952504 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9347a285-b6a7-46ba-9d5d-fd204673894b" (UID: "9347a285-b6a7-46ba-9d5d-fd204673894b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.955159 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9347a285-b6a7-46ba-9d5d-fd204673894b" (UID: "9347a285-b6a7-46ba-9d5d-fd204673894b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.956679 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9347a285-b6a7-46ba-9d5d-fd204673894b" (UID: "9347a285-b6a7-46ba-9d5d-fd204673894b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.959030 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-config" (OuterVolumeSpecName: "config") pod "9347a285-b6a7-46ba-9d5d-fd204673894b" (UID: "9347a285-b6a7-46ba-9d5d-fd204673894b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:34:00 crc kubenswrapper[4841]: I0313 09:34:00.980525 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9347a285-b6a7-46ba-9d5d-fd204673894b" (UID: "9347a285-b6a7-46ba-9d5d-fd204673894b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.007166 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.007199 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.007210 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.007219 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.007231 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9347a285-b6a7-46ba-9d5d-fd204673894b-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.007240 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bw96\" (UniqueName: 
\"kubernetes.io/projected/9347a285-b6a7-46ba-9d5d-fd204673894b-kube-api-access-2bw96\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:01 crc kubenswrapper[4841]: W0313 09:34:01.008041 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacb9502e_cdc5_4a7a_8faa_fe060876b3f2.slice/crio-40bf61b7cf64ea2c68306976034bfbcfbd98ab1010af04f017ad77afd9f9d4d2 WatchSource:0}: Error finding container 40bf61b7cf64ea2c68306976034bfbcfbd98ab1010af04f017ad77afd9f9d4d2: Status 404 returned error can't find the container with id 40bf61b7cf64ea2c68306976034bfbcfbd98ab1010af04f017ad77afd9f9d4d2 Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.013612 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556574-cvgl5"] Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.241137 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rc4xv" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.241158 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rc4xv" event={"ID":"1b05d61d-3de0-4314-b5f3-07a447bc3465","Type":"ContainerDied","Data":"0e0ba135ff1ef5ad1aa737bd5b92357956e3c34207f596c39b9db966ca1867e5"} Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.241604 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e0ba135ff1ef5ad1aa737bd5b92357956e3c34207f596c39b9db966ca1867e5" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.244414 4841 generic.go:334] "Generic (PLEG): container finished" podID="9347a285-b6a7-46ba-9d5d-fd204673894b" containerID="17189d8602436420a037b24e9cfdb7e231161da5c8d119c5b57e17a7ee984189" exitCode=0 Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.244484 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.244539 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" event={"ID":"9347a285-b6a7-46ba-9d5d-fd204673894b","Type":"ContainerDied","Data":"17189d8602436420a037b24e9cfdb7e231161da5c8d119c5b57e17a7ee984189"} Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.244573 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-lqsbq" event={"ID":"9347a285-b6a7-46ba-9d5d-fd204673894b","Type":"ContainerDied","Data":"bf295b20a125ce70fedb3c6f2e7aba0b1f16b6d67cc2841462f7b42b1af412da"} Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.244594 4841 scope.go:117] "RemoveContainer" containerID="17189d8602436420a037b24e9cfdb7e231161da5c8d119c5b57e17a7ee984189" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.258062 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556574-cvgl5" event={"ID":"acb9502e-cdc5-4a7a-8faa-fe060876b3f2","Type":"ContainerStarted","Data":"40bf61b7cf64ea2c68306976034bfbcfbd98ab1010af04f017ad77afd9f9d4d2"} Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.302872 4841 scope.go:117] "RemoveContainer" containerID="e18228d36d65e70c1741a2fb877f22a1b30b678d3244bf87556dbd8fdce02e3b" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.306412 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-lqsbq"] Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.318052 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-lqsbq"] Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.331044 4841 scope.go:117] "RemoveContainer" containerID="17189d8602436420a037b24e9cfdb7e231161da5c8d119c5b57e17a7ee984189" Mar 13 09:34:01 crc kubenswrapper[4841]: E0313 09:34:01.331871 4841 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"17189d8602436420a037b24e9cfdb7e231161da5c8d119c5b57e17a7ee984189\": container with ID starting with 17189d8602436420a037b24e9cfdb7e231161da5c8d119c5b57e17a7ee984189 not found: ID does not exist" containerID="17189d8602436420a037b24e9cfdb7e231161da5c8d119c5b57e17a7ee984189" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.331905 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17189d8602436420a037b24e9cfdb7e231161da5c8d119c5b57e17a7ee984189"} err="failed to get container status \"17189d8602436420a037b24e9cfdb7e231161da5c8d119c5b57e17a7ee984189\": rpc error: code = NotFound desc = could not find container \"17189d8602436420a037b24e9cfdb7e231161da5c8d119c5b57e17a7ee984189\": container with ID starting with 17189d8602436420a037b24e9cfdb7e231161da5c8d119c5b57e17a7ee984189 not found: ID does not exist" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.331931 4841 scope.go:117] "RemoveContainer" containerID="e18228d36d65e70c1741a2fb877f22a1b30b678d3244bf87556dbd8fdce02e3b" Mar 13 09:34:01 crc kubenswrapper[4841]: E0313 09:34:01.332237 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e18228d36d65e70c1741a2fb877f22a1b30b678d3244bf87556dbd8fdce02e3b\": container with ID starting with e18228d36d65e70c1741a2fb877f22a1b30b678d3244bf87556dbd8fdce02e3b not found: ID does not exist" containerID="e18228d36d65e70c1741a2fb877f22a1b30b678d3244bf87556dbd8fdce02e3b" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.332344 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e18228d36d65e70c1741a2fb877f22a1b30b678d3244bf87556dbd8fdce02e3b"} err="failed to get container status \"e18228d36d65e70c1741a2fb877f22a1b30b678d3244bf87556dbd8fdce02e3b\": rpc error: code = NotFound desc = could not find container 
\"e18228d36d65e70c1741a2fb877f22a1b30b678d3244bf87556dbd8fdce02e3b\": container with ID starting with e18228d36d65e70c1741a2fb877f22a1b30b678d3244bf87556dbd8fdce02e3b not found: ID does not exist" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.408425 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.408919 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6" containerName="nova-api-api" containerID="cri-o://f705fdde438a54c3dacdf66858c95bdc1688755e39f1bf22626f1c3e8fce7515" gracePeriod=30 Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.409225 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6" containerName="nova-api-log" containerID="cri-o://2a8b0283ff3e9725aef806b4c4fc6c97d933e5c4784cc189b6f3711683df2547" gracePeriod=30 Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.436328 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.543750 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.543952 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cd08b297-2046-4795-925f-422fa3f6b492" containerName="nova-metadata-log" containerID="cri-o://1472c1f347f201c17b522810d02c7684c6aa14e1bec548f5307ab01d67b8bf63" gracePeriod=30 Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.544343 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cd08b297-2046-4795-925f-422fa3f6b492" containerName="nova-metadata-metadata" 
containerID="cri-o://0f34a6990ec775414fcc84287fda1133385a48bb803be9347743397a752a3c5d" gracePeriod=30 Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.606139 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.606427 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.710039 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xrtxl" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.845446 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-combined-ca-bundle\") pod \"e9f4bb2d-0844-4e66-8bc3-623168e07b9d\" (UID: \"e9f4bb2d-0844-4e66-8bc3-623168e07b9d\") " Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.845620 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-scripts\") pod \"e9f4bb2d-0844-4e66-8bc3-623168e07b9d\" (UID: \"e9f4bb2d-0844-4e66-8bc3-623168e07b9d\") " Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.845724 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-config-data\") pod \"e9f4bb2d-0844-4e66-8bc3-623168e07b9d\" (UID: \"e9f4bb2d-0844-4e66-8bc3-623168e07b9d\") " Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.845775 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6ktf\" (UniqueName: \"kubernetes.io/projected/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-kube-api-access-g6ktf\") pod \"e9f4bb2d-0844-4e66-8bc3-623168e07b9d\" (UID: 
\"e9f4bb2d-0844-4e66-8bc3-623168e07b9d\") " Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.852611 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-kube-api-access-g6ktf" (OuterVolumeSpecName: "kube-api-access-g6ktf") pod "e9f4bb2d-0844-4e66-8bc3-623168e07b9d" (UID: "e9f4bb2d-0844-4e66-8bc3-623168e07b9d"). InnerVolumeSpecName "kube-api-access-g6ktf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.859396 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-scripts" (OuterVolumeSpecName: "scripts") pod "e9f4bb2d-0844-4e66-8bc3-623168e07b9d" (UID: "e9f4bb2d-0844-4e66-8bc3-623168e07b9d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.883060 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-config-data" (OuterVolumeSpecName: "config-data") pod "e9f4bb2d-0844-4e66-8bc3-623168e07b9d" (UID: "e9f4bb2d-0844-4e66-8bc3-623168e07b9d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.884688 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9f4bb2d-0844-4e66-8bc3-623168e07b9d" (UID: "e9f4bb2d-0844-4e66-8bc3-623168e07b9d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.947584 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.948148 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.948169 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:01 crc kubenswrapper[4841]: I0313 09:34:01.948180 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6ktf\" (UniqueName: \"kubernetes.io/projected/e9f4bb2d-0844-4e66-8bc3-623168e07b9d-kube-api-access-g6ktf\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.011412 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9347a285-b6a7-46ba-9d5d-fd204673894b" path="/var/lib/kubelet/pods/9347a285-b6a7-46ba-9d5d-fd204673894b/volumes" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.071711 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.259458 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd08b297-2046-4795-925f-422fa3f6b492-combined-ca-bundle\") pod \"cd08b297-2046-4795-925f-422fa3f6b492\" (UID: \"cd08b297-2046-4795-925f-422fa3f6b492\") " Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.259784 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd08b297-2046-4795-925f-422fa3f6b492-logs\") pod \"cd08b297-2046-4795-925f-422fa3f6b492\" (UID: \"cd08b297-2046-4795-925f-422fa3f6b492\") " Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.259915 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd08b297-2046-4795-925f-422fa3f6b492-config-data\") pod \"cd08b297-2046-4795-925f-422fa3f6b492\" (UID: \"cd08b297-2046-4795-925f-422fa3f6b492\") " Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.260151 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd08b297-2046-4795-925f-422fa3f6b492-nova-metadata-tls-certs\") pod \"cd08b297-2046-4795-925f-422fa3f6b492\" (UID: \"cd08b297-2046-4795-925f-422fa3f6b492\") " Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.260197 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4svd\" (UniqueName: \"kubernetes.io/projected/cd08b297-2046-4795-925f-422fa3f6b492-kube-api-access-n4svd\") pod \"cd08b297-2046-4795-925f-422fa3f6b492\" (UID: \"cd08b297-2046-4795-925f-422fa3f6b492\") " Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.265639 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/cd08b297-2046-4795-925f-422fa3f6b492-logs" (OuterVolumeSpecName: "logs") pod "cd08b297-2046-4795-925f-422fa3f6b492" (UID: "cd08b297-2046-4795-925f-422fa3f6b492"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.265997 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd08b297-2046-4795-925f-422fa3f6b492-kube-api-access-n4svd" (OuterVolumeSpecName: "kube-api-access-n4svd") pod "cd08b297-2046-4795-925f-422fa3f6b492" (UID: "cd08b297-2046-4795-925f-422fa3f6b492"). InnerVolumeSpecName "kube-api-access-n4svd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.282131 4841 generic.go:334] "Generic (PLEG): container finished" podID="cd08b297-2046-4795-925f-422fa3f6b492" containerID="0f34a6990ec775414fcc84287fda1133385a48bb803be9347743397a752a3c5d" exitCode=0 Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.282161 4841 generic.go:334] "Generic (PLEG): container finished" podID="cd08b297-2046-4795-925f-422fa3f6b492" containerID="1472c1f347f201c17b522810d02c7684c6aa14e1bec548f5307ab01d67b8bf63" exitCode=143 Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.282217 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd08b297-2046-4795-925f-422fa3f6b492","Type":"ContainerDied","Data":"0f34a6990ec775414fcc84287fda1133385a48bb803be9347743397a752a3c5d"} Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.282242 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd08b297-2046-4795-925f-422fa3f6b492","Type":"ContainerDied","Data":"1472c1f347f201c17b522810d02c7684c6aa14e1bec548f5307ab01d67b8bf63"} Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.282294 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"cd08b297-2046-4795-925f-422fa3f6b492","Type":"ContainerDied","Data":"cc8f6c1e2df47784ca2ca3d579ee807a566f5a7020ed7ff170b3fe1690faf6b2"} Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.282310 4841 scope.go:117] "RemoveContainer" containerID="0f34a6990ec775414fcc84287fda1133385a48bb803be9347743397a752a3c5d" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.282424 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.295562 4841 generic.go:334] "Generic (PLEG): container finished" podID="d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6" containerID="2a8b0283ff3e9725aef806b4c4fc6c97d933e5c4784cc189b6f3711683df2547" exitCode=143 Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.295667 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6","Type":"ContainerDied","Data":"2a8b0283ff3e9725aef806b4c4fc6c97d933e5c4784cc189b6f3711683df2547"} Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.300043 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xrtxl" event={"ID":"e9f4bb2d-0844-4e66-8bc3-623168e07b9d","Type":"ContainerDied","Data":"852c8a72fdeb31e758d79426438b175342827c9382bed4e7ad41fe51589f97a4"} Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.300081 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="852c8a72fdeb31e758d79426438b175342827c9382bed4e7ad41fe51589f97a4" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.300155 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xrtxl" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.310059 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd08b297-2046-4795-925f-422fa3f6b492-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd08b297-2046-4795-925f-422fa3f6b492" (UID: "cd08b297-2046-4795-925f-422fa3f6b492"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.332327 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="867ddec1-6a02-4749-8cf0-259eff17fbd5" containerName="nova-scheduler-scheduler" containerID="cri-o://9f996c7530cc13eb53c38dc0da8ea520c6125b627b48a211c6a5151a8d2123e8" gracePeriod=30 Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.332625 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556574-cvgl5" event={"ID":"acb9502e-cdc5-4a7a-8faa-fe060876b3f2","Type":"ContainerStarted","Data":"bd33c110c3f9a3d4d4a1f5614a7a700d7b5e2733c8d70d29dd581468d7573150"} Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.333749 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd08b297-2046-4795-925f-422fa3f6b492-config-data" (OuterVolumeSpecName: "config-data") pod "cd08b297-2046-4795-925f-422fa3f6b492" (UID: "cd08b297-2046-4795-925f-422fa3f6b492"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.343941 4841 scope.go:117] "RemoveContainer" containerID="1472c1f347f201c17b522810d02c7684c6aa14e1bec548f5307ab01d67b8bf63" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.351727 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd08b297-2046-4795-925f-422fa3f6b492-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "cd08b297-2046-4795-925f-422fa3f6b492" (UID: "cd08b297-2046-4795-925f-422fa3f6b492"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.362402 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 09:34:02 crc kubenswrapper[4841]: E0313 09:34:02.362817 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9347a285-b6a7-46ba-9d5d-fd204673894b" containerName="dnsmasq-dns" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.362829 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9347a285-b6a7-46ba-9d5d-fd204673894b" containerName="dnsmasq-dns" Mar 13 09:34:02 crc kubenswrapper[4841]: E0313 09:34:02.362843 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd08b297-2046-4795-925f-422fa3f6b492" containerName="nova-metadata-metadata" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.362851 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd08b297-2046-4795-925f-422fa3f6b492" containerName="nova-metadata-metadata" Mar 13 09:34:02 crc kubenswrapper[4841]: E0313 09:34:02.362868 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9f4bb2d-0844-4e66-8bc3-623168e07b9d" containerName="nova-cell1-conductor-db-sync" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.362874 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e9f4bb2d-0844-4e66-8bc3-623168e07b9d" containerName="nova-cell1-conductor-db-sync" Mar 13 09:34:02 crc kubenswrapper[4841]: E0313 09:34:02.362896 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd08b297-2046-4795-925f-422fa3f6b492" containerName="nova-metadata-log" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.362901 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd08b297-2046-4795-925f-422fa3f6b492" containerName="nova-metadata-log" Mar 13 09:34:02 crc kubenswrapper[4841]: E0313 09:34:02.362914 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b05d61d-3de0-4314-b5f3-07a447bc3465" containerName="nova-manage" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.362920 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b05d61d-3de0-4314-b5f3-07a447bc3465" containerName="nova-manage" Mar 13 09:34:02 crc kubenswrapper[4841]: E0313 09:34:02.362930 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9347a285-b6a7-46ba-9d5d-fd204673894b" containerName="init" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.362936 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9347a285-b6a7-46ba-9d5d-fd204673894b" containerName="init" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.363103 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9f4bb2d-0844-4e66-8bc3-623168e07b9d" containerName="nova-cell1-conductor-db-sync" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.363118 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd08b297-2046-4795-925f-422fa3f6b492" containerName="nova-metadata-log" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.363131 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd08b297-2046-4795-925f-422fa3f6b492" containerName="nova-metadata-metadata" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.363140 4841 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="9347a285-b6a7-46ba-9d5d-fd204673894b" containerName="dnsmasq-dns" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.363157 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b05d61d-3de0-4314-b5f3-07a447bc3465" containerName="nova-manage" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.363748 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.365453 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd08b297-2046-4795-925f-422fa3f6b492-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.365468 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd08b297-2046-4795-925f-422fa3f6b492-logs\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.365476 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd08b297-2046-4795-925f-422fa3f6b492-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.365485 4841 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd08b297-2046-4795-925f-422fa3f6b492-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.365495 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4svd\" (UniqueName: \"kubernetes.io/projected/cd08b297-2046-4795-925f-422fa3f6b492-kube-api-access-n4svd\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.368282 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 13 
09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.380606 4841 scope.go:117] "RemoveContainer" containerID="0f34a6990ec775414fcc84287fda1133385a48bb803be9347743397a752a3c5d" Mar 13 09:34:02 crc kubenswrapper[4841]: E0313 09:34:02.382255 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f34a6990ec775414fcc84287fda1133385a48bb803be9347743397a752a3c5d\": container with ID starting with 0f34a6990ec775414fcc84287fda1133385a48bb803be9347743397a752a3c5d not found: ID does not exist" containerID="0f34a6990ec775414fcc84287fda1133385a48bb803be9347743397a752a3c5d" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.382311 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f34a6990ec775414fcc84287fda1133385a48bb803be9347743397a752a3c5d"} err="failed to get container status \"0f34a6990ec775414fcc84287fda1133385a48bb803be9347743397a752a3c5d\": rpc error: code = NotFound desc = could not find container \"0f34a6990ec775414fcc84287fda1133385a48bb803be9347743397a752a3c5d\": container with ID starting with 0f34a6990ec775414fcc84287fda1133385a48bb803be9347743397a752a3c5d not found: ID does not exist" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.382334 4841 scope.go:117] "RemoveContainer" containerID="1472c1f347f201c17b522810d02c7684c6aa14e1bec548f5307ab01d67b8bf63" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.383238 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 09:34:02 crc kubenswrapper[4841]: E0313 09:34:02.387862 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1472c1f347f201c17b522810d02c7684c6aa14e1bec548f5307ab01d67b8bf63\": container with ID starting with 1472c1f347f201c17b522810d02c7684c6aa14e1bec548f5307ab01d67b8bf63 not found: ID does not exist" 
containerID="1472c1f347f201c17b522810d02c7684c6aa14e1bec548f5307ab01d67b8bf63" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.387899 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1472c1f347f201c17b522810d02c7684c6aa14e1bec548f5307ab01d67b8bf63"} err="failed to get container status \"1472c1f347f201c17b522810d02c7684c6aa14e1bec548f5307ab01d67b8bf63\": rpc error: code = NotFound desc = could not find container \"1472c1f347f201c17b522810d02c7684c6aa14e1bec548f5307ab01d67b8bf63\": container with ID starting with 1472c1f347f201c17b522810d02c7684c6aa14e1bec548f5307ab01d67b8bf63 not found: ID does not exist" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.387927 4841 scope.go:117] "RemoveContainer" containerID="0f34a6990ec775414fcc84287fda1133385a48bb803be9347743397a752a3c5d" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.388403 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556574-cvgl5" podStartSLOduration=1.5736393400000002 podStartE2EDuration="2.388389831s" podCreationTimestamp="2026-03-13 09:34:00 +0000 UTC" firstStartedPulling="2026-03-13 09:34:01.010391787 +0000 UTC m=+1323.740291988" lastFinishedPulling="2026-03-13 09:34:01.825142288 +0000 UTC m=+1324.555042479" observedRunningTime="2026-03-13 09:34:02.36207976 +0000 UTC m=+1325.091979951" watchObservedRunningTime="2026-03-13 09:34:02.388389831 +0000 UTC m=+1325.118290022" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.397798 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f34a6990ec775414fcc84287fda1133385a48bb803be9347743397a752a3c5d"} err="failed to get container status \"0f34a6990ec775414fcc84287fda1133385a48bb803be9347743397a752a3c5d\": rpc error: code = NotFound desc = could not find container \"0f34a6990ec775414fcc84287fda1133385a48bb803be9347743397a752a3c5d\": container with ID starting with 
0f34a6990ec775414fcc84287fda1133385a48bb803be9347743397a752a3c5d not found: ID does not exist" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.397837 4841 scope.go:117] "RemoveContainer" containerID="1472c1f347f201c17b522810d02c7684c6aa14e1bec548f5307ab01d67b8bf63" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.401376 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1472c1f347f201c17b522810d02c7684c6aa14e1bec548f5307ab01d67b8bf63"} err="failed to get container status \"1472c1f347f201c17b522810d02c7684c6aa14e1bec548f5307ab01d67b8bf63\": rpc error: code = NotFound desc = could not find container \"1472c1f347f201c17b522810d02c7684c6aa14e1bec548f5307ab01d67b8bf63\": container with ID starting with 1472c1f347f201c17b522810d02c7684c6aa14e1bec548f5307ab01d67b8bf63 not found: ID does not exist" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.467107 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f70f1b2-8e4f-4738-9b35-2a5e75f92988-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8f70f1b2-8e4f-4738-9b35-2a5e75f92988\") " pod="openstack/nova-cell1-conductor-0" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.467410 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f70f1b2-8e4f-4738-9b35-2a5e75f92988-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8f70f1b2-8e4f-4738-9b35-2a5e75f92988\") " pod="openstack/nova-cell1-conductor-0" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.467552 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99zsm\" (UniqueName: \"kubernetes.io/projected/8f70f1b2-8e4f-4738-9b35-2a5e75f92988-kube-api-access-99zsm\") pod \"nova-cell1-conductor-0\" (UID: 
\"8f70f1b2-8e4f-4738-9b35-2a5e75f92988\") " pod="openstack/nova-cell1-conductor-0" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.569291 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f70f1b2-8e4f-4738-9b35-2a5e75f92988-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8f70f1b2-8e4f-4738-9b35-2a5e75f92988\") " pod="openstack/nova-cell1-conductor-0" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.569340 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f70f1b2-8e4f-4738-9b35-2a5e75f92988-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8f70f1b2-8e4f-4738-9b35-2a5e75f92988\") " pod="openstack/nova-cell1-conductor-0" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.569407 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99zsm\" (UniqueName: \"kubernetes.io/projected/8f70f1b2-8e4f-4738-9b35-2a5e75f92988-kube-api-access-99zsm\") pod \"nova-cell1-conductor-0\" (UID: \"8f70f1b2-8e4f-4738-9b35-2a5e75f92988\") " pod="openstack/nova-cell1-conductor-0" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.573055 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f70f1b2-8e4f-4738-9b35-2a5e75f92988-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8f70f1b2-8e4f-4738-9b35-2a5e75f92988\") " pod="openstack/nova-cell1-conductor-0" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.573311 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f70f1b2-8e4f-4738-9b35-2a5e75f92988-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8f70f1b2-8e4f-4738-9b35-2a5e75f92988\") " pod="openstack/nova-cell1-conductor-0" Mar 13 09:34:02 crc kubenswrapper[4841]: 
I0313 09:34:02.584055 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99zsm\" (UniqueName: \"kubernetes.io/projected/8f70f1b2-8e4f-4738-9b35-2a5e75f92988-kube-api-access-99zsm\") pod \"nova-cell1-conductor-0\" (UID: \"8f70f1b2-8e4f-4738-9b35-2a5e75f92988\") " pod="openstack/nova-cell1-conductor-0" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.751191 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.756385 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.771105 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.783302 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.784819 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.786841 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.793549 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.819112 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.975958 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-config-data\") pod \"nova-metadata-0\" (UID: \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\") " pod="openstack/nova-metadata-0" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.976358 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\") " pod="openstack/nova-metadata-0" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.976381 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-logs\") pod \"nova-metadata-0\" (UID: \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\") " pod="openstack/nova-metadata-0" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.976414 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkmwb\" (UniqueName: \"kubernetes.io/projected/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-kube-api-access-zkmwb\") pod \"nova-metadata-0\" 
(UID: \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\") " pod="openstack/nova-metadata-0" Mar 13 09:34:02 crc kubenswrapper[4841]: I0313 09:34:02.976443 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\") " pod="openstack/nova-metadata-0" Mar 13 09:34:03 crc kubenswrapper[4841]: I0313 09:34:03.081374 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-config-data\") pod \"nova-metadata-0\" (UID: \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\") " pod="openstack/nova-metadata-0" Mar 13 09:34:03 crc kubenswrapper[4841]: I0313 09:34:03.081571 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\") " pod="openstack/nova-metadata-0" Mar 13 09:34:03 crc kubenswrapper[4841]: I0313 09:34:03.081595 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-logs\") pod \"nova-metadata-0\" (UID: \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\") " pod="openstack/nova-metadata-0" Mar 13 09:34:03 crc kubenswrapper[4841]: I0313 09:34:03.081653 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkmwb\" (UniqueName: \"kubernetes.io/projected/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-kube-api-access-zkmwb\") pod \"nova-metadata-0\" (UID: \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\") " pod="openstack/nova-metadata-0" Mar 13 09:34:03 crc kubenswrapper[4841]: I0313 09:34:03.081689 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\") " pod="openstack/nova-metadata-0" Mar 13 09:34:03 crc kubenswrapper[4841]: I0313 09:34:03.082504 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-logs\") pod \"nova-metadata-0\" (UID: \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\") " pod="openstack/nova-metadata-0" Mar 13 09:34:03 crc kubenswrapper[4841]: I0313 09:34:03.089507 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\") " pod="openstack/nova-metadata-0" Mar 13 09:34:03 crc kubenswrapper[4841]: I0313 09:34:03.089952 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\") " pod="openstack/nova-metadata-0" Mar 13 09:34:03 crc kubenswrapper[4841]: I0313 09:34:03.091089 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-config-data\") pod \"nova-metadata-0\" (UID: \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\") " pod="openstack/nova-metadata-0" Mar 13 09:34:03 crc kubenswrapper[4841]: I0313 09:34:03.104421 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkmwb\" (UniqueName: \"kubernetes.io/projected/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-kube-api-access-zkmwb\") pod \"nova-metadata-0\" 
(UID: \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\") " pod="openstack/nova-metadata-0" Mar 13 09:34:03 crc kubenswrapper[4841]: I0313 09:34:03.195816 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 09:34:03 crc kubenswrapper[4841]: I0313 09:34:03.229777 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 09:34:03 crc kubenswrapper[4841]: W0313 09:34:03.234825 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f70f1b2_8e4f_4738_9b35_2a5e75f92988.slice/crio-df9f41a88681f399177afce2adb39370e82e5974407dca864367a2d0ffd74e37 WatchSource:0}: Error finding container df9f41a88681f399177afce2adb39370e82e5974407dca864367a2d0ffd74e37: Status 404 returned error can't find the container with id df9f41a88681f399177afce2adb39370e82e5974407dca864367a2d0ffd74e37 Mar 13 09:34:03 crc kubenswrapper[4841]: I0313 09:34:03.346653 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8f70f1b2-8e4f-4738-9b35-2a5e75f92988","Type":"ContainerStarted","Data":"df9f41a88681f399177afce2adb39370e82e5974407dca864367a2d0ffd74e37"} Mar 13 09:34:03 crc kubenswrapper[4841]: I0313 09:34:03.349849 4841 generic.go:334] "Generic (PLEG): container finished" podID="acb9502e-cdc5-4a7a-8faa-fe060876b3f2" containerID="bd33c110c3f9a3d4d4a1f5614a7a700d7b5e2733c8d70d29dd581468d7573150" exitCode=0 Mar 13 09:34:03 crc kubenswrapper[4841]: I0313 09:34:03.349892 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556574-cvgl5" event={"ID":"acb9502e-cdc5-4a7a-8faa-fe060876b3f2","Type":"ContainerDied","Data":"bd33c110c3f9a3d4d4a1f5614a7a700d7b5e2733c8d70d29dd581468d7573150"} Mar 13 09:34:03 crc kubenswrapper[4841]: I0313 09:34:03.663363 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 
09:34:03 crc kubenswrapper[4841]: W0313 09:34:03.664917 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cf6e9ec_e2c2_4bbf_bb4b_8e905c964093.slice/crio-6053b95fd0de2d3d9325fce52f3e6d9daac1e411625cadb686f997f6302481dc WatchSource:0}: Error finding container 6053b95fd0de2d3d9325fce52f3e6d9daac1e411625cadb686f997f6302481dc: Status 404 returned error can't find the container with id 6053b95fd0de2d3d9325fce52f3e6d9daac1e411625cadb686f997f6302481dc Mar 13 09:34:04 crc kubenswrapper[4841]: I0313 09:34:04.017493 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd08b297-2046-4795-925f-422fa3f6b492" path="/var/lib/kubelet/pods/cd08b297-2046-4795-925f-422fa3f6b492/volumes" Mar 13 09:34:04 crc kubenswrapper[4841]: I0313 09:34:04.362645 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8f70f1b2-8e4f-4738-9b35-2a5e75f92988","Type":"ContainerStarted","Data":"43bbf96f3535381d0079a376f7bb1d126fddaf5799e0d4d0012aad86d56951d2"} Mar 13 09:34:04 crc kubenswrapper[4841]: I0313 09:34:04.362785 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 13 09:34:04 crc kubenswrapper[4841]: I0313 09:34:04.364908 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093","Type":"ContainerStarted","Data":"393320de975d567c3900e1c4c84401c42d8cb0431be77d46457b7092d50e146d"} Mar 13 09:34:04 crc kubenswrapper[4841]: I0313 09:34:04.364937 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093","Type":"ContainerStarted","Data":"cb85a0d9ddf7885c6c797ff30dba64a75afba066f50fe9270905427d4c30e9fe"} Mar 13 09:34:04 crc kubenswrapper[4841]: I0313 09:34:04.364946 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093","Type":"ContainerStarted","Data":"6053b95fd0de2d3d9325fce52f3e6d9daac1e411625cadb686f997f6302481dc"} Mar 13 09:34:04 crc kubenswrapper[4841]: I0313 09:34:04.408530 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.40850731 podStartE2EDuration="2.40850731s" podCreationTimestamp="2026-03-13 09:34:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:34:04.386547475 +0000 UTC m=+1327.116447666" watchObservedRunningTime="2026-03-13 09:34:04.40850731 +0000 UTC m=+1327.138407501" Mar 13 09:34:05 crc kubenswrapper[4841]: E0313 09:34:05.049551 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9f996c7530cc13eb53c38dc0da8ea520c6125b627b48a211c6a5151a8d2123e8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 09:34:05 crc kubenswrapper[4841]: E0313 09:34:05.066701 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9f996c7530cc13eb53c38dc0da8ea520c6125b627b48a211c6a5151a8d2123e8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 09:34:05 crc kubenswrapper[4841]: E0313 09:34:05.073603 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9f996c7530cc13eb53c38dc0da8ea520c6125b627b48a211c6a5151a8d2123e8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 09:34:05 crc kubenswrapper[4841]: E0313 09:34:05.073657 4841 
prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="867ddec1-6a02-4749-8cf0-259eff17fbd5" containerName="nova-scheduler-scheduler" Mar 13 09:34:05 crc kubenswrapper[4841]: I0313 09:34:05.505322 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556574-cvgl5" Mar 13 09:34:05 crc kubenswrapper[4841]: I0313 09:34:05.525551 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.5255348619999998 podStartE2EDuration="3.525534862s" podCreationTimestamp="2026-03-13 09:34:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:34:04.413234837 +0000 UTC m=+1327.143135028" watchObservedRunningTime="2026-03-13 09:34:05.525534862 +0000 UTC m=+1328.255435053" Mar 13 09:34:05 crc kubenswrapper[4841]: I0313 09:34:05.642140 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjx6k\" (UniqueName: \"kubernetes.io/projected/acb9502e-cdc5-4a7a-8faa-fe060876b3f2-kube-api-access-zjx6k\") pod \"acb9502e-cdc5-4a7a-8faa-fe060876b3f2\" (UID: \"acb9502e-cdc5-4a7a-8faa-fe060876b3f2\") " Mar 13 09:34:05 crc kubenswrapper[4841]: I0313 09:34:05.648436 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acb9502e-cdc5-4a7a-8faa-fe060876b3f2-kube-api-access-zjx6k" (OuterVolumeSpecName: "kube-api-access-zjx6k") pod "acb9502e-cdc5-4a7a-8faa-fe060876b3f2" (UID: "acb9502e-cdc5-4a7a-8faa-fe060876b3f2"). InnerVolumeSpecName "kube-api-access-zjx6k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:34:05 crc kubenswrapper[4841]: I0313 09:34:05.744122 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjx6k\" (UniqueName: \"kubernetes.io/projected/acb9502e-cdc5-4a7a-8faa-fe060876b3f2-kube-api-access-zjx6k\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.169639 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.355595 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867ddec1-6a02-4749-8cf0-259eff17fbd5-combined-ca-bundle\") pod \"867ddec1-6a02-4749-8cf0-259eff17fbd5\" (UID: \"867ddec1-6a02-4749-8cf0-259eff17fbd5\") " Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.355673 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlf4t\" (UniqueName: \"kubernetes.io/projected/867ddec1-6a02-4749-8cf0-259eff17fbd5-kube-api-access-nlf4t\") pod \"867ddec1-6a02-4749-8cf0-259eff17fbd5\" (UID: \"867ddec1-6a02-4749-8cf0-259eff17fbd5\") " Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.355798 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/867ddec1-6a02-4749-8cf0-259eff17fbd5-config-data\") pod \"867ddec1-6a02-4749-8cf0-259eff17fbd5\" (UID: \"867ddec1-6a02-4749-8cf0-259eff17fbd5\") " Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.359658 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/867ddec1-6a02-4749-8cf0-259eff17fbd5-kube-api-access-nlf4t" (OuterVolumeSpecName: "kube-api-access-nlf4t") pod "867ddec1-6a02-4749-8cf0-259eff17fbd5" (UID: "867ddec1-6a02-4749-8cf0-259eff17fbd5"). InnerVolumeSpecName "kube-api-access-nlf4t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.382044 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/867ddec1-6a02-4749-8cf0-259eff17fbd5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "867ddec1-6a02-4749-8cf0-259eff17fbd5" (UID: "867ddec1-6a02-4749-8cf0-259eff17fbd5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.384694 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556574-cvgl5" event={"ID":"acb9502e-cdc5-4a7a-8faa-fe060876b3f2","Type":"ContainerDied","Data":"40bf61b7cf64ea2c68306976034bfbcfbd98ab1010af04f017ad77afd9f9d4d2"} Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.384818 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40bf61b7cf64ea2c68306976034bfbcfbd98ab1010af04f017ad77afd9f9d4d2" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.384868 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556574-cvgl5" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.386225 4841 generic.go:334] "Generic (PLEG): container finished" podID="867ddec1-6a02-4749-8cf0-259eff17fbd5" containerID="9f996c7530cc13eb53c38dc0da8ea520c6125b627b48a211c6a5151a8d2123e8" exitCode=0 Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.386354 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"867ddec1-6a02-4749-8cf0-259eff17fbd5","Type":"ContainerDied","Data":"9f996c7530cc13eb53c38dc0da8ea520c6125b627b48a211c6a5151a8d2123e8"} Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.386844 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"867ddec1-6a02-4749-8cf0-259eff17fbd5","Type":"ContainerDied","Data":"df8f769d093c2d3b3078340cd11be2c218b12da6827d73b78940454c0ca0c52d"} Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.386803 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.386978 4841 scope.go:117] "RemoveContainer" containerID="9f996c7530cc13eb53c38dc0da8ea520c6125b627b48a211c6a5151a8d2123e8" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.391058 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/867ddec1-6a02-4749-8cf0-259eff17fbd5-config-data" (OuterVolumeSpecName: "config-data") pod "867ddec1-6a02-4749-8cf0-259eff17fbd5" (UID: "867ddec1-6a02-4749-8cf0-259eff17fbd5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.458482 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867ddec1-6a02-4749-8cf0-259eff17fbd5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.458513 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlf4t\" (UniqueName: \"kubernetes.io/projected/867ddec1-6a02-4749-8cf0-259eff17fbd5-kube-api-access-nlf4t\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.458525 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/867ddec1-6a02-4749-8cf0-259eff17fbd5-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.508432 4841 scope.go:117] "RemoveContainer" containerID="9f996c7530cc13eb53c38dc0da8ea520c6125b627b48a211c6a5151a8d2123e8" Mar 13 09:34:06 crc kubenswrapper[4841]: E0313 09:34:06.508787 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f996c7530cc13eb53c38dc0da8ea520c6125b627b48a211c6a5151a8d2123e8\": container with ID starting with 9f996c7530cc13eb53c38dc0da8ea520c6125b627b48a211c6a5151a8d2123e8 not found: ID does not exist" containerID="9f996c7530cc13eb53c38dc0da8ea520c6125b627b48a211c6a5151a8d2123e8" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.508813 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f996c7530cc13eb53c38dc0da8ea520c6125b627b48a211c6a5151a8d2123e8"} err="failed to get container status \"9f996c7530cc13eb53c38dc0da8ea520c6125b627b48a211c6a5151a8d2123e8\": rpc error: code = NotFound desc = could not find container \"9f996c7530cc13eb53c38dc0da8ea520c6125b627b48a211c6a5151a8d2123e8\": container with ID 
starting with 9f996c7530cc13eb53c38dc0da8ea520c6125b627b48a211c6a5151a8d2123e8 not found: ID does not exist" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.577458 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556568-9n5mn"] Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.587388 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556568-9n5mn"] Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.717166 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.727099 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.739497 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 09:34:06 crc kubenswrapper[4841]: E0313 09:34:06.740242 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="867ddec1-6a02-4749-8cf0-259eff17fbd5" containerName="nova-scheduler-scheduler" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.740359 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="867ddec1-6a02-4749-8cf0-259eff17fbd5" containerName="nova-scheduler-scheduler" Mar 13 09:34:06 crc kubenswrapper[4841]: E0313 09:34:06.740416 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb9502e-cdc5-4a7a-8faa-fe060876b3f2" containerName="oc" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.740433 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb9502e-cdc5-4a7a-8faa-fe060876b3f2" containerName="oc" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.740800 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="acb9502e-cdc5-4a7a-8faa-fe060876b3f2" containerName="oc" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.740864 4841 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="867ddec1-6a02-4749-8cf0-259eff17fbd5" containerName="nova-scheduler-scheduler" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.741890 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.744346 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.754626 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.867168 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/903d509d-35b9-43f6-9e55-0da3e0d628cb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"903d509d-35b9-43f6-9e55-0da3e0d628cb\") " pod="openstack/nova-scheduler-0" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.867343 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwx7f\" (UniqueName: \"kubernetes.io/projected/903d509d-35b9-43f6-9e55-0da3e0d628cb-kube-api-access-hwx7f\") pod \"nova-scheduler-0\" (UID: \"903d509d-35b9-43f6-9e55-0da3e0d628cb\") " pod="openstack/nova-scheduler-0" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.867391 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/903d509d-35b9-43f6-9e55-0da3e0d628cb-config-data\") pod \"nova-scheduler-0\" (UID: \"903d509d-35b9-43f6-9e55-0da3e0d628cb\") " pod="openstack/nova-scheduler-0" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.968757 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/903d509d-35b9-43f6-9e55-0da3e0d628cb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"903d509d-35b9-43f6-9e55-0da3e0d628cb\") " pod="openstack/nova-scheduler-0" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.968922 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwx7f\" (UniqueName: \"kubernetes.io/projected/903d509d-35b9-43f6-9e55-0da3e0d628cb-kube-api-access-hwx7f\") pod \"nova-scheduler-0\" (UID: \"903d509d-35b9-43f6-9e55-0da3e0d628cb\") " pod="openstack/nova-scheduler-0" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.968974 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/903d509d-35b9-43f6-9e55-0da3e0d628cb-config-data\") pod \"nova-scheduler-0\" (UID: \"903d509d-35b9-43f6-9e55-0da3e0d628cb\") " pod="openstack/nova-scheduler-0" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.974218 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/903d509d-35b9-43f6-9e55-0da3e0d628cb-config-data\") pod \"nova-scheduler-0\" (UID: \"903d509d-35b9-43f6-9e55-0da3e0d628cb\") " pod="openstack/nova-scheduler-0" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.984861 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/903d509d-35b9-43f6-9e55-0da3e0d628cb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"903d509d-35b9-43f6-9e55-0da3e0d628cb\") " pod="openstack/nova-scheduler-0" Mar 13 09:34:06 crc kubenswrapper[4841]: I0313 09:34:06.987638 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwx7f\" (UniqueName: \"kubernetes.io/projected/903d509d-35b9-43f6-9e55-0da3e0d628cb-kube-api-access-hwx7f\") pod \"nova-scheduler-0\" (UID: \"903d509d-35b9-43f6-9e55-0da3e0d628cb\") " 
pod="openstack/nova-scheduler-0" Mar 13 09:34:07 crc kubenswrapper[4841]: E0313 09:34:07.020579 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7bb7ca8_61b5_429f_ae75_9fbcd13ff8c6.slice/crio-conmon-f705fdde438a54c3dacdf66858c95bdc1688755e39f1bf22626f1c3e8fce7515.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7bb7ca8_61b5_429f_ae75_9fbcd13ff8c6.slice/crio-f705fdde438a54c3dacdf66858c95bdc1688755e39f1bf22626f1c3e8fce7515.scope\": RecentStats: unable to find data in memory cache]" Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.065828 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.161441 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.273401 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-config-data\") pod \"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6\" (UID: \"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6\") " Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.273460 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2wlk\" (UniqueName: \"kubernetes.io/projected/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-kube-api-access-p2wlk\") pod \"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6\" (UID: \"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6\") " Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.273539 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-combined-ca-bundle\") pod \"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6\" (UID: \"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6\") " Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.273589 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-logs\") pod \"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6\" (UID: \"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6\") " Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.274225 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-logs" (OuterVolumeSpecName: "logs") pod "d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6" (UID: "d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.286567 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-kube-api-access-p2wlk" (OuterVolumeSpecName: "kube-api-access-p2wlk") pod "d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6" (UID: "d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6"). InnerVolumeSpecName "kube-api-access-p2wlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.306107 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6" (UID: "d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.309039 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-config-data" (OuterVolumeSpecName: "config-data") pod "d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6" (UID: "d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.375970 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.376016 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2wlk\" (UniqueName: \"kubernetes.io/projected/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-kube-api-access-p2wlk\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.376051 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.376063 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6-logs\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.398326 4841 generic.go:334] "Generic (PLEG): container finished" podID="d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6" containerID="f705fdde438a54c3dacdf66858c95bdc1688755e39f1bf22626f1c3e8fce7515" exitCode=0 Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.398371 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6","Type":"ContainerDied","Data":"f705fdde438a54c3dacdf66858c95bdc1688755e39f1bf22626f1c3e8fce7515"}
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.398400 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6","Type":"ContainerDied","Data":"ac496c414a94831c32345ae628dd0af1278b63529b13a9ee62120a32e396e75f"}
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.398420 4841 scope.go:117] "RemoveContainer" containerID="f705fdde438a54c3dacdf66858c95bdc1688755e39f1bf22626f1c3e8fce7515"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.398450 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.465175 4841 scope.go:117] "RemoveContainer" containerID="2a8b0283ff3e9725aef806b4c4fc6c97d933e5c4784cc189b6f3711683df2547"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.478985 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.490369 4841 scope.go:117] "RemoveContainer" containerID="f705fdde438a54c3dacdf66858c95bdc1688755e39f1bf22626f1c3e8fce7515"
Mar 13 09:34:07 crc kubenswrapper[4841]: E0313 09:34:07.491454 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f705fdde438a54c3dacdf66858c95bdc1688755e39f1bf22626f1c3e8fce7515\": container with ID starting with f705fdde438a54c3dacdf66858c95bdc1688755e39f1bf22626f1c3e8fce7515 not found: ID does not exist" containerID="f705fdde438a54c3dacdf66858c95bdc1688755e39f1bf22626f1c3e8fce7515"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.491529 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f705fdde438a54c3dacdf66858c95bdc1688755e39f1bf22626f1c3e8fce7515"} err="failed to get container status \"f705fdde438a54c3dacdf66858c95bdc1688755e39f1bf22626f1c3e8fce7515\": rpc error: code = NotFound desc = could not find container \"f705fdde438a54c3dacdf66858c95bdc1688755e39f1bf22626f1c3e8fce7515\": container with ID starting with f705fdde438a54c3dacdf66858c95bdc1688755e39f1bf22626f1c3e8fce7515 not found: ID does not exist"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.491586 4841 scope.go:117] "RemoveContainer" containerID="2a8b0283ff3e9725aef806b4c4fc6c97d933e5c4784cc189b6f3711683df2547"
Mar 13 09:34:07 crc kubenswrapper[4841]: E0313 09:34:07.491967 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a8b0283ff3e9725aef806b4c4fc6c97d933e5c4784cc189b6f3711683df2547\": container with ID starting with 2a8b0283ff3e9725aef806b4c4fc6c97d933e5c4784cc189b6f3711683df2547 not found: ID does not exist" containerID="2a8b0283ff3e9725aef806b4c4fc6c97d933e5c4784cc189b6f3711683df2547"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.492005 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a8b0283ff3e9725aef806b4c4fc6c97d933e5c4784cc189b6f3711683df2547"} err="failed to get container status \"2a8b0283ff3e9725aef806b4c4fc6c97d933e5c4784cc189b6f3711683df2547\": rpc error: code = NotFound desc = could not find container \"2a8b0283ff3e9725aef806b4c4fc6c97d933e5c4784cc189b6f3711683df2547\": container with ID starting with 2a8b0283ff3e9725aef806b4c4fc6c97d933e5c4784cc189b6f3711683df2547 not found: ID does not exist"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.496806 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.508948 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 13 09:34:07 crc kubenswrapper[4841]: E0313 09:34:07.509537 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6" containerName="nova-api-api"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.509558 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6" containerName="nova-api-api"
Mar 13 09:34:07 crc kubenswrapper[4841]: E0313 09:34:07.509575 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6" containerName="nova-api-log"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.509583 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6" containerName="nova-api-log"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.509844 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6" containerName="nova-api-log"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.509867 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6" containerName="nova-api-api"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.511091 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.516710 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.523494 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.571676 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.682346 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5c21ae9-3d60-42b9-9ad0-45490e889799-logs\") pod \"nova-api-0\" (UID: \"b5c21ae9-3d60-42b9-9ad0-45490e889799\") " pod="openstack/nova-api-0"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.682396 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5c21ae9-3d60-42b9-9ad0-45490e889799-config-data\") pod \"nova-api-0\" (UID: \"b5c21ae9-3d60-42b9-9ad0-45490e889799\") " pod="openstack/nova-api-0"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.682500 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnptr\" (UniqueName: \"kubernetes.io/projected/b5c21ae9-3d60-42b9-9ad0-45490e889799-kube-api-access-fnptr\") pod \"nova-api-0\" (UID: \"b5c21ae9-3d60-42b9-9ad0-45490e889799\") " pod="openstack/nova-api-0"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.682522 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5c21ae9-3d60-42b9-9ad0-45490e889799-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b5c21ae9-3d60-42b9-9ad0-45490e889799\") " pod="openstack/nova-api-0"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.784076 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnptr\" (UniqueName: \"kubernetes.io/projected/b5c21ae9-3d60-42b9-9ad0-45490e889799-kube-api-access-fnptr\") pod \"nova-api-0\" (UID: \"b5c21ae9-3d60-42b9-9ad0-45490e889799\") " pod="openstack/nova-api-0"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.784126 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5c21ae9-3d60-42b9-9ad0-45490e889799-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b5c21ae9-3d60-42b9-9ad0-45490e889799\") " pod="openstack/nova-api-0"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.784216 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5c21ae9-3d60-42b9-9ad0-45490e889799-logs\") pod \"nova-api-0\" (UID: \"b5c21ae9-3d60-42b9-9ad0-45490e889799\") " pod="openstack/nova-api-0"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.784249 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5c21ae9-3d60-42b9-9ad0-45490e889799-config-data\") pod \"nova-api-0\" (UID: \"b5c21ae9-3d60-42b9-9ad0-45490e889799\") " pod="openstack/nova-api-0"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.785013 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5c21ae9-3d60-42b9-9ad0-45490e889799-logs\") pod \"nova-api-0\" (UID: \"b5c21ae9-3d60-42b9-9ad0-45490e889799\") " pod="openstack/nova-api-0"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.787813 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5c21ae9-3d60-42b9-9ad0-45490e889799-config-data\") pod \"nova-api-0\" (UID: \"b5c21ae9-3d60-42b9-9ad0-45490e889799\") " pod="openstack/nova-api-0"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.790181 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5c21ae9-3d60-42b9-9ad0-45490e889799-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b5c21ae9-3d60-42b9-9ad0-45490e889799\") " pod="openstack/nova-api-0"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.800540 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnptr\" (UniqueName: \"kubernetes.io/projected/b5c21ae9-3d60-42b9-9ad0-45490e889799-kube-api-access-fnptr\") pod \"nova-api-0\" (UID: \"b5c21ae9-3d60-42b9-9ad0-45490e889799\") " pod="openstack/nova-api-0"
Mar 13 09:34:07 crc kubenswrapper[4841]: I0313 09:34:07.838744 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 13 09:34:09 crc kubenswrapper[4841]: I0313 09:34:08.012774 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="867ddec1-6a02-4749-8cf0-259eff17fbd5" path="/var/lib/kubelet/pods/867ddec1-6a02-4749-8cf0-259eff17fbd5/volumes"
Mar 13 09:34:09 crc kubenswrapper[4841]: I0313 09:34:08.013710 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aceaa228-acf8-4fad-b348-2f26e2225a80" path="/var/lib/kubelet/pods/aceaa228-acf8-4fad-b348-2f26e2225a80/volumes"
Mar 13 09:34:09 crc kubenswrapper[4841]: I0313 09:34:08.014347 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6" path="/var/lib/kubelet/pods/d7bb7ca8-61b5-429f-ae75-9fbcd13ff8c6/volumes"
Mar 13 09:34:09 crc kubenswrapper[4841]: I0313 09:34:08.197939 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 13 09:34:09 crc kubenswrapper[4841]: I0313 09:34:08.198425 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 13 09:34:09 crc kubenswrapper[4841]: I0313 09:34:08.395561 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 13 09:34:09 crc kubenswrapper[4841]: I0313 09:34:08.408430 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"903d509d-35b9-43f6-9e55-0da3e0d628cb","Type":"ContainerStarted","Data":"79f12ce641025e7db8a65f88336b638d0da83efa2e8fdd5d01bf8e8d366948dd"}
Mar 13 09:34:09 crc kubenswrapper[4841]: I0313 09:34:08.408475 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"903d509d-35b9-43f6-9e55-0da3e0d628cb","Type":"ContainerStarted","Data":"c5a27e4894c976e148727dce446bb05d552aec1a526b6f09d20ffc2a98cef5f0"}
Mar 13 09:34:09 crc kubenswrapper[4841]: I0313 09:34:08.494491 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.494473883 podStartE2EDuration="2.494473883s" podCreationTimestamp="2026-03-13 09:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:34:08.4821707 +0000 UTC m=+1331.212070891" watchObservedRunningTime="2026-03-13 09:34:08.494473883 +0000 UTC m=+1331.224374074"
Mar 13 09:34:09 crc kubenswrapper[4841]: I0313 09:34:09.370708 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 13 09:34:09 crc kubenswrapper[4841]: I0313 09:34:09.423209 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5c21ae9-3d60-42b9-9ad0-45490e889799","Type":"ContainerStarted","Data":"c8648a06c9be50812eb3c0ea184dc5021ce529ac0f6338b587cc361f5ac32b48"}
Mar 13 09:34:10 crc kubenswrapper[4841]: I0313 09:34:10.432666 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5c21ae9-3d60-42b9-9ad0-45490e889799","Type":"ContainerStarted","Data":"baecf9d54bc0c1abd9cdc73f3fb63e48a0284c58a3d6d8223ee59c70e07a4d89"}
Mar 13 09:34:10 crc kubenswrapper[4841]: I0313 09:34:10.433456 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5c21ae9-3d60-42b9-9ad0-45490e889799","Type":"ContainerStarted","Data":"c2fd43fb8aa09b3299822800c11453bdb427179b32c52d3bbd014b9197eb3903"}
Mar 13 09:34:10 crc kubenswrapper[4841]: I0313 09:34:10.458493 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.458472242 podStartE2EDuration="3.458472242s" podCreationTimestamp="2026-03-13 09:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:34:10.451116462 +0000 UTC m=+1333.181016663" watchObservedRunningTime="2026-03-13 09:34:10.458472242 +0000 UTC m=+1333.188372433"
Mar 13 09:34:11 crc kubenswrapper[4841]: I0313 09:34:11.871498 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 09:34:11 crc kubenswrapper[4841]: I0313 09:34:11.872001 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="a7d41010-f9ab-45b0-9d21-e05037f26651" containerName="kube-state-metrics" containerID="cri-o://3cad8f83bc789a196e75f2e23f0bb2a62806a9b90fc469a158b6c4565079b82f" gracePeriod=30
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.065920 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.330523 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.450850 4841 generic.go:334] "Generic (PLEG): container finished" podID="a7d41010-f9ab-45b0-9d21-e05037f26651" containerID="3cad8f83bc789a196e75f2e23f0bb2a62806a9b90fc469a158b6c4565079b82f" exitCode=2
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.450887 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a7d41010-f9ab-45b0-9d21-e05037f26651","Type":"ContainerDied","Data":"3cad8f83bc789a196e75f2e23f0bb2a62806a9b90fc469a158b6c4565079b82f"}
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.450911 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a7d41010-f9ab-45b0-9d21-e05037f26651","Type":"ContainerDied","Data":"97e0820920a2cf9844da5b1f3d09daca9fec0705dd1e7ff19cfebbe11fbb2cb0"}
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.450927 4841 scope.go:117] "RemoveContainer" containerID="3cad8f83bc789a196e75f2e23f0bb2a62806a9b90fc469a158b6c4565079b82f"
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.451037 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.477152 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmdb8\" (UniqueName: \"kubernetes.io/projected/a7d41010-f9ab-45b0-9d21-e05037f26651-kube-api-access-xmdb8\") pod \"a7d41010-f9ab-45b0-9d21-e05037f26651\" (UID: \"a7d41010-f9ab-45b0-9d21-e05037f26651\") "
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.478062 4841 scope.go:117] "RemoveContainer" containerID="3cad8f83bc789a196e75f2e23f0bb2a62806a9b90fc469a158b6c4565079b82f"
Mar 13 09:34:12 crc kubenswrapper[4841]: E0313 09:34:12.478908 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cad8f83bc789a196e75f2e23f0bb2a62806a9b90fc469a158b6c4565079b82f\": container with ID starting with 3cad8f83bc789a196e75f2e23f0bb2a62806a9b90fc469a158b6c4565079b82f not found: ID does not exist" containerID="3cad8f83bc789a196e75f2e23f0bb2a62806a9b90fc469a158b6c4565079b82f"
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.478956 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cad8f83bc789a196e75f2e23f0bb2a62806a9b90fc469a158b6c4565079b82f"} err="failed to get container status \"3cad8f83bc789a196e75f2e23f0bb2a62806a9b90fc469a158b6c4565079b82f\": rpc error: code = NotFound desc = could not find container \"3cad8f83bc789a196e75f2e23f0bb2a62806a9b90fc469a158b6c4565079b82f\": container with ID starting with 3cad8f83bc789a196e75f2e23f0bb2a62806a9b90fc469a158b6c4565079b82f not found: ID does not exist"
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.492561 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d41010-f9ab-45b0-9d21-e05037f26651-kube-api-access-xmdb8" (OuterVolumeSpecName: "kube-api-access-xmdb8") pod "a7d41010-f9ab-45b0-9d21-e05037f26651" (UID: "a7d41010-f9ab-45b0-9d21-e05037f26651"). InnerVolumeSpecName "kube-api-access-xmdb8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.579530 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmdb8\" (UniqueName: \"kubernetes.io/projected/a7d41010-f9ab-45b0-9d21-e05037f26651-kube-api-access-xmdb8\") on node \"crc\" DevicePath \"\""
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.791963 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.793737 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.802459 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.831870 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 09:34:12 crc kubenswrapper[4841]: E0313 09:34:12.832316 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d41010-f9ab-45b0-9d21-e05037f26651" containerName="kube-state-metrics"
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.832331 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d41010-f9ab-45b0-9d21-e05037f26651" containerName="kube-state-metrics"
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.832519 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d41010-f9ab-45b0-9d21-e05037f26651" containerName="kube-state-metrics"
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.833131 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.837000 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.837789 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.841430 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.988155 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8b082eb0-dc81-49f6-a313-07507e296c71-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8b082eb0-dc81-49f6-a313-07507e296c71\") " pod="openstack/kube-state-metrics-0"
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.989295 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkgkc\" (UniqueName: \"kubernetes.io/projected/8b082eb0-dc81-49f6-a313-07507e296c71-kube-api-access-rkgkc\") pod \"kube-state-metrics-0\" (UID: \"8b082eb0-dc81-49f6-a313-07507e296c71\") " pod="openstack/kube-state-metrics-0"
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.989463 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b082eb0-dc81-49f6-a313-07507e296c71-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8b082eb0-dc81-49f6-a313-07507e296c71\") " pod="openstack/kube-state-metrics-0"
Mar 13 09:34:12 crc kubenswrapper[4841]: I0313 09:34:12.989663 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b082eb0-dc81-49f6-a313-07507e296c71-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8b082eb0-dc81-49f6-a313-07507e296c71\") " pod="openstack/kube-state-metrics-0"
Mar 13 09:34:13 crc kubenswrapper[4841]: I0313 09:34:13.091574 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkgkc\" (UniqueName: \"kubernetes.io/projected/8b082eb0-dc81-49f6-a313-07507e296c71-kube-api-access-rkgkc\") pod \"kube-state-metrics-0\" (UID: \"8b082eb0-dc81-49f6-a313-07507e296c71\") " pod="openstack/kube-state-metrics-0"
Mar 13 09:34:13 crc kubenswrapper[4841]: I0313 09:34:13.091657 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b082eb0-dc81-49f6-a313-07507e296c71-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8b082eb0-dc81-49f6-a313-07507e296c71\") " pod="openstack/kube-state-metrics-0"
Mar 13 09:34:13 crc kubenswrapper[4841]: I0313 09:34:13.091740 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b082eb0-dc81-49f6-a313-07507e296c71-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8b082eb0-dc81-49f6-a313-07507e296c71\") " pod="openstack/kube-state-metrics-0"
Mar 13 09:34:13 crc kubenswrapper[4841]: I0313 09:34:13.091802 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8b082eb0-dc81-49f6-a313-07507e296c71-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8b082eb0-dc81-49f6-a313-07507e296c71\") " pod="openstack/kube-state-metrics-0"
Mar 13 09:34:13 crc kubenswrapper[4841]: I0313 09:34:13.096468 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b082eb0-dc81-49f6-a313-07507e296c71-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8b082eb0-dc81-49f6-a313-07507e296c71\") " pod="openstack/kube-state-metrics-0"
Mar 13 09:34:13 crc kubenswrapper[4841]: I0313 09:34:13.096749 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b082eb0-dc81-49f6-a313-07507e296c71-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8b082eb0-dc81-49f6-a313-07507e296c71\") " pod="openstack/kube-state-metrics-0"
Mar 13 09:34:13 crc kubenswrapper[4841]: I0313 09:34:13.097322 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8b082eb0-dc81-49f6-a313-07507e296c71-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8b082eb0-dc81-49f6-a313-07507e296c71\") " pod="openstack/kube-state-metrics-0"
Mar 13 09:34:13 crc kubenswrapper[4841]: I0313 09:34:13.111424 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkgkc\" (UniqueName: \"kubernetes.io/projected/8b082eb0-dc81-49f6-a313-07507e296c71-kube-api-access-rkgkc\") pod \"kube-state-metrics-0\" (UID: \"8b082eb0-dc81-49f6-a313-07507e296c71\") " pod="openstack/kube-state-metrics-0"
Mar 13 09:34:13 crc kubenswrapper[4841]: I0313 09:34:13.196047 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 13 09:34:13 crc kubenswrapper[4841]: I0313 09:34:13.196080 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 13 09:34:13 crc kubenswrapper[4841]: I0313 09:34:13.207252 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 09:34:13 crc kubenswrapper[4841]: I0313 09:34:13.680192 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 09:34:13 crc kubenswrapper[4841]: W0313 09:34:13.681470 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b082eb0_dc81_49f6_a313_07507e296c71.slice/crio-a07d914395a0ad1c707afd3877e50356f04aa747a1049a57f36d597b77235d07 WatchSource:0}: Error finding container a07d914395a0ad1c707afd3877e50356f04aa747a1049a57f36d597b77235d07: Status 404 returned error can't find the container with id a07d914395a0ad1c707afd3877e50356f04aa747a1049a57f36d597b77235d07
Mar 13 09:34:13 crc kubenswrapper[4841]: I0313 09:34:13.785313 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 09:34:13 crc kubenswrapper[4841]: I0313 09:34:13.785571 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fb595f57-6b23-4ac9-b25a-3d63159349ad" containerName="ceilometer-central-agent" containerID="cri-o://9bce5e1595158aa284fda9a35039119d9d367974e7f318e860bd9bbc3d94c1b0" gracePeriod=30
Mar 13 09:34:13 crc kubenswrapper[4841]: I0313 09:34:13.785683 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fb595f57-6b23-4ac9-b25a-3d63159349ad" containerName="sg-core" containerID="cri-o://928d5bedbb474d3290d23c97d2a184507b4b71f55edf64936973c86202a9d5a8" gracePeriod=30
Mar 13 09:34:13 crc kubenswrapper[4841]: I0313 09:34:13.785641 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fb595f57-6b23-4ac9-b25a-3d63159349ad" containerName="proxy-httpd" containerID="cri-o://e9ec10c9fb528c6843eca05c975eccb635bba35d0ea2b36da5197939ca9d0f98" gracePeriod=30
Mar 13 09:34:13 crc kubenswrapper[4841]: I0313 09:34:13.785721 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fb595f57-6b23-4ac9-b25a-3d63159349ad" containerName="ceilometer-notification-agent" containerID="cri-o://32cb874977afd54dfe0c446521e716f3e45a3cd182c579fdb9d7ed0696e369f6" gracePeriod=30
Mar 13 09:34:14 crc kubenswrapper[4841]: I0313 09:34:14.010745 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7d41010-f9ab-45b0-9d21-e05037f26651" path="/var/lib/kubelet/pods/a7d41010-f9ab-45b0-9d21-e05037f26651/volumes"
Mar 13 09:34:14 crc kubenswrapper[4841]: I0313 09:34:14.210436 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 09:34:14 crc kubenswrapper[4841]: I0313 09:34:14.210437 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 09:34:14 crc kubenswrapper[4841]: I0313 09:34:14.480904 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8b082eb0-dc81-49f6-a313-07507e296c71","Type":"ContainerStarted","Data":"e08e67b43cfe03c2e573247f435483aff272d3e6b625dd05f818ba0f1cb1323f"}
Mar 13 09:34:14 crc kubenswrapper[4841]: I0313 09:34:14.481314 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8b082eb0-dc81-49f6-a313-07507e296c71","Type":"ContainerStarted","Data":"a07d914395a0ad1c707afd3877e50356f04aa747a1049a57f36d597b77235d07"}
Mar 13 09:34:14 crc kubenswrapper[4841]: I0313 09:34:14.483469 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 13 09:34:14 crc kubenswrapper[4841]: I0313 09:34:14.488879 4841 generic.go:334] "Generic (PLEG): container finished" podID="fb595f57-6b23-4ac9-b25a-3d63159349ad" containerID="e9ec10c9fb528c6843eca05c975eccb635bba35d0ea2b36da5197939ca9d0f98" exitCode=0
Mar 13 09:34:14 crc kubenswrapper[4841]: I0313 09:34:14.488926 4841 generic.go:334] "Generic (PLEG): container finished" podID="fb595f57-6b23-4ac9-b25a-3d63159349ad" containerID="928d5bedbb474d3290d23c97d2a184507b4b71f55edf64936973c86202a9d5a8" exitCode=2
Mar 13 09:34:14 crc kubenswrapper[4841]: I0313 09:34:14.488937 4841 generic.go:334] "Generic (PLEG): container finished" podID="fb595f57-6b23-4ac9-b25a-3d63159349ad" containerID="9bce5e1595158aa284fda9a35039119d9d367974e7f318e860bd9bbc3d94c1b0" exitCode=0
Mar 13 09:34:14 crc kubenswrapper[4841]: I0313 09:34:14.488963 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb595f57-6b23-4ac9-b25a-3d63159349ad","Type":"ContainerDied","Data":"e9ec10c9fb528c6843eca05c975eccb635bba35d0ea2b36da5197939ca9d0f98"}
Mar 13 09:34:14 crc kubenswrapper[4841]: I0313 09:34:14.488987 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb595f57-6b23-4ac9-b25a-3d63159349ad","Type":"ContainerDied","Data":"928d5bedbb474d3290d23c97d2a184507b4b71f55edf64936973c86202a9d5a8"}
Mar 13 09:34:14 crc kubenswrapper[4841]: I0313 09:34:14.488998 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb595f57-6b23-4ac9-b25a-3d63159349ad","Type":"ContainerDied","Data":"9bce5e1595158aa284fda9a35039119d9d367974e7f318e860bd9bbc3d94c1b0"}
Mar 13 09:34:14 crc kubenswrapper[4841]: I0313 09:34:14.501702 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.999206355 podStartE2EDuration="2.501687172s" podCreationTimestamp="2026-03-13 09:34:12 +0000 UTC" firstStartedPulling="2026-03-13 09:34:13.683390091 +0000 UTC m=+1336.413290282" lastFinishedPulling="2026-03-13 09:34:14.185870908 +0000 UTC m=+1336.915771099" observedRunningTime="2026-03-13 09:34:14.499826384 +0000 UTC m=+1337.229726605" watchObservedRunningTime="2026-03-13 09:34:14.501687172 +0000 UTC m=+1337.231587363"
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.512562 4841 generic.go:334] "Generic (PLEG): container finished" podID="fb595f57-6b23-4ac9-b25a-3d63159349ad" containerID="32cb874977afd54dfe0c446521e716f3e45a3cd182c579fdb9d7ed0696e369f6" exitCode=0
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.512666 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb595f57-6b23-4ac9-b25a-3d63159349ad","Type":"ContainerDied","Data":"32cb874977afd54dfe0c446521e716f3e45a3cd182c579fdb9d7ed0696e369f6"}
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.705299 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.859956 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-scripts\") pod \"fb595f57-6b23-4ac9-b25a-3d63159349ad\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") "
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.860056 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-sg-core-conf-yaml\") pod \"fb595f57-6b23-4ac9-b25a-3d63159349ad\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") "
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.860136 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6pj8\" (UniqueName: \"kubernetes.io/projected/fb595f57-6b23-4ac9-b25a-3d63159349ad-kube-api-access-b6pj8\") pod \"fb595f57-6b23-4ac9-b25a-3d63159349ad\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") "
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.860183 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-combined-ca-bundle\") pod \"fb595f57-6b23-4ac9-b25a-3d63159349ad\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") "
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.860321 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb595f57-6b23-4ac9-b25a-3d63159349ad-run-httpd\") pod \"fb595f57-6b23-4ac9-b25a-3d63159349ad\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") "
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.861253 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb595f57-6b23-4ac9-b25a-3d63159349ad-log-httpd\") pod \"fb595f57-6b23-4ac9-b25a-3d63159349ad\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") "
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.861370 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-config-data\") pod \"fb595f57-6b23-4ac9-b25a-3d63159349ad\" (UID: \"fb595f57-6b23-4ac9-b25a-3d63159349ad\") "
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.861512 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb595f57-6b23-4ac9-b25a-3d63159349ad-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fb595f57-6b23-4ac9-b25a-3d63159349ad" (UID: "fb595f57-6b23-4ac9-b25a-3d63159349ad"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.861831 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb595f57-6b23-4ac9-b25a-3d63159349ad-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fb595f57-6b23-4ac9-b25a-3d63159349ad" (UID: "fb595f57-6b23-4ac9-b25a-3d63159349ad"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.863436 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb595f57-6b23-4ac9-b25a-3d63159349ad-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.863550 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fb595f57-6b23-4ac9-b25a-3d63159349ad-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.866074 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb595f57-6b23-4ac9-b25a-3d63159349ad-kube-api-access-b6pj8" (OuterVolumeSpecName: "kube-api-access-b6pj8") pod "fb595f57-6b23-4ac9-b25a-3d63159349ad" (UID: "fb595f57-6b23-4ac9-b25a-3d63159349ad"). InnerVolumeSpecName "kube-api-access-b6pj8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.870844 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-scripts" (OuterVolumeSpecName: "scripts") pod "fb595f57-6b23-4ac9-b25a-3d63159349ad" (UID: "fb595f57-6b23-4ac9-b25a-3d63159349ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.890879 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fb595f57-6b23-4ac9-b25a-3d63159349ad" (UID: "fb595f57-6b23-4ac9-b25a-3d63159349ad"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.941462 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb595f57-6b23-4ac9-b25a-3d63159349ad" (UID: "fb595f57-6b23-4ac9-b25a-3d63159349ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.963497 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-config-data" (OuterVolumeSpecName: "config-data") pod "fb595f57-6b23-4ac9-b25a-3d63159349ad" (UID: "fb595f57-6b23-4ac9-b25a-3d63159349ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.965940 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.966064 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6pj8\" (UniqueName: \"kubernetes.io/projected/fb595f57-6b23-4ac9-b25a-3d63159349ad-kube-api-access-b6pj8\") on node \"crc\" DevicePath \"\""
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.966133 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.966190 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-config-data\") on node \"crc\"
DevicePath \"\"" Mar 13 09:34:16 crc kubenswrapper[4841]: I0313 09:34:16.966239 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb595f57-6b23-4ac9-b25a-3d63159349ad-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.066041 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.109773 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.526460 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fb595f57-6b23-4ac9-b25a-3d63159349ad","Type":"ContainerDied","Data":"74f52791398ce0a76bb374f4247b414c40dd8fe699543141c3002bdfe23920ac"} Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.526569 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.526824 4841 scope.go:117] "RemoveContainer" containerID="e9ec10c9fb528c6843eca05c975eccb635bba35d0ea2b36da5197939ca9d0f98" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.556932 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.572081 4841 scope.go:117] "RemoveContainer" containerID="928d5bedbb474d3290d23c97d2a184507b4b71f55edf64936973c86202a9d5a8" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.586962 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.605458 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.612550 4841 scope.go:117] "RemoveContainer" containerID="32cb874977afd54dfe0c446521e716f3e45a3cd182c579fdb9d7ed0696e369f6" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.645000 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:34:17 crc kubenswrapper[4841]: E0313 09:34:17.645818 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb595f57-6b23-4ac9-b25a-3d63159349ad" containerName="ceilometer-notification-agent" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.645863 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb595f57-6b23-4ac9-b25a-3d63159349ad" containerName="ceilometer-notification-agent" Mar 13 09:34:17 crc kubenswrapper[4841]: E0313 09:34:17.645875 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb595f57-6b23-4ac9-b25a-3d63159349ad" containerName="proxy-httpd" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.645881 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb595f57-6b23-4ac9-b25a-3d63159349ad" 
containerName="proxy-httpd" Mar 13 09:34:17 crc kubenswrapper[4841]: E0313 09:34:17.645892 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb595f57-6b23-4ac9-b25a-3d63159349ad" containerName="ceilometer-central-agent" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.645898 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb595f57-6b23-4ac9-b25a-3d63159349ad" containerName="ceilometer-central-agent" Mar 13 09:34:17 crc kubenswrapper[4841]: E0313 09:34:17.645931 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb595f57-6b23-4ac9-b25a-3d63159349ad" containerName="sg-core" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.645939 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb595f57-6b23-4ac9-b25a-3d63159349ad" containerName="sg-core" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.646463 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb595f57-6b23-4ac9-b25a-3d63159349ad" containerName="proxy-httpd" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.646477 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb595f57-6b23-4ac9-b25a-3d63159349ad" containerName="sg-core" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.646493 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb595f57-6b23-4ac9-b25a-3d63159349ad" containerName="ceilometer-notification-agent" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.646626 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb595f57-6b23-4ac9-b25a-3d63159349ad" containerName="ceilometer-central-agent" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.651419 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.652437 4841 scope.go:117] "RemoveContainer" containerID="9bce5e1595158aa284fda9a35039119d9d367974e7f318e860bd9bbc3d94c1b0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.654366 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.654744 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.654988 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.655056 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.779691 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c87fd14-3507-4299-8c83-f841627e52af-log-httpd\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.779831 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpx6v\" (UniqueName: \"kubernetes.io/projected/7c87fd14-3507-4299-8c83-f841627e52af-kube-api-access-fpx6v\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.779859 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " 
pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.779916 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-config-data\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.779954 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-scripts\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.779979 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c87fd14-3507-4299-8c83-f841627e52af-run-httpd\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.780002 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.780038 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.840345 4841 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.840413 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.882033 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.882157 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c87fd14-3507-4299-8c83-f841627e52af-log-httpd\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.882341 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpx6v\" (UniqueName: \"kubernetes.io/projected/7c87fd14-3507-4299-8c83-f841627e52af-kube-api-access-fpx6v\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.882383 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.882657 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-config-data\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " 
pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.882719 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-scripts\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.882760 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c87fd14-3507-4299-8c83-f841627e52af-run-httpd\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.882794 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.883961 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c87fd14-3507-4299-8c83-f841627e52af-run-httpd\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.883970 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c87fd14-3507-4299-8c83-f841627e52af-log-httpd\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.887714 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.888017 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.889675 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-scripts\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.895808 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.906178 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-config-data\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " pod="openstack/ceilometer-0" Mar 13 09:34:17 crc kubenswrapper[4841]: I0313 09:34:17.906401 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpx6v\" (UniqueName: \"kubernetes.io/projected/7c87fd14-3507-4299-8c83-f841627e52af-kube-api-access-fpx6v\") pod \"ceilometer-0\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " pod="openstack/ceilometer-0" Mar 13 09:34:18 crc kubenswrapper[4841]: I0313 09:34:18.001040 4841 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:34:18 crc kubenswrapper[4841]: I0313 09:34:18.010035 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb595f57-6b23-4ac9-b25a-3d63159349ad" path="/var/lib/kubelet/pods/fb595f57-6b23-4ac9-b25a-3d63159349ad/volumes" Mar 13 09:34:18 crc kubenswrapper[4841]: I0313 09:34:18.492851 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:34:18 crc kubenswrapper[4841]: I0313 09:34:18.540932 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c87fd14-3507-4299-8c83-f841627e52af","Type":"ContainerStarted","Data":"0a15e845923b76e2f40b57888b821f35529e43f1b9276ade78cbe1011c2ff643"} Mar 13 09:34:18 crc kubenswrapper[4841]: I0313 09:34:18.925539 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b5c21ae9-3d60-42b9-9ad0-45490e889799" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 09:34:18 crc kubenswrapper[4841]: I0313 09:34:18.926025 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b5c21ae9-3d60-42b9-9ad0-45490e889799" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 09:34:19 crc kubenswrapper[4841]: I0313 09:34:19.555082 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c87fd14-3507-4299-8c83-f841627e52af","Type":"ContainerStarted","Data":"c80e57a9f72af197ce618cdd09408742cdd0d0bc11ca8bbdb1077e17f73e3256"} Mar 13 09:34:20 crc kubenswrapper[4841]: I0313 09:34:20.564471 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7c87fd14-3507-4299-8c83-f841627e52af","Type":"ContainerStarted","Data":"c25631e9cb1d7cc1e0ef8d3e7f53e16997ffefd1887386309a42fb27b77c1569"} Mar 13 09:34:21 crc kubenswrapper[4841]: I0313 09:34:21.577649 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c87fd14-3507-4299-8c83-f841627e52af","Type":"ContainerStarted","Data":"fa61280360ddc89853ac07984bd8a28c6bd42f5052b5659bcdf6def0453be152"} Mar 13 09:34:22 crc kubenswrapper[4841]: I0313 09:34:22.587775 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c87fd14-3507-4299-8c83-f841627e52af","Type":"ContainerStarted","Data":"6fb3c1fe4372508bd1a9db6c87a6ba75befff06af3475756fa4489a5b2ebb162"} Mar 13 09:34:22 crc kubenswrapper[4841]: I0313 09:34:22.588087 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 09:34:23 crc kubenswrapper[4841]: I0313 09:34:23.204029 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 09:34:23 crc kubenswrapper[4841]: I0313 09:34:23.208788 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 09:34:23 crc kubenswrapper[4841]: I0313 09:34:23.214816 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 09:34:23 crc kubenswrapper[4841]: I0313 09:34:23.228829 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 13 09:34:23 crc kubenswrapper[4841]: I0313 09:34:23.229532 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.451976835 podStartE2EDuration="6.229507865s" podCreationTimestamp="2026-03-13 09:34:17 +0000 UTC" firstStartedPulling="2026-03-13 09:34:18.491397784 +0000 UTC m=+1341.221298015" lastFinishedPulling="2026-03-13 
09:34:22.268928854 +0000 UTC m=+1344.998829045" observedRunningTime="2026-03-13 09:34:22.615016133 +0000 UTC m=+1345.344916324" watchObservedRunningTime="2026-03-13 09:34:23.229507865 +0000 UTC m=+1345.959408056" Mar 13 09:34:23 crc kubenswrapper[4841]: I0313 09:34:23.604013 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 09:34:25 crc kubenswrapper[4841]: I0313 09:34:25.586234 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:25 crc kubenswrapper[4841]: I0313 09:34:25.646180 4841 generic.go:334] "Generic (PLEG): container finished" podID="d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5" containerID="9de0d5f74df1552a5305df383535bca63ffd61287636b5abf7bbc907ef6f4ff4" exitCode=137 Mar 13 09:34:25 crc kubenswrapper[4841]: I0313 09:34:25.646295 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5","Type":"ContainerDied","Data":"9de0d5f74df1552a5305df383535bca63ffd61287636b5abf7bbc907ef6f4ff4"} Mar 13 09:34:25 crc kubenswrapper[4841]: I0313 09:34:25.646337 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5","Type":"ContainerDied","Data":"ac1c142759006713d3b3f9f8410508cb5772666398f5948394a1e48fa17e43fb"} Mar 13 09:34:25 crc kubenswrapper[4841]: I0313 09:34:25.646343 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:25 crc kubenswrapper[4841]: I0313 09:34:25.646359 4841 scope.go:117] "RemoveContainer" containerID="9de0d5f74df1552a5305df383535bca63ffd61287636b5abf7bbc907ef6f4ff4" Mar 13 09:34:25 crc kubenswrapper[4841]: I0313 09:34:25.670700 4841 scope.go:117] "RemoveContainer" containerID="9de0d5f74df1552a5305df383535bca63ffd61287636b5abf7bbc907ef6f4ff4" Mar 13 09:34:25 crc kubenswrapper[4841]: E0313 09:34:25.671997 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9de0d5f74df1552a5305df383535bca63ffd61287636b5abf7bbc907ef6f4ff4\": container with ID starting with 9de0d5f74df1552a5305df383535bca63ffd61287636b5abf7bbc907ef6f4ff4 not found: ID does not exist" containerID="9de0d5f74df1552a5305df383535bca63ffd61287636b5abf7bbc907ef6f4ff4" Mar 13 09:34:25 crc kubenswrapper[4841]: I0313 09:34:25.672036 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9de0d5f74df1552a5305df383535bca63ffd61287636b5abf7bbc907ef6f4ff4"} err="failed to get container status \"9de0d5f74df1552a5305df383535bca63ffd61287636b5abf7bbc907ef6f4ff4\": rpc error: code = NotFound desc = could not find container \"9de0d5f74df1552a5305df383535bca63ffd61287636b5abf7bbc907ef6f4ff4\": container with ID starting with 9de0d5f74df1552a5305df383535bca63ffd61287636b5abf7bbc907ef6f4ff4 not found: ID does not exist" Mar 13 09:34:25 crc kubenswrapper[4841]: I0313 09:34:25.735833 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5-config-data\") pod \"d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5\" (UID: \"d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5\") " Mar 13 09:34:25 crc kubenswrapper[4841]: I0313 09:34:25.735895 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkgsf\" 
(UniqueName: \"kubernetes.io/projected/d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5-kube-api-access-bkgsf\") pod \"d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5\" (UID: \"d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5\") " Mar 13 09:34:25 crc kubenswrapper[4841]: I0313 09:34:25.735942 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5-combined-ca-bundle\") pod \"d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5\" (UID: \"d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5\") " Mar 13 09:34:25 crc kubenswrapper[4841]: I0313 09:34:25.747517 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5-kube-api-access-bkgsf" (OuterVolumeSpecName: "kube-api-access-bkgsf") pod "d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5" (UID: "d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5"). InnerVolumeSpecName "kube-api-access-bkgsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:34:25 crc kubenswrapper[4841]: I0313 09:34:25.775486 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5" (UID: "d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:25 crc kubenswrapper[4841]: I0313 09:34:25.775937 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5-config-data" (OuterVolumeSpecName: "config-data") pod "d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5" (UID: "d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:25 crc kubenswrapper[4841]: I0313 09:34:25.839810 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:25 crc kubenswrapper[4841]: I0313 09:34:25.839856 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkgsf\" (UniqueName: \"kubernetes.io/projected/d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5-kube-api-access-bkgsf\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:25 crc kubenswrapper[4841]: I0313 09:34:25.839867 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:25 crc kubenswrapper[4841]: I0313 09:34:25.985861 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.007363 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.014195 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 09:34:26 crc kubenswrapper[4841]: E0313 09:34:26.015238 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.015343 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.015716 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 
09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.016979 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.020652 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.020688 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.021088 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.038426 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.156520 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1840df1-8c0f-4038-9389-eaf2bcc61705-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1840df1-8c0f-4038-9389-eaf2bcc61705\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.156852 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbtlp\" (UniqueName: \"kubernetes.io/projected/c1840df1-8c0f-4038-9389-eaf2bcc61705-kube-api-access-bbtlp\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1840df1-8c0f-4038-9389-eaf2bcc61705\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.156980 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1840df1-8c0f-4038-9389-eaf2bcc61705-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"c1840df1-8c0f-4038-9389-eaf2bcc61705\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.157148 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1840df1-8c0f-4038-9389-eaf2bcc61705-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1840df1-8c0f-4038-9389-eaf2bcc61705\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.157216 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1840df1-8c0f-4038-9389-eaf2bcc61705-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1840df1-8c0f-4038-9389-eaf2bcc61705\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.260513 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbtlp\" (UniqueName: \"kubernetes.io/projected/c1840df1-8c0f-4038-9389-eaf2bcc61705-kube-api-access-bbtlp\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1840df1-8c0f-4038-9389-eaf2bcc61705\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.261100 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1840df1-8c0f-4038-9389-eaf2bcc61705-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1840df1-8c0f-4038-9389-eaf2bcc61705\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.261371 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1840df1-8c0f-4038-9389-eaf2bcc61705-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"c1840df1-8c0f-4038-9389-eaf2bcc61705\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.261507 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1840df1-8c0f-4038-9389-eaf2bcc61705-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1840df1-8c0f-4038-9389-eaf2bcc61705\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.261726 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1840df1-8c0f-4038-9389-eaf2bcc61705-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1840df1-8c0f-4038-9389-eaf2bcc61705\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.266030 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1840df1-8c0f-4038-9389-eaf2bcc61705-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1840df1-8c0f-4038-9389-eaf2bcc61705\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.266038 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1840df1-8c0f-4038-9389-eaf2bcc61705-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1840df1-8c0f-4038-9389-eaf2bcc61705\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.267230 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1840df1-8c0f-4038-9389-eaf2bcc61705-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1840df1-8c0f-4038-9389-eaf2bcc61705\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:26 crc 
kubenswrapper[4841]: I0313 09:34:26.267772 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1840df1-8c0f-4038-9389-eaf2bcc61705-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1840df1-8c0f-4038-9389-eaf2bcc61705\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.280146 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbtlp\" (UniqueName: \"kubernetes.io/projected/c1840df1-8c0f-4038-9389-eaf2bcc61705-kube-api-access-bbtlp\") pod \"nova-cell1-novncproxy-0\" (UID: \"c1840df1-8c0f-4038-9389-eaf2bcc61705\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.351160 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:26 crc kubenswrapper[4841]: I0313 09:34:26.830449 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 09:34:26 crc kubenswrapper[4841]: W0313 09:34:26.841919 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1840df1_8c0f_4038_9389_eaf2bcc61705.slice/crio-3f6593a2dae07b4c419b89292b1ee6f837fe1e17cf3451faaa9bbcb676290e0d WatchSource:0}: Error finding container 3f6593a2dae07b4c419b89292b1ee6f837fe1e17cf3451faaa9bbcb676290e0d: Status 404 returned error can't find the container with id 3f6593a2dae07b4c419b89292b1ee6f837fe1e17cf3451faaa9bbcb676290e0d Mar 13 09:34:27 crc kubenswrapper[4841]: I0313 09:34:27.672603 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c1840df1-8c0f-4038-9389-eaf2bcc61705","Type":"ContainerStarted","Data":"5f8f46e789327f959863c9d33f8c118825d2e0b151dfb46b64c929d21a5f2a9b"} Mar 13 09:34:27 crc kubenswrapper[4841]: I0313 
09:34:27.673836 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c1840df1-8c0f-4038-9389-eaf2bcc61705","Type":"ContainerStarted","Data":"3f6593a2dae07b4c419b89292b1ee6f837fe1e17cf3451faaa9bbcb676290e0d"} Mar 13 09:34:27 crc kubenswrapper[4841]: I0313 09:34:27.702577 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.7025503459999998 podStartE2EDuration="2.702550346s" podCreationTimestamp="2026-03-13 09:34:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:34:27.696761525 +0000 UTC m=+1350.426661716" watchObservedRunningTime="2026-03-13 09:34:27.702550346 +0000 UTC m=+1350.432450567" Mar 13 09:34:27 crc kubenswrapper[4841]: I0313 09:34:27.843016 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 09:34:27 crc kubenswrapper[4841]: I0313 09:34:27.843779 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 09:34:27 crc kubenswrapper[4841]: I0313 09:34:27.844651 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 09:34:27 crc kubenswrapper[4841]: I0313 09:34:27.848181 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 09:34:28 crc kubenswrapper[4841]: I0313 09:34:28.007517 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5" path="/var/lib/kubelet/pods/d5dc3bd3-589d-43e4-bd5b-3c4c4815d5d5/volumes" Mar 13 09:34:28 crc kubenswrapper[4841]: I0313 09:34:28.691462 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 09:34:28 crc kubenswrapper[4841]: I0313 09:34:28.697526 4841 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 09:34:28 crc kubenswrapper[4841]: I0313 09:34:28.910163 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7"] Mar 13 09:34:28 crc kubenswrapper[4841]: I0313 09:34:28.912091 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:34:28 crc kubenswrapper[4841]: I0313 09:34:28.928503 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7"] Mar 13 09:34:29 crc kubenswrapper[4841]: I0313 09:34:29.036710 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bnmq\" (UniqueName: \"kubernetes.io/projected/f0ee1a9a-154c-4c27-b964-94a9a13761e6-kube-api-access-8bnmq\") pod \"dnsmasq-dns-6b7bbf7cf9-m6xj7\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:34:29 crc kubenswrapper[4841]: I0313 09:34:29.037028 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-m6xj7\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:34:29 crc kubenswrapper[4841]: I0313 09:34:29.037057 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-config\") pod \"dnsmasq-dns-6b7bbf7cf9-m6xj7\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:34:29 crc kubenswrapper[4841]: I0313 09:34:29.037115 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-m6xj7\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:34:29 crc kubenswrapper[4841]: I0313 09:34:29.037182 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-m6xj7\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:34:29 crc kubenswrapper[4841]: I0313 09:34:29.037203 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-m6xj7\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:34:29 crc kubenswrapper[4841]: I0313 09:34:29.139805 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-m6xj7\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:34:29 crc kubenswrapper[4841]: I0313 09:34:29.140349 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-m6xj7\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:34:29 crc kubenswrapper[4841]: I0313 09:34:29.140431 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-m6xj7\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:34:29 crc kubenswrapper[4841]: I0313 09:34:29.140605 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bnmq\" (UniqueName: \"kubernetes.io/projected/f0ee1a9a-154c-4c27-b964-94a9a13761e6-kube-api-access-8bnmq\") pod \"dnsmasq-dns-6b7bbf7cf9-m6xj7\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:34:29 crc kubenswrapper[4841]: I0313 09:34:29.140725 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-m6xj7\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:34:29 crc kubenswrapper[4841]: I0313 09:34:29.140751 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-m6xj7\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:34:29 crc kubenswrapper[4841]: I0313 09:34:29.140783 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-config\") pod \"dnsmasq-dns-6b7bbf7cf9-m6xj7\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:34:29 crc kubenswrapper[4841]: I0313 09:34:29.141014 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-m6xj7\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:34:29 crc kubenswrapper[4841]: I0313 09:34:29.141955 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-config\") pod \"dnsmasq-dns-6b7bbf7cf9-m6xj7\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:34:29 crc kubenswrapper[4841]: I0313 09:34:29.142975 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-m6xj7\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:34:29 crc kubenswrapper[4841]: I0313 09:34:29.142992 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-m6xj7\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:34:29 crc kubenswrapper[4841]: I0313 09:34:29.163362 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bnmq\" (UniqueName: \"kubernetes.io/projected/f0ee1a9a-154c-4c27-b964-94a9a13761e6-kube-api-access-8bnmq\") pod \"dnsmasq-dns-6b7bbf7cf9-m6xj7\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:34:29 crc kubenswrapper[4841]: I0313 09:34:29.241724 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:34:29 crc kubenswrapper[4841]: I0313 09:34:29.769028 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7"] Mar 13 09:34:29 crc kubenswrapper[4841]: W0313 09:34:29.774434 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0ee1a9a_154c_4c27_b964_94a9a13761e6.slice/crio-9d7ce585160cd440cbabcbe12d43876f016a0da5ee659a031293e9a2e5298dc3 WatchSource:0}: Error finding container 9d7ce585160cd440cbabcbe12d43876f016a0da5ee659a031293e9a2e5298dc3: Status 404 returned error can't find the container with id 9d7ce585160cd440cbabcbe12d43876f016a0da5ee659a031293e9a2e5298dc3 Mar 13 09:34:30 crc kubenswrapper[4841]: I0313 09:34:30.707831 4841 generic.go:334] "Generic (PLEG): container finished" podID="f0ee1a9a-154c-4c27-b964-94a9a13761e6" containerID="ed3d5bc50518252c2ee01df8344e1617eff8e0bb9a7ec2dd76286af196a5d4c8" exitCode=0 Mar 13 09:34:30 crc kubenswrapper[4841]: I0313 09:34:30.708302 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" event={"ID":"f0ee1a9a-154c-4c27-b964-94a9a13761e6","Type":"ContainerDied","Data":"ed3d5bc50518252c2ee01df8344e1617eff8e0bb9a7ec2dd76286af196a5d4c8"} Mar 13 09:34:30 crc kubenswrapper[4841]: I0313 09:34:30.708348 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" event={"ID":"f0ee1a9a-154c-4c27-b964-94a9a13761e6","Type":"ContainerStarted","Data":"9d7ce585160cd440cbabcbe12d43876f016a0da5ee659a031293e9a2e5298dc3"} Mar 13 09:34:30 crc kubenswrapper[4841]: I0313 09:34:30.907070 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 09:34:31 crc kubenswrapper[4841]: I0313 09:34:31.351906 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:31 crc 
kubenswrapper[4841]: I0313 09:34:31.504658 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:34:31 crc kubenswrapper[4841]: I0313 09:34:31.505320 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7c87fd14-3507-4299-8c83-f841627e52af" containerName="ceilometer-central-agent" containerID="cri-o://c80e57a9f72af197ce618cdd09408742cdd0d0bc11ca8bbdb1077e17f73e3256" gracePeriod=30 Mar 13 09:34:31 crc kubenswrapper[4841]: I0313 09:34:31.505815 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7c87fd14-3507-4299-8c83-f841627e52af" containerName="proxy-httpd" containerID="cri-o://6fb3c1fe4372508bd1a9db6c87a6ba75befff06af3475756fa4489a5b2ebb162" gracePeriod=30 Mar 13 09:34:31 crc kubenswrapper[4841]: I0313 09:34:31.505878 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7c87fd14-3507-4299-8c83-f841627e52af" containerName="sg-core" containerID="cri-o://fa61280360ddc89853ac07984bd8a28c6bd42f5052b5659bcdf6def0453be152" gracePeriod=30 Mar 13 09:34:31 crc kubenswrapper[4841]: I0313 09:34:31.505917 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7c87fd14-3507-4299-8c83-f841627e52af" containerName="ceilometer-notification-agent" containerID="cri-o://c25631e9cb1d7cc1e0ef8d3e7f53e16997ffefd1887386309a42fb27b77c1569" gracePeriod=30 Mar 13 09:34:31 crc kubenswrapper[4841]: I0313 09:34:31.722301 4841 generic.go:334] "Generic (PLEG): container finished" podID="7c87fd14-3507-4299-8c83-f841627e52af" containerID="6fb3c1fe4372508bd1a9db6c87a6ba75befff06af3475756fa4489a5b2ebb162" exitCode=0 Mar 13 09:34:31 crc kubenswrapper[4841]: I0313 09:34:31.722332 4841 generic.go:334] "Generic (PLEG): container finished" podID="7c87fd14-3507-4299-8c83-f841627e52af" 
containerID="fa61280360ddc89853ac07984bd8a28c6bd42f5052b5659bcdf6def0453be152" exitCode=2 Mar 13 09:34:31 crc kubenswrapper[4841]: I0313 09:34:31.722410 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c87fd14-3507-4299-8c83-f841627e52af","Type":"ContainerDied","Data":"6fb3c1fe4372508bd1a9db6c87a6ba75befff06af3475756fa4489a5b2ebb162"} Mar 13 09:34:31 crc kubenswrapper[4841]: I0313 09:34:31.722466 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c87fd14-3507-4299-8c83-f841627e52af","Type":"ContainerDied","Data":"fa61280360ddc89853ac07984bd8a28c6bd42f5052b5659bcdf6def0453be152"} Mar 13 09:34:31 crc kubenswrapper[4841]: I0313 09:34:31.726477 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" event={"ID":"f0ee1a9a-154c-4c27-b964-94a9a13761e6","Type":"ContainerStarted","Data":"02891e9fbf9cbcc1dc7c603e4f651eaa523df9e408523cfa4d91084e62782e9e"} Mar 13 09:34:31 crc kubenswrapper[4841]: I0313 09:34:31.726592 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:34:31 crc kubenswrapper[4841]: I0313 09:34:31.727071 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b5c21ae9-3d60-42b9-9ad0-45490e889799" containerName="nova-api-log" containerID="cri-o://c2fd43fb8aa09b3299822800c11453bdb427179b32c52d3bbd014b9197eb3903" gracePeriod=30 Mar 13 09:34:31 crc kubenswrapper[4841]: I0313 09:34:31.727243 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b5c21ae9-3d60-42b9-9ad0-45490e889799" containerName="nova-api-api" containerID="cri-o://baecf9d54bc0c1abd9cdc73f3fb63e48a0284c58a3d6d8223ee59c70e07a4d89" gracePeriod=30 Mar 13 09:34:31 crc kubenswrapper[4841]: I0313 09:34:31.751699 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" podStartSLOduration=3.751679471 podStartE2EDuration="3.751679471s" podCreationTimestamp="2026-03-13 09:34:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:34:31.743806806 +0000 UTC m=+1354.473707017" watchObservedRunningTime="2026-03-13 09:34:31.751679471 +0000 UTC m=+1354.481579662" Mar 13 09:34:32 crc kubenswrapper[4841]: I0313 09:34:32.761489 4841 generic.go:334] "Generic (PLEG): container finished" podID="b5c21ae9-3d60-42b9-9ad0-45490e889799" containerID="c2fd43fb8aa09b3299822800c11453bdb427179b32c52d3bbd014b9197eb3903" exitCode=143 Mar 13 09:34:32 crc kubenswrapper[4841]: I0313 09:34:32.761872 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5c21ae9-3d60-42b9-9ad0-45490e889799","Type":"ContainerDied","Data":"c2fd43fb8aa09b3299822800c11453bdb427179b32c52d3bbd014b9197eb3903"} Mar 13 09:34:32 crc kubenswrapper[4841]: I0313 09:34:32.771255 4841 generic.go:334] "Generic (PLEG): container finished" podID="7c87fd14-3507-4299-8c83-f841627e52af" containerID="c25631e9cb1d7cc1e0ef8d3e7f53e16997ffefd1887386309a42fb27b77c1569" exitCode=0 Mar 13 09:34:32 crc kubenswrapper[4841]: I0313 09:34:32.771293 4841 generic.go:334] "Generic (PLEG): container finished" podID="7c87fd14-3507-4299-8c83-f841627e52af" containerID="c80e57a9f72af197ce618cdd09408742cdd0d0bc11ca8bbdb1077e17f73e3256" exitCode=0 Mar 13 09:34:32 crc kubenswrapper[4841]: I0313 09:34:32.771860 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c87fd14-3507-4299-8c83-f841627e52af","Type":"ContainerDied","Data":"c25631e9cb1d7cc1e0ef8d3e7f53e16997ffefd1887386309a42fb27b77c1569"} Mar 13 09:34:32 crc kubenswrapper[4841]: I0313 09:34:32.771926 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7c87fd14-3507-4299-8c83-f841627e52af","Type":"ContainerDied","Data":"c80e57a9f72af197ce618cdd09408742cdd0d0bc11ca8bbdb1077e17f73e3256"} Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.099793 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.217167 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-ceilometer-tls-certs\") pod \"7c87fd14-3507-4299-8c83-f841627e52af\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.217244 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-sg-core-conf-yaml\") pod \"7c87fd14-3507-4299-8c83-f841627e52af\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.217291 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c87fd14-3507-4299-8c83-f841627e52af-log-httpd\") pod \"7c87fd14-3507-4299-8c83-f841627e52af\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.217352 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-scripts\") pod \"7c87fd14-3507-4299-8c83-f841627e52af\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.217465 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-config-data\") pod 
\"7c87fd14-3507-4299-8c83-f841627e52af\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.217551 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c87fd14-3507-4299-8c83-f841627e52af-run-httpd\") pod \"7c87fd14-3507-4299-8c83-f841627e52af\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.217579 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-combined-ca-bundle\") pod \"7c87fd14-3507-4299-8c83-f841627e52af\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.217617 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpx6v\" (UniqueName: \"kubernetes.io/projected/7c87fd14-3507-4299-8c83-f841627e52af-kube-api-access-fpx6v\") pod \"7c87fd14-3507-4299-8c83-f841627e52af\" (UID: \"7c87fd14-3507-4299-8c83-f841627e52af\") " Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.217859 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c87fd14-3507-4299-8c83-f841627e52af-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7c87fd14-3507-4299-8c83-f841627e52af" (UID: "7c87fd14-3507-4299-8c83-f841627e52af"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.217991 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c87fd14-3507-4299-8c83-f841627e52af-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7c87fd14-3507-4299-8c83-f841627e52af" (UID: "7c87fd14-3507-4299-8c83-f841627e52af"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.218461 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c87fd14-3507-4299-8c83-f841627e52af-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.218487 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c87fd14-3507-4299-8c83-f841627e52af-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.223462 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-scripts" (OuterVolumeSpecName: "scripts") pod "7c87fd14-3507-4299-8c83-f841627e52af" (UID: "7c87fd14-3507-4299-8c83-f841627e52af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.229428 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c87fd14-3507-4299-8c83-f841627e52af-kube-api-access-fpx6v" (OuterVolumeSpecName: "kube-api-access-fpx6v") pod "7c87fd14-3507-4299-8c83-f841627e52af" (UID: "7c87fd14-3507-4299-8c83-f841627e52af"). InnerVolumeSpecName "kube-api-access-fpx6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.255930 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7c87fd14-3507-4299-8c83-f841627e52af" (UID: "7c87fd14-3507-4299-8c83-f841627e52af"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.299828 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c87fd14-3507-4299-8c83-f841627e52af" (UID: "7c87fd14-3507-4299-8c83-f841627e52af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.305014 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7c87fd14-3507-4299-8c83-f841627e52af" (UID: "7c87fd14-3507-4299-8c83-f841627e52af"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.320395 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.320427 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.320442 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpx6v\" (UniqueName: \"kubernetes.io/projected/7c87fd14-3507-4299-8c83-f841627e52af-kube-api-access-fpx6v\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.320456 4841 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.320467 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.338547 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-config-data" (OuterVolumeSpecName: "config-data") pod "7c87fd14-3507-4299-8c83-f841627e52af" (UID: "7c87fd14-3507-4299-8c83-f841627e52af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.422210 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c87fd14-3507-4299-8c83-f841627e52af-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.782746 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7c87fd14-3507-4299-8c83-f841627e52af","Type":"ContainerDied","Data":"0a15e845923b76e2f40b57888b821f35529e43f1b9276ade78cbe1011c2ff643"} Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.782795 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.782801 4841 scope.go:117] "RemoveContainer" containerID="6fb3c1fe4372508bd1a9db6c87a6ba75befff06af3475756fa4489a5b2ebb162" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.806766 4841 scope.go:117] "RemoveContainer" containerID="fa61280360ddc89853ac07984bd8a28c6bd42f5052b5659bcdf6def0453be152" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.817427 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.827110 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.827141 4841 scope.go:117] "RemoveContainer" containerID="c25631e9cb1d7cc1e0ef8d3e7f53e16997ffefd1887386309a42fb27b77c1569" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.850624 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:34:33 crc kubenswrapper[4841]: E0313 09:34:33.851110 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c87fd14-3507-4299-8c83-f841627e52af" containerName="sg-core" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.851129 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c87fd14-3507-4299-8c83-f841627e52af" containerName="sg-core" Mar 13 09:34:33 crc kubenswrapper[4841]: E0313 09:34:33.851148 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c87fd14-3507-4299-8c83-f841627e52af" containerName="proxy-httpd" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.851157 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c87fd14-3507-4299-8c83-f841627e52af" containerName="proxy-httpd" Mar 13 09:34:33 crc kubenswrapper[4841]: E0313 09:34:33.851167 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c87fd14-3507-4299-8c83-f841627e52af" 
containerName="ceilometer-central-agent" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.851174 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c87fd14-3507-4299-8c83-f841627e52af" containerName="ceilometer-central-agent" Mar 13 09:34:33 crc kubenswrapper[4841]: E0313 09:34:33.851188 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c87fd14-3507-4299-8c83-f841627e52af" containerName="ceilometer-notification-agent" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.851196 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c87fd14-3507-4299-8c83-f841627e52af" containerName="ceilometer-notification-agent" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.852088 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c87fd14-3507-4299-8c83-f841627e52af" containerName="ceilometer-notification-agent" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.852114 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c87fd14-3507-4299-8c83-f841627e52af" containerName="proxy-httpd" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.852131 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c87fd14-3507-4299-8c83-f841627e52af" containerName="ceilometer-central-agent" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.852152 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c87fd14-3507-4299-8c83-f841627e52af" containerName="sg-core" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.854386 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.856731 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.856958 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.857143 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.870225 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:34:33 crc kubenswrapper[4841]: I0313 09:34:33.872671 4841 scope.go:117] "RemoveContainer" containerID="c80e57a9f72af197ce618cdd09408742cdd0d0bc11ca8bbdb1077e17f73e3256" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.005257 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c87fd14-3507-4299-8c83-f841627e52af" path="/var/lib/kubelet/pods/7c87fd14-3507-4299-8c83-f841627e52af/volumes" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.033922 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-run-httpd\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.033982 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-log-httpd\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.034074 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-scripts\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.034106 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.034146 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.034168 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.034196 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-config-data\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.034258 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z86j\" (UniqueName: 
\"kubernetes.io/projected/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-kube-api-access-5z86j\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.136444 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-run-httpd\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.136491 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-log-httpd\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.136562 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-scripts\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.136605 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.136636 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 
09:34:34.136655 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.136676 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-config-data\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.136742 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z86j\" (UniqueName: \"kubernetes.io/projected/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-kube-api-access-5z86j\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.137585 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-log-httpd\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.137615 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-run-httpd\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.141072 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-scripts\") pod \"ceilometer-0\" (UID: 
\"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.141141 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.141782 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.148485 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-config-data\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.151866 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.154561 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z86j\" (UniqueName: \"kubernetes.io/projected/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-kube-api-access-5z86j\") pod \"ceilometer-0\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.212880 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.224359 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.407259 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.407580 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:34:34 crc kubenswrapper[4841]: W0313 09:34:34.701466 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bbfc9f4_d3c9_4dd0_a242_a0e7b889b402.slice/crio-4019224d832d5f60ebbff563821fb5052b48ee454a7864e087deff89535d5a68 WatchSource:0}: Error finding container 4019224d832d5f60ebbff563821fb5052b48ee454a7864e087deff89535d5a68: Status 404 returned error can't find the container with id 4019224d832d5f60ebbff563821fb5052b48ee454a7864e087deff89535d5a68 Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.710526 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:34:34 crc kubenswrapper[4841]: I0313 09:34:34.793218 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402","Type":"ContainerStarted","Data":"4019224d832d5f60ebbff563821fb5052b48ee454a7864e087deff89535d5a68"} Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.307364 
4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.483614 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnptr\" (UniqueName: \"kubernetes.io/projected/b5c21ae9-3d60-42b9-9ad0-45490e889799-kube-api-access-fnptr\") pod \"b5c21ae9-3d60-42b9-9ad0-45490e889799\" (UID: \"b5c21ae9-3d60-42b9-9ad0-45490e889799\") " Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.484035 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5c21ae9-3d60-42b9-9ad0-45490e889799-logs\") pod \"b5c21ae9-3d60-42b9-9ad0-45490e889799\" (UID: \"b5c21ae9-3d60-42b9-9ad0-45490e889799\") " Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.484099 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5c21ae9-3d60-42b9-9ad0-45490e889799-config-data\") pod \"b5c21ae9-3d60-42b9-9ad0-45490e889799\" (UID: \"b5c21ae9-3d60-42b9-9ad0-45490e889799\") " Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.484131 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5c21ae9-3d60-42b9-9ad0-45490e889799-combined-ca-bundle\") pod \"b5c21ae9-3d60-42b9-9ad0-45490e889799\" (UID: \"b5c21ae9-3d60-42b9-9ad0-45490e889799\") " Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.484560 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5c21ae9-3d60-42b9-9ad0-45490e889799-logs" (OuterVolumeSpecName: "logs") pod "b5c21ae9-3d60-42b9-9ad0-45490e889799" (UID: "b5c21ae9-3d60-42b9-9ad0-45490e889799"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.490232 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5c21ae9-3d60-42b9-9ad0-45490e889799-kube-api-access-fnptr" (OuterVolumeSpecName: "kube-api-access-fnptr") pod "b5c21ae9-3d60-42b9-9ad0-45490e889799" (UID: "b5c21ae9-3d60-42b9-9ad0-45490e889799"). InnerVolumeSpecName "kube-api-access-fnptr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.519539 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5c21ae9-3d60-42b9-9ad0-45490e889799-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5c21ae9-3d60-42b9-9ad0-45490e889799" (UID: "b5c21ae9-3d60-42b9-9ad0-45490e889799"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.526444 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5c21ae9-3d60-42b9-9ad0-45490e889799-config-data" (OuterVolumeSpecName: "config-data") pod "b5c21ae9-3d60-42b9-9ad0-45490e889799" (UID: "b5c21ae9-3d60-42b9-9ad0-45490e889799"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.585614 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnptr\" (UniqueName: \"kubernetes.io/projected/b5c21ae9-3d60-42b9-9ad0-45490e889799-kube-api-access-fnptr\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.585643 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5c21ae9-3d60-42b9-9ad0-45490e889799-logs\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.585652 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5c21ae9-3d60-42b9-9ad0-45490e889799-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.585661 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5c21ae9-3d60-42b9-9ad0-45490e889799-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.809040 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402","Type":"ContainerStarted","Data":"255f0209ff03332bbade9cc9934cd134a40b830d10c4687630898f2dd52f50b7"} Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.810246 4841 generic.go:334] "Generic (PLEG): container finished" podID="b5c21ae9-3d60-42b9-9ad0-45490e889799" containerID="baecf9d54bc0c1abd9cdc73f3fb63e48a0284c58a3d6d8223ee59c70e07a4d89" exitCode=0 Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.810371 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5c21ae9-3d60-42b9-9ad0-45490e889799","Type":"ContainerDied","Data":"baecf9d54bc0c1abd9cdc73f3fb63e48a0284c58a3d6d8223ee59c70e07a4d89"} Mar 13 09:34:35 crc 
kubenswrapper[4841]: I0313 09:34:35.810389 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5c21ae9-3d60-42b9-9ad0-45490e889799","Type":"ContainerDied","Data":"c8648a06c9be50812eb3c0ea184dc5021ce529ac0f6338b587cc361f5ac32b48"} Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.810404 4841 scope.go:117] "RemoveContainer" containerID="baecf9d54bc0c1abd9cdc73f3fb63e48a0284c58a3d6d8223ee59c70e07a4d89" Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.810459 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.952378 4841 scope.go:117] "RemoveContainer" containerID="c2fd43fb8aa09b3299822800c11453bdb427179b32c52d3bbd014b9197eb3903" Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.974795 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.988044 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.989863 4841 scope.go:117] "RemoveContainer" containerID="baecf9d54bc0c1abd9cdc73f3fb63e48a0284c58a3d6d8223ee59c70e07a4d89" Mar 13 09:34:35 crc kubenswrapper[4841]: E0313 09:34:35.990464 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baecf9d54bc0c1abd9cdc73f3fb63e48a0284c58a3d6d8223ee59c70e07a4d89\": container with ID starting with baecf9d54bc0c1abd9cdc73f3fb63e48a0284c58a3d6d8223ee59c70e07a4d89 not found: ID does not exist" containerID="baecf9d54bc0c1abd9cdc73f3fb63e48a0284c58a3d6d8223ee59c70e07a4d89" Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.990638 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baecf9d54bc0c1abd9cdc73f3fb63e48a0284c58a3d6d8223ee59c70e07a4d89"} err="failed to get container 
status \"baecf9d54bc0c1abd9cdc73f3fb63e48a0284c58a3d6d8223ee59c70e07a4d89\": rpc error: code = NotFound desc = could not find container \"baecf9d54bc0c1abd9cdc73f3fb63e48a0284c58a3d6d8223ee59c70e07a4d89\": container with ID starting with baecf9d54bc0c1abd9cdc73f3fb63e48a0284c58a3d6d8223ee59c70e07a4d89 not found: ID does not exist" Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.991085 4841 scope.go:117] "RemoveContainer" containerID="c2fd43fb8aa09b3299822800c11453bdb427179b32c52d3bbd014b9197eb3903" Mar 13 09:34:35 crc kubenswrapper[4841]: E0313 09:34:35.991665 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2fd43fb8aa09b3299822800c11453bdb427179b32c52d3bbd014b9197eb3903\": container with ID starting with c2fd43fb8aa09b3299822800c11453bdb427179b32c52d3bbd014b9197eb3903 not found: ID does not exist" containerID="c2fd43fb8aa09b3299822800c11453bdb427179b32c52d3bbd014b9197eb3903" Mar 13 09:34:35 crc kubenswrapper[4841]: I0313 09:34:35.991690 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2fd43fb8aa09b3299822800c11453bdb427179b32c52d3bbd014b9197eb3903"} err="failed to get container status \"c2fd43fb8aa09b3299822800c11453bdb427179b32c52d3bbd014b9197eb3903\": rpc error: code = NotFound desc = could not find container \"c2fd43fb8aa09b3299822800c11453bdb427179b32c52d3bbd014b9197eb3903\": container with ID starting with c2fd43fb8aa09b3299822800c11453bdb427179b32c52d3bbd014b9197eb3903 not found: ID does not exist" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.009974 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5c21ae9-3d60-42b9-9ad0-45490e889799" path="/var/lib/kubelet/pods/b5c21ae9-3d60-42b9-9ad0-45490e889799/volumes" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.011037 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 09:34:36 crc 
kubenswrapper[4841]: E0313 09:34:36.011464 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c21ae9-3d60-42b9-9ad0-45490e889799" containerName="nova-api-api" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.011561 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c21ae9-3d60-42b9-9ad0-45490e889799" containerName="nova-api-api" Mar 13 09:34:36 crc kubenswrapper[4841]: E0313 09:34:36.011654 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c21ae9-3d60-42b9-9ad0-45490e889799" containerName="nova-api-log" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.011714 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c21ae9-3d60-42b9-9ad0-45490e889799" containerName="nova-api-log" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.011993 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5c21ae9-3d60-42b9-9ad0-45490e889799" containerName="nova-api-api" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.012073 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5c21ae9-3d60-42b9-9ad0-45490e889799" containerName="nova-api-log" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.015345 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.018671 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.018870 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.019023 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.021675 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.196609 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " pod="openstack/nova-api-0" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.196862 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpf4p\" (UniqueName: \"kubernetes.io/projected/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-kube-api-access-lpf4p\") pod \"nova-api-0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " pod="openstack/nova-api-0" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.196891 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " pod="openstack/nova-api-0" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.196978 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-logs\") pod \"nova-api-0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " pod="openstack/nova-api-0" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.197010 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-config-data\") pod \"nova-api-0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " pod="openstack/nova-api-0" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.197039 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-public-tls-certs\") pod \"nova-api-0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " pod="openstack/nova-api-0" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.298686 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-logs\") pod \"nova-api-0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " pod="openstack/nova-api-0" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.298757 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-config-data\") pod \"nova-api-0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " pod="openstack/nova-api-0" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.298785 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-public-tls-certs\") pod \"nova-api-0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " pod="openstack/nova-api-0" Mar 13 09:34:36 crc kubenswrapper[4841]: 
I0313 09:34:36.298844 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " pod="openstack/nova-api-0" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.298868 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpf4p\" (UniqueName: \"kubernetes.io/projected/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-kube-api-access-lpf4p\") pod \"nova-api-0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " pod="openstack/nova-api-0" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.298892 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " pod="openstack/nova-api-0" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.299257 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-logs\") pod \"nova-api-0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " pod="openstack/nova-api-0" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.303108 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " pod="openstack/nova-api-0" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.304168 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-config-data\") pod \"nova-api-0\" (UID: 
\"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " pod="openstack/nova-api-0" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.304338 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-public-tls-certs\") pod \"nova-api-0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " pod="openstack/nova-api-0" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.319739 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " pod="openstack/nova-api-0" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.326202 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpf4p\" (UniqueName: \"kubernetes.io/projected/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-kube-api-access-lpf4p\") pod \"nova-api-0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " pod="openstack/nova-api-0" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.346555 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.352674 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.375899 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.832556 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402","Type":"ContainerStarted","Data":"facfda0d0243fe6ca1ec9794d38b78f9f1a5433e9db96368c9a0224748579dca"} Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.832798 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402","Type":"ContainerStarted","Data":"47eb7f52fbaf44de56febb41ab8af5f76ce5fbe88b47970a101cdf1793b54848"} Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.847490 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 09:34:36 crc kubenswrapper[4841]: I0313 09:34:36.857171 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.172946 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-5rvmg"] Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.174701 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5rvmg" Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.178638 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.178766 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.190619 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5rvmg"] Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.328813 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-scripts\") pod \"nova-cell1-cell-mapping-5rvmg\" (UID: \"cf39f2a9-0603-43b5-b8e0-8ed87e304c05\") " pod="openstack/nova-cell1-cell-mapping-5rvmg" Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.328959 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-config-data\") pod \"nova-cell1-cell-mapping-5rvmg\" (UID: \"cf39f2a9-0603-43b5-b8e0-8ed87e304c05\") " pod="openstack/nova-cell1-cell-mapping-5rvmg" Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.329086 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5rvmg\" (UID: \"cf39f2a9-0603-43b5-b8e0-8ed87e304c05\") " pod="openstack/nova-cell1-cell-mapping-5rvmg" Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.329260 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmpsn\" (UniqueName: 
\"kubernetes.io/projected/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-kube-api-access-hmpsn\") pod \"nova-cell1-cell-mapping-5rvmg\" (UID: \"cf39f2a9-0603-43b5-b8e0-8ed87e304c05\") " pod="openstack/nova-cell1-cell-mapping-5rvmg" Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.431168 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-scripts\") pod \"nova-cell1-cell-mapping-5rvmg\" (UID: \"cf39f2a9-0603-43b5-b8e0-8ed87e304c05\") " pod="openstack/nova-cell1-cell-mapping-5rvmg" Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.431326 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-config-data\") pod \"nova-cell1-cell-mapping-5rvmg\" (UID: \"cf39f2a9-0603-43b5-b8e0-8ed87e304c05\") " pod="openstack/nova-cell1-cell-mapping-5rvmg" Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.431374 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5rvmg\" (UID: \"cf39f2a9-0603-43b5-b8e0-8ed87e304c05\") " pod="openstack/nova-cell1-cell-mapping-5rvmg" Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.431441 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmpsn\" (UniqueName: \"kubernetes.io/projected/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-kube-api-access-hmpsn\") pod \"nova-cell1-cell-mapping-5rvmg\" (UID: \"cf39f2a9-0603-43b5-b8e0-8ed87e304c05\") " pod="openstack/nova-cell1-cell-mapping-5rvmg" Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.435700 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-scripts\") pod \"nova-cell1-cell-mapping-5rvmg\" (UID: \"cf39f2a9-0603-43b5-b8e0-8ed87e304c05\") " pod="openstack/nova-cell1-cell-mapping-5rvmg" Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.436241 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-config-data\") pod \"nova-cell1-cell-mapping-5rvmg\" (UID: \"cf39f2a9-0603-43b5-b8e0-8ed87e304c05\") " pod="openstack/nova-cell1-cell-mapping-5rvmg" Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.436385 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5rvmg\" (UID: \"cf39f2a9-0603-43b5-b8e0-8ed87e304c05\") " pod="openstack/nova-cell1-cell-mapping-5rvmg" Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.446145 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmpsn\" (UniqueName: \"kubernetes.io/projected/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-kube-api-access-hmpsn\") pod \"nova-cell1-cell-mapping-5rvmg\" (UID: \"cf39f2a9-0603-43b5-b8e0-8ed87e304c05\") " pod="openstack/nova-cell1-cell-mapping-5rvmg" Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.512727 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5rvmg" Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.845014 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e359faf0-2c40-41d4-bb39-4e5ff997d3c0","Type":"ContainerStarted","Data":"25f13a68e9efeb8bc79744f6f8a5480bcb897abb9dfd37ea61afe8be2a672920"} Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.845701 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e359faf0-2c40-41d4-bb39-4e5ff997d3c0","Type":"ContainerStarted","Data":"c2afcf64b0eec5727a064aee50437bbadbc5123c41d214d243d304b75332780e"} Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.845718 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e359faf0-2c40-41d4-bb39-4e5ff997d3c0","Type":"ContainerStarted","Data":"3d96cd0f6ed18c163ac206bd2051f49bbf99fa8d172a4db0c967f5f5f93ecc99"} Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.867772 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.867750255 podStartE2EDuration="2.867750255s" podCreationTimestamp="2026-03-13 09:34:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:34:37.861779639 +0000 UTC m=+1360.591679840" watchObservedRunningTime="2026-03-13 09:34:37.867750255 +0000 UTC m=+1360.597650446" Mar 13 09:34:37 crc kubenswrapper[4841]: I0313 09:34:37.992098 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5rvmg"] Mar 13 09:34:37 crc kubenswrapper[4841]: W0313 09:34:37.993247 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf39f2a9_0603_43b5_b8e0_8ed87e304c05.slice/crio-c4d30a976622f83043d5672f690b9677350a6c2f06b40ef9fc0a7f7f06853abf 
WatchSource:0}: Error finding container c4d30a976622f83043d5672f690b9677350a6c2f06b40ef9fc0a7f7f06853abf: Status 404 returned error can't find the container with id c4d30a976622f83043d5672f690b9677350a6c2f06b40ef9fc0a7f7f06853abf Mar 13 09:34:38 crc kubenswrapper[4841]: I0313 09:34:38.856185 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5rvmg" event={"ID":"cf39f2a9-0603-43b5-b8e0-8ed87e304c05","Type":"ContainerStarted","Data":"9b41b1fcadfa46b35e1f77a3f8d704fafeefa20d6ade44ee2ada70391a791ab7"} Mar 13 09:34:38 crc kubenswrapper[4841]: I0313 09:34:38.856845 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5rvmg" event={"ID":"cf39f2a9-0603-43b5-b8e0-8ed87e304c05","Type":"ContainerStarted","Data":"c4d30a976622f83043d5672f690b9677350a6c2f06b40ef9fc0a7f7f06853abf"} Mar 13 09:34:38 crc kubenswrapper[4841]: I0313 09:34:38.885250 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-5rvmg" podStartSLOduration=1.885228332 podStartE2EDuration="1.885228332s" podCreationTimestamp="2026-03-13 09:34:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:34:38.875704684 +0000 UTC m=+1361.605604875" watchObservedRunningTime="2026-03-13 09:34:38.885228332 +0000 UTC m=+1361.615128533" Mar 13 09:34:39 crc kubenswrapper[4841]: I0313 09:34:39.243483 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:34:39 crc kubenswrapper[4841]: I0313 09:34:39.317203 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-zjcgr"] Mar 13 09:34:39 crc kubenswrapper[4841]: I0313 09:34:39.317467 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" 
podUID="670ff361-affa-44f7-b872-303ba17bb4f4" containerName="dnsmasq-dns" containerID="cri-o://0bbf134e0b93cea1efe344431835ec6357d0c9c7cfe326ef5858d3182d6c239f" gracePeriod=10 Mar 13 09:34:39 crc kubenswrapper[4841]: I0313 09:34:39.891191 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402","Type":"ContainerStarted","Data":"fc168b742146385f56cf41a226d1de0a8d2bb92c6a2233ddc3462f89beb9e31d"} Mar 13 09:34:39 crc kubenswrapper[4841]: I0313 09:34:39.894382 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 09:34:39 crc kubenswrapper[4841]: I0313 09:34:39.900733 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" containerName="ceilometer-central-agent" containerID="cri-o://255f0209ff03332bbade9cc9934cd134a40b830d10c4687630898f2dd52f50b7" gracePeriod=30 Mar 13 09:34:39 crc kubenswrapper[4841]: I0313 09:34:39.900821 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" containerName="sg-core" containerID="cri-o://facfda0d0243fe6ca1ec9794d38b78f9f1a5433e9db96368c9a0224748579dca" gracePeriod=30 Mar 13 09:34:39 crc kubenswrapper[4841]: I0313 09:34:39.900915 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" containerName="ceilometer-notification-agent" containerID="cri-o://47eb7f52fbaf44de56febb41ab8af5f76ce5fbe88b47970a101cdf1793b54848" gracePeriod=30 Mar 13 09:34:39 crc kubenswrapper[4841]: I0313 09:34:39.900891 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" containerName="proxy-httpd" 
containerID="cri-o://fc168b742146385f56cf41a226d1de0a8d2bb92c6a2233ddc3462f89beb9e31d" gracePeriod=30 Mar 13 09:34:39 crc kubenswrapper[4841]: I0313 09:34:39.907523 4841 generic.go:334] "Generic (PLEG): container finished" podID="670ff361-affa-44f7-b872-303ba17bb4f4" containerID="0bbf134e0b93cea1efe344431835ec6357d0c9c7cfe326ef5858d3182d6c239f" exitCode=0 Mar 13 09:34:39 crc kubenswrapper[4841]: I0313 09:34:39.907803 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" event={"ID":"670ff361-affa-44f7-b872-303ba17bb4f4","Type":"ContainerDied","Data":"0bbf134e0b93cea1efe344431835ec6357d0c9c7cfe326ef5858d3182d6c239f"} Mar 13 09:34:39 crc kubenswrapper[4841]: I0313 09:34:39.908416 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" event={"ID":"670ff361-affa-44f7-b872-303ba17bb4f4","Type":"ContainerDied","Data":"b2414db49ba4594e6e684557f078ce6a7a1c6e380aa4654491ecaf54d454a6f7"} Mar 13 09:34:39 crc kubenswrapper[4841]: I0313 09:34:39.908466 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2414db49ba4594e6e684557f078ce6a7a1c6e380aa4654491ecaf54d454a6f7" Mar 13 09:34:39 crc kubenswrapper[4841]: I0313 09:34:39.945053 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.045912913 podStartE2EDuration="6.945024248s" podCreationTimestamp="2026-03-13 09:34:33 +0000 UTC" firstStartedPulling="2026-03-13 09:34:34.703914322 +0000 UTC m=+1357.433814533" lastFinishedPulling="2026-03-13 09:34:38.603025677 +0000 UTC m=+1361.332925868" observedRunningTime="2026-03-13 09:34:39.922436043 +0000 UTC m=+1362.652336244" watchObservedRunningTime="2026-03-13 09:34:39.945024248 +0000 UTC m=+1362.674924439" Mar 13 09:34:39 crc kubenswrapper[4841]: I0313 09:34:39.961246 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.098966 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-dns-svc\") pod \"670ff361-affa-44f7-b872-303ba17bb4f4\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.099010 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-ovsdbserver-nb\") pod \"670ff361-affa-44f7-b872-303ba17bb4f4\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.099070 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-config\") pod \"670ff361-affa-44f7-b872-303ba17bb4f4\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.099113 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-dns-swift-storage-0\") pod \"670ff361-affa-44f7-b872-303ba17bb4f4\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.099186 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qq8f\" (UniqueName: \"kubernetes.io/projected/670ff361-affa-44f7-b872-303ba17bb4f4-kube-api-access-2qq8f\") pod \"670ff361-affa-44f7-b872-303ba17bb4f4\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.102033 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-ovsdbserver-sb\") pod \"670ff361-affa-44f7-b872-303ba17bb4f4\" (UID: \"670ff361-affa-44f7-b872-303ba17bb4f4\") " Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.107503 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/670ff361-affa-44f7-b872-303ba17bb4f4-kube-api-access-2qq8f" (OuterVolumeSpecName: "kube-api-access-2qq8f") pod "670ff361-affa-44f7-b872-303ba17bb4f4" (UID: "670ff361-affa-44f7-b872-303ba17bb4f4"). InnerVolumeSpecName "kube-api-access-2qq8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.149188 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "670ff361-affa-44f7-b872-303ba17bb4f4" (UID: "670ff361-affa-44f7-b872-303ba17bb4f4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.152482 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "670ff361-affa-44f7-b872-303ba17bb4f4" (UID: "670ff361-affa-44f7-b872-303ba17bb4f4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.153939 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "670ff361-affa-44f7-b872-303ba17bb4f4" (UID: "670ff361-affa-44f7-b872-303ba17bb4f4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.159719 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "670ff361-affa-44f7-b872-303ba17bb4f4" (UID: "670ff361-affa-44f7-b872-303ba17bb4f4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.164960 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-config" (OuterVolumeSpecName: "config") pod "670ff361-affa-44f7-b872-303ba17bb4f4" (UID: "670ff361-affa-44f7-b872-303ba17bb4f4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.208083 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.208553 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.208621 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.208635 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 
09:34:40.208649 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qq8f\" (UniqueName: \"kubernetes.io/projected/670ff361-affa-44f7-b872-303ba17bb4f4-kube-api-access-2qq8f\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.210224 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/670ff361-affa-44f7-b872-303ba17bb4f4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.924985 4841 generic.go:334] "Generic (PLEG): container finished" podID="5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" containerID="fc168b742146385f56cf41a226d1de0a8d2bb92c6a2233ddc3462f89beb9e31d" exitCode=0 Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.925335 4841 generic.go:334] "Generic (PLEG): container finished" podID="5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" containerID="facfda0d0243fe6ca1ec9794d38b78f9f1a5433e9db96368c9a0224748579dca" exitCode=2 Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.925345 4841 generic.go:334] "Generic (PLEG): container finished" podID="5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" containerID="47eb7f52fbaf44de56febb41ab8af5f76ce5fbe88b47970a101cdf1793b54848" exitCode=0 Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.925352 4841 generic.go:334] "Generic (PLEG): container finished" podID="5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" containerID="255f0209ff03332bbade9cc9934cd134a40b830d10c4687630898f2dd52f50b7" exitCode=0 Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.925090 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402","Type":"ContainerDied","Data":"fc168b742146385f56cf41a226d1de0a8d2bb92c6a2233ddc3462f89beb9e31d"} Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.925432 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-zjcgr" Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.925433 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402","Type":"ContainerDied","Data":"facfda0d0243fe6ca1ec9794d38b78f9f1a5433e9db96368c9a0224748579dca"} Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.925446 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402","Type":"ContainerDied","Data":"47eb7f52fbaf44de56febb41ab8af5f76ce5fbe88b47970a101cdf1793b54848"} Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.925457 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402","Type":"ContainerDied","Data":"255f0209ff03332bbade9cc9934cd134a40b830d10c4687630898f2dd52f50b7"} Mar 13 09:34:40 crc kubenswrapper[4841]: I0313 09:34:40.992217 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-zjcgr"] Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.009379 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-zjcgr"] Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.146183 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.330698 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-ceilometer-tls-certs\") pod \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.330771 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-config-data\") pod \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.330833 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-scripts\") pod \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.330868 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-combined-ca-bundle\") pod \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.330896 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-log-httpd\") pod \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.330996 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z86j\" (UniqueName: 
\"kubernetes.io/projected/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-kube-api-access-5z86j\") pod \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.331107 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-run-httpd\") pod \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.331163 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-sg-core-conf-yaml\") pod \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\" (UID: \"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402\") " Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.335362 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" (UID: "5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.335914 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" (UID: "5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.343110 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-kube-api-access-5z86j" (OuterVolumeSpecName: "kube-api-access-5z86j") pod "5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" (UID: "5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402"). InnerVolumeSpecName "kube-api-access-5z86j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.343207 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-scripts" (OuterVolumeSpecName: "scripts") pod "5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" (UID: "5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.379312 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" (UID: "5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.401995 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" (UID: "5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.432947 4841 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.432980 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.432989 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.432998 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z86j\" (UniqueName: \"kubernetes.io/projected/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-kube-api-access-5z86j\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.433008 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.433016 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.443865 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" (UID: 
"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.485503 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-config-data" (OuterVolumeSpecName: "config-data") pod "5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" (UID: "5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.535018 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.535047 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.935514 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402","Type":"ContainerDied","Data":"4019224d832d5f60ebbff563821fb5052b48ee454a7864e087deff89535d5a68"} Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.935559 4841 scope.go:117] "RemoveContainer" containerID="fc168b742146385f56cf41a226d1de0a8d2bb92c6a2233ddc3462f89beb9e31d" Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.935678 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.973194 4841 scope.go:117] "RemoveContainer" containerID="facfda0d0243fe6ca1ec9794d38b78f9f1a5433e9db96368c9a0224748579dca" Mar 13 09:34:41 crc kubenswrapper[4841]: I0313 09:34:41.988616 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.009205 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="670ff361-affa-44f7-b872-303ba17bb4f4" path="/var/lib/kubelet/pods/670ff361-affa-44f7-b872-303ba17bb4f4/volumes" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.014472 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.021999 4841 scope.go:117] "RemoveContainer" containerID="47eb7f52fbaf44de56febb41ab8af5f76ce5fbe88b47970a101cdf1793b54848" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.030706 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:34:42 crc kubenswrapper[4841]: E0313 09:34:42.031304 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" containerName="proxy-httpd" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.031325 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" containerName="proxy-httpd" Mar 13 09:34:42 crc kubenswrapper[4841]: E0313 09:34:42.031350 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="670ff361-affa-44f7-b872-303ba17bb4f4" containerName="dnsmasq-dns" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.031363 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="670ff361-affa-44f7-b872-303ba17bb4f4" containerName="dnsmasq-dns" Mar 13 09:34:42 crc kubenswrapper[4841]: E0313 09:34:42.031391 4841 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="670ff361-affa-44f7-b872-303ba17bb4f4" containerName="init" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.031403 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="670ff361-affa-44f7-b872-303ba17bb4f4" containerName="init" Mar 13 09:34:42 crc kubenswrapper[4841]: E0313 09:34:42.031436 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" containerName="ceilometer-notification-agent" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.031460 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" containerName="ceilometer-notification-agent" Mar 13 09:34:42 crc kubenswrapper[4841]: E0313 09:34:42.031488 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" containerName="ceilometer-central-agent" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.031503 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" containerName="ceilometer-central-agent" Mar 13 09:34:42 crc kubenswrapper[4841]: E0313 09:34:42.031538 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" containerName="sg-core" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.031550 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" containerName="sg-core" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.031861 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" containerName="proxy-httpd" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.031889 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" containerName="sg-core" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.031915 4841 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" containerName="ceilometer-notification-agent" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.031940 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="670ff361-affa-44f7-b872-303ba17bb4f4" containerName="dnsmasq-dns" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.031970 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" containerName="ceilometer-central-agent" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.036365 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.043887 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.044334 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.044507 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.045870 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.059344 4841 scope.go:117] "RemoveContainer" containerID="255f0209ff03332bbade9cc9934cd134a40b830d10c4687630898f2dd52f50b7" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.146289 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/198a2488-dbe2-4045-8346-800c44f750f5-log-httpd\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.146715 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rj94\" (UniqueName: \"kubernetes.io/projected/198a2488-dbe2-4045-8346-800c44f750f5-kube-api-access-6rj94\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.146749 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.146771 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.146798 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-config-data\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.146817 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-scripts\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.146879 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.147093 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/198a2488-dbe2-4045-8346-800c44f750f5-run-httpd\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.249228 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rj94\" (UniqueName: \"kubernetes.io/projected/198a2488-dbe2-4045-8346-800c44f750f5-kube-api-access-6rj94\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.249303 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.249335 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.249367 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-config-data\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 
13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.249393 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-scripts\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.249432 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.249501 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/198a2488-dbe2-4045-8346-800c44f750f5-run-httpd\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.249546 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/198a2488-dbe2-4045-8346-800c44f750f5-log-httpd\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.250189 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/198a2488-dbe2-4045-8346-800c44f750f5-log-httpd\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.250376 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/198a2488-dbe2-4045-8346-800c44f750f5-run-httpd\") pod \"ceilometer-0\" (UID: 
\"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.252993 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.253505 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.253695 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.254928 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-config-data\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.266689 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-scripts\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.270139 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rj94\" (UniqueName: 
\"kubernetes.io/projected/198a2488-dbe2-4045-8346-800c44f750f5-kube-api-access-6rj94\") pod \"ceilometer-0\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.372976 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.885942 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:34:42 crc kubenswrapper[4841]: I0313 09:34:42.954731 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"198a2488-dbe2-4045-8346-800c44f750f5","Type":"ContainerStarted","Data":"5303981305fff448795c4125dc443c685df4d2db12a547d13c0d80e797c10802"} Mar 13 09:34:43 crc kubenswrapper[4841]: I0313 09:34:43.967920 4841 generic.go:334] "Generic (PLEG): container finished" podID="cf39f2a9-0603-43b5-b8e0-8ed87e304c05" containerID="9b41b1fcadfa46b35e1f77a3f8d704fafeefa20d6ade44ee2ada70391a791ab7" exitCode=0 Mar 13 09:34:43 crc kubenswrapper[4841]: I0313 09:34:43.968010 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5rvmg" event={"ID":"cf39f2a9-0603-43b5-b8e0-8ed87e304c05","Type":"ContainerDied","Data":"9b41b1fcadfa46b35e1f77a3f8d704fafeefa20d6ade44ee2ada70391a791ab7"} Mar 13 09:34:44 crc kubenswrapper[4841]: I0313 09:34:44.012118 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402" path="/var/lib/kubelet/pods/5bbfc9f4-d3c9-4dd0-a242-a0e7b889b402/volumes" Mar 13 09:34:44 crc kubenswrapper[4841]: I0313 09:34:44.986895 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"198a2488-dbe2-4045-8346-800c44f750f5","Type":"ContainerStarted","Data":"d01afbba145f30b14fe565f2a4840decce17f7a9eccd56ecf11f8e95e9c5ed10"} Mar 13 09:34:45 crc kubenswrapper[4841]: I0313 09:34:45.303326 4841 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5rvmg" Mar 13 09:34:45 crc kubenswrapper[4841]: I0313 09:34:45.412051 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-config-data\") pod \"cf39f2a9-0603-43b5-b8e0-8ed87e304c05\" (UID: \"cf39f2a9-0603-43b5-b8e0-8ed87e304c05\") " Mar 13 09:34:45 crc kubenswrapper[4841]: I0313 09:34:45.412284 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmpsn\" (UniqueName: \"kubernetes.io/projected/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-kube-api-access-hmpsn\") pod \"cf39f2a9-0603-43b5-b8e0-8ed87e304c05\" (UID: \"cf39f2a9-0603-43b5-b8e0-8ed87e304c05\") " Mar 13 09:34:45 crc kubenswrapper[4841]: I0313 09:34:45.412419 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-scripts\") pod \"cf39f2a9-0603-43b5-b8e0-8ed87e304c05\" (UID: \"cf39f2a9-0603-43b5-b8e0-8ed87e304c05\") " Mar 13 09:34:45 crc kubenswrapper[4841]: I0313 09:34:45.412453 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-combined-ca-bundle\") pod \"cf39f2a9-0603-43b5-b8e0-8ed87e304c05\" (UID: \"cf39f2a9-0603-43b5-b8e0-8ed87e304c05\") " Mar 13 09:34:45 crc kubenswrapper[4841]: I0313 09:34:45.416404 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-scripts" (OuterVolumeSpecName: "scripts") pod "cf39f2a9-0603-43b5-b8e0-8ed87e304c05" (UID: "cf39f2a9-0603-43b5-b8e0-8ed87e304c05"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:45 crc kubenswrapper[4841]: I0313 09:34:45.416794 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-kube-api-access-hmpsn" (OuterVolumeSpecName: "kube-api-access-hmpsn") pod "cf39f2a9-0603-43b5-b8e0-8ed87e304c05" (UID: "cf39f2a9-0603-43b5-b8e0-8ed87e304c05"). InnerVolumeSpecName "kube-api-access-hmpsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:34:45 crc kubenswrapper[4841]: I0313 09:34:45.446480 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-config-data" (OuterVolumeSpecName: "config-data") pod "cf39f2a9-0603-43b5-b8e0-8ed87e304c05" (UID: "cf39f2a9-0603-43b5-b8e0-8ed87e304c05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:45 crc kubenswrapper[4841]: I0313 09:34:45.448452 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf39f2a9-0603-43b5-b8e0-8ed87e304c05" (UID: "cf39f2a9-0603-43b5-b8e0-8ed87e304c05"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:45 crc kubenswrapper[4841]: I0313 09:34:45.514681 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:45 crc kubenswrapper[4841]: I0313 09:34:45.514727 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:45 crc kubenswrapper[4841]: I0313 09:34:45.514749 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:45 crc kubenswrapper[4841]: I0313 09:34:45.514762 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmpsn\" (UniqueName: \"kubernetes.io/projected/cf39f2a9-0603-43b5-b8e0-8ed87e304c05-kube-api-access-hmpsn\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:46 crc kubenswrapper[4841]: I0313 09:34:46.003665 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5rvmg" Mar 13 09:34:46 crc kubenswrapper[4841]: I0313 09:34:46.008965 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"198a2488-dbe2-4045-8346-800c44f750f5","Type":"ContainerStarted","Data":"55d23c2ce883f55d02551502e6aaa93a91d6c4d5ab1834e34149d797af1e85de"} Mar 13 09:34:46 crc kubenswrapper[4841]: I0313 09:34:46.009005 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5rvmg" event={"ID":"cf39f2a9-0603-43b5-b8e0-8ed87e304c05","Type":"ContainerDied","Data":"c4d30a976622f83043d5672f690b9677350a6c2f06b40ef9fc0a7f7f06853abf"} Mar 13 09:34:46 crc kubenswrapper[4841]: I0313 09:34:46.009026 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4d30a976622f83043d5672f690b9677350a6c2f06b40ef9fc0a7f7f06853abf" Mar 13 09:34:46 crc kubenswrapper[4841]: I0313 09:34:46.171039 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 09:34:46 crc kubenswrapper[4841]: I0313 09:34:46.171311 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e359faf0-2c40-41d4-bb39-4e5ff997d3c0" containerName="nova-api-log" containerID="cri-o://c2afcf64b0eec5727a064aee50437bbadbc5123c41d214d243d304b75332780e" gracePeriod=30 Mar 13 09:34:46 crc kubenswrapper[4841]: I0313 09:34:46.171410 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e359faf0-2c40-41d4-bb39-4e5ff997d3c0" containerName="nova-api-api" containerID="cri-o://25f13a68e9efeb8bc79744f6f8a5480bcb897abb9dfd37ea61afe8be2a672920" gracePeriod=30 Mar 13 09:34:46 crc kubenswrapper[4841]: I0313 09:34:46.205904 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 09:34:46 crc kubenswrapper[4841]: I0313 09:34:46.206702 4841 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/nova-scheduler-0" podUID="903d509d-35b9-43f6-9e55-0da3e0d628cb" containerName="nova-scheduler-scheduler" containerID="cri-o://79f12ce641025e7db8a65f88336b638d0da83efa2e8fdd5d01bf8e8d366948dd" gracePeriod=30 Mar 13 09:34:46 crc kubenswrapper[4841]: I0313 09:34:46.218139 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 09:34:46 crc kubenswrapper[4841]: I0313 09:34:46.218421 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093" containerName="nova-metadata-log" containerID="cri-o://cb85a0d9ddf7885c6c797ff30dba64a75afba066f50fe9270905427d4c30e9fe" gracePeriod=30 Mar 13 09:34:46 crc kubenswrapper[4841]: I0313 09:34:46.218847 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093" containerName="nova-metadata-metadata" containerID="cri-o://393320de975d567c3900e1c4c84401c42d8cb0431be77d46457b7092d50e146d" gracePeriod=30 Mar 13 09:34:46 crc kubenswrapper[4841]: I0313 09:34:46.895360 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.026874 4841 generic.go:334] "Generic (PLEG): container finished" podID="e359faf0-2c40-41d4-bb39-4e5ff997d3c0" containerID="25f13a68e9efeb8bc79744f6f8a5480bcb897abb9dfd37ea61afe8be2a672920" exitCode=0 Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.026911 4841 generic.go:334] "Generic (PLEG): container finished" podID="e359faf0-2c40-41d4-bb39-4e5ff997d3c0" containerID="c2afcf64b0eec5727a064aee50437bbadbc5123c41d214d243d304b75332780e" exitCode=143 Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.026969 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e359faf0-2c40-41d4-bb39-4e5ff997d3c0","Type":"ContainerDied","Data":"25f13a68e9efeb8bc79744f6f8a5480bcb897abb9dfd37ea61afe8be2a672920"} Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.026994 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.027020 4841 scope.go:117] "RemoveContainer" containerID="25f13a68e9efeb8bc79744f6f8a5480bcb897abb9dfd37ea61afe8be2a672920" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.027007 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e359faf0-2c40-41d4-bb39-4e5ff997d3c0","Type":"ContainerDied","Data":"c2afcf64b0eec5727a064aee50437bbadbc5123c41d214d243d304b75332780e"} Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.027341 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e359faf0-2c40-41d4-bb39-4e5ff997d3c0","Type":"ContainerDied","Data":"3d96cd0f6ed18c163ac206bd2051f49bbf99fa8d172a4db0c967f5f5f93ecc99"} Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.031514 4841 generic.go:334] "Generic (PLEG): container finished" podID="4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093" 
containerID="cb85a0d9ddf7885c6c797ff30dba64a75afba066f50fe9270905427d4c30e9fe" exitCode=143 Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.031602 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093","Type":"ContainerDied","Data":"cb85a0d9ddf7885c6c797ff30dba64a75afba066f50fe9270905427d4c30e9fe"} Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.034002 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"198a2488-dbe2-4045-8346-800c44f750f5","Type":"ContainerStarted","Data":"2e4097706ffba97714052d8f658e5be9111dce5b385085cca5b96432f508fdeb"} Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.054621 4841 scope.go:117] "RemoveContainer" containerID="c2afcf64b0eec5727a064aee50437bbadbc5123c41d214d243d304b75332780e" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.057301 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-combined-ca-bundle\") pod \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.057355 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-logs\") pod \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.057381 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-public-tls-certs\") pod \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.057416 4841 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpf4p\" (UniqueName: \"kubernetes.io/projected/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-kube-api-access-lpf4p\") pod \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.057519 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-internal-tls-certs\") pod \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.057688 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-config-data\") pod \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\" (UID: \"e359faf0-2c40-41d4-bb39-4e5ff997d3c0\") " Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.058124 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-logs" (OuterVolumeSpecName: "logs") pod "e359faf0-2c40-41d4-bb39-4e5ff997d3c0" (UID: "e359faf0-2c40-41d4-bb39-4e5ff997d3c0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:34:47 crc kubenswrapper[4841]: E0313 09:34:47.068498 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="79f12ce641025e7db8a65f88336b638d0da83efa2e8fdd5d01bf8e8d366948dd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 09:34:47 crc kubenswrapper[4841]: E0313 09:34:47.075875 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="79f12ce641025e7db8a65f88336b638d0da83efa2e8fdd5d01bf8e8d366948dd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.076714 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-kube-api-access-lpf4p" (OuterVolumeSpecName: "kube-api-access-lpf4p") pod "e359faf0-2c40-41d4-bb39-4e5ff997d3c0" (UID: "e359faf0-2c40-41d4-bb39-4e5ff997d3c0"). InnerVolumeSpecName "kube-api-access-lpf4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:34:47 crc kubenswrapper[4841]: E0313 09:34:47.077909 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="79f12ce641025e7db8a65f88336b638d0da83efa2e8fdd5d01bf8e8d366948dd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 09:34:47 crc kubenswrapper[4841]: E0313 09:34:47.078031 4841 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="903d509d-35b9-43f6-9e55-0da3e0d628cb" containerName="nova-scheduler-scheduler" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.078375 4841 scope.go:117] "RemoveContainer" containerID="25f13a68e9efeb8bc79744f6f8a5480bcb897abb9dfd37ea61afe8be2a672920" Mar 13 09:34:47 crc kubenswrapper[4841]: E0313 09:34:47.079895 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25f13a68e9efeb8bc79744f6f8a5480bcb897abb9dfd37ea61afe8be2a672920\": container with ID starting with 25f13a68e9efeb8bc79744f6f8a5480bcb897abb9dfd37ea61afe8be2a672920 not found: ID does not exist" containerID="25f13a68e9efeb8bc79744f6f8a5480bcb897abb9dfd37ea61afe8be2a672920" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.080010 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f13a68e9efeb8bc79744f6f8a5480bcb897abb9dfd37ea61afe8be2a672920"} err="failed to get container status \"25f13a68e9efeb8bc79744f6f8a5480bcb897abb9dfd37ea61afe8be2a672920\": rpc error: code = NotFound desc = could not find container \"25f13a68e9efeb8bc79744f6f8a5480bcb897abb9dfd37ea61afe8be2a672920\": container with ID starting with 
25f13a68e9efeb8bc79744f6f8a5480bcb897abb9dfd37ea61afe8be2a672920 not found: ID does not exist" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.080117 4841 scope.go:117] "RemoveContainer" containerID="c2afcf64b0eec5727a064aee50437bbadbc5123c41d214d243d304b75332780e" Mar 13 09:34:47 crc kubenswrapper[4841]: E0313 09:34:47.082806 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2afcf64b0eec5727a064aee50437bbadbc5123c41d214d243d304b75332780e\": container with ID starting with c2afcf64b0eec5727a064aee50437bbadbc5123c41d214d243d304b75332780e not found: ID does not exist" containerID="c2afcf64b0eec5727a064aee50437bbadbc5123c41d214d243d304b75332780e" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.082897 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2afcf64b0eec5727a064aee50437bbadbc5123c41d214d243d304b75332780e"} err="failed to get container status \"c2afcf64b0eec5727a064aee50437bbadbc5123c41d214d243d304b75332780e\": rpc error: code = NotFound desc = could not find container \"c2afcf64b0eec5727a064aee50437bbadbc5123c41d214d243d304b75332780e\": container with ID starting with c2afcf64b0eec5727a064aee50437bbadbc5123c41d214d243d304b75332780e not found: ID does not exist" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.082991 4841 scope.go:117] "RemoveContainer" containerID="25f13a68e9efeb8bc79744f6f8a5480bcb897abb9dfd37ea61afe8be2a672920" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.086490 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f13a68e9efeb8bc79744f6f8a5480bcb897abb9dfd37ea61afe8be2a672920"} err="failed to get container status \"25f13a68e9efeb8bc79744f6f8a5480bcb897abb9dfd37ea61afe8be2a672920\": rpc error: code = NotFound desc = could not find container \"25f13a68e9efeb8bc79744f6f8a5480bcb897abb9dfd37ea61afe8be2a672920\": container with ID 
starting with 25f13a68e9efeb8bc79744f6f8a5480bcb897abb9dfd37ea61afe8be2a672920 not found: ID does not exist" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.086603 4841 scope.go:117] "RemoveContainer" containerID="c2afcf64b0eec5727a064aee50437bbadbc5123c41d214d243d304b75332780e" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.087296 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2afcf64b0eec5727a064aee50437bbadbc5123c41d214d243d304b75332780e"} err="failed to get container status \"c2afcf64b0eec5727a064aee50437bbadbc5123c41d214d243d304b75332780e\": rpc error: code = NotFound desc = could not find container \"c2afcf64b0eec5727a064aee50437bbadbc5123c41d214d243d304b75332780e\": container with ID starting with c2afcf64b0eec5727a064aee50437bbadbc5123c41d214d243d304b75332780e not found: ID does not exist" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.089671 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e359faf0-2c40-41d4-bb39-4e5ff997d3c0" (UID: "e359faf0-2c40-41d4-bb39-4e5ff997d3c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.104962 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-config-data" (OuterVolumeSpecName: "config-data") pod "e359faf0-2c40-41d4-bb39-4e5ff997d3c0" (UID: "e359faf0-2c40-41d4-bb39-4e5ff997d3c0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.116129 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e359faf0-2c40-41d4-bb39-4e5ff997d3c0" (UID: "e359faf0-2c40-41d4-bb39-4e5ff997d3c0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.122013 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e359faf0-2c40-41d4-bb39-4e5ff997d3c0" (UID: "e359faf0-2c40-41d4-bb39-4e5ff997d3c0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.160061 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.160091 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-logs\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.160100 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.160111 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpf4p\" (UniqueName: \"kubernetes.io/projected/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-kube-api-access-lpf4p\") on node \"crc\" 
DevicePath \"\"" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.160119 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.160127 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e359faf0-2c40-41d4-bb39-4e5ff997d3c0-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.400995 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.423732 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.441321 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 09:34:47 crc kubenswrapper[4841]: E0313 09:34:47.441816 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e359faf0-2c40-41d4-bb39-4e5ff997d3c0" containerName="nova-api-api" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.441839 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e359faf0-2c40-41d4-bb39-4e5ff997d3c0" containerName="nova-api-api" Mar 13 09:34:47 crc kubenswrapper[4841]: E0313 09:34:47.441862 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e359faf0-2c40-41d4-bb39-4e5ff997d3c0" containerName="nova-api-log" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.441869 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e359faf0-2c40-41d4-bb39-4e5ff997d3c0" containerName="nova-api-log" Mar 13 09:34:47 crc kubenswrapper[4841]: E0313 09:34:47.441913 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf39f2a9-0603-43b5-b8e0-8ed87e304c05" containerName="nova-manage" Mar 13 09:34:47 crc 
kubenswrapper[4841]: I0313 09:34:47.441921 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf39f2a9-0603-43b5-b8e0-8ed87e304c05" containerName="nova-manage" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.442117 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e359faf0-2c40-41d4-bb39-4e5ff997d3c0" containerName="nova-api-api" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.442149 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf39f2a9-0603-43b5-b8e0-8ed87e304c05" containerName="nova-manage" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.442163 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e359faf0-2c40-41d4-bb39-4e5ff997d3c0" containerName="nova-api-log" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.443758 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.445945 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.447129 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.447283 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.450373 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.566416 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7666c55c-f424-4bfd-a143-e768e534b721-config-data\") pod \"nova-api-0\" (UID: \"7666c55c-f424-4bfd-a143-e768e534b721\") " pod="openstack/nova-api-0" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 
09:34:47.566471 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7666c55c-f424-4bfd-a143-e768e534b721-public-tls-certs\") pod \"nova-api-0\" (UID: \"7666c55c-f424-4bfd-a143-e768e534b721\") " pod="openstack/nova-api-0" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.566505 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7666c55c-f424-4bfd-a143-e768e534b721-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7666c55c-f424-4bfd-a143-e768e534b721\") " pod="openstack/nova-api-0" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.566777 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7666c55c-f424-4bfd-a143-e768e534b721-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7666c55c-f424-4bfd-a143-e768e534b721\") " pod="openstack/nova-api-0" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.566844 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7666c55c-f424-4bfd-a143-e768e534b721-logs\") pod \"nova-api-0\" (UID: \"7666c55c-f424-4bfd-a143-e768e534b721\") " pod="openstack/nova-api-0" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.566885 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvwq8\" (UniqueName: \"kubernetes.io/projected/7666c55c-f424-4bfd-a143-e768e534b721-kube-api-access-xvwq8\") pod \"nova-api-0\" (UID: \"7666c55c-f424-4bfd-a143-e768e534b721\") " pod="openstack/nova-api-0" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.669213 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7666c55c-f424-4bfd-a143-e768e534b721-public-tls-certs\") pod \"nova-api-0\" (UID: \"7666c55c-f424-4bfd-a143-e768e534b721\") " pod="openstack/nova-api-0" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.669312 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7666c55c-f424-4bfd-a143-e768e534b721-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7666c55c-f424-4bfd-a143-e768e534b721\") " pod="openstack/nova-api-0" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.669449 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7666c55c-f424-4bfd-a143-e768e534b721-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7666c55c-f424-4bfd-a143-e768e534b721\") " pod="openstack/nova-api-0" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.669476 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7666c55c-f424-4bfd-a143-e768e534b721-logs\") pod \"nova-api-0\" (UID: \"7666c55c-f424-4bfd-a143-e768e534b721\") " pod="openstack/nova-api-0" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.669500 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvwq8\" (UniqueName: \"kubernetes.io/projected/7666c55c-f424-4bfd-a143-e768e534b721-kube-api-access-xvwq8\") pod \"nova-api-0\" (UID: \"7666c55c-f424-4bfd-a143-e768e534b721\") " pod="openstack/nova-api-0" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.669596 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7666c55c-f424-4bfd-a143-e768e534b721-config-data\") pod \"nova-api-0\" (UID: \"7666c55c-f424-4bfd-a143-e768e534b721\") " pod="openstack/nova-api-0" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.670357 
4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7666c55c-f424-4bfd-a143-e768e534b721-logs\") pod \"nova-api-0\" (UID: \"7666c55c-f424-4bfd-a143-e768e534b721\") " pod="openstack/nova-api-0" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.673795 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7666c55c-f424-4bfd-a143-e768e534b721-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7666c55c-f424-4bfd-a143-e768e534b721\") " pod="openstack/nova-api-0" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.673836 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7666c55c-f424-4bfd-a143-e768e534b721-public-tls-certs\") pod \"nova-api-0\" (UID: \"7666c55c-f424-4bfd-a143-e768e534b721\") " pod="openstack/nova-api-0" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.679001 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7666c55c-f424-4bfd-a143-e768e534b721-config-data\") pod \"nova-api-0\" (UID: \"7666c55c-f424-4bfd-a143-e768e534b721\") " pod="openstack/nova-api-0" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.688042 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7666c55c-f424-4bfd-a143-e768e534b721-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7666c55c-f424-4bfd-a143-e768e534b721\") " pod="openstack/nova-api-0" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.688425 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvwq8\" (UniqueName: \"kubernetes.io/projected/7666c55c-f424-4bfd-a143-e768e534b721-kube-api-access-xvwq8\") pod \"nova-api-0\" (UID: \"7666c55c-f424-4bfd-a143-e768e534b721\") " 
pod="openstack/nova-api-0" Mar 13 09:34:47 crc kubenswrapper[4841]: I0313 09:34:47.762097 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 09:34:48 crc kubenswrapper[4841]: I0313 09:34:48.019705 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e359faf0-2c40-41d4-bb39-4e5ff997d3c0" path="/var/lib/kubelet/pods/e359faf0-2c40-41d4-bb39-4e5ff997d3c0/volumes" Mar 13 09:34:48 crc kubenswrapper[4841]: I0313 09:34:48.223848 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 09:34:49 crc kubenswrapper[4841]: I0313 09:34:49.117009 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"198a2488-dbe2-4045-8346-800c44f750f5","Type":"ContainerStarted","Data":"16649722d35459ff69d48a89727c97d468607dd329e060f36b7739ea916e9ea4"} Mar 13 09:34:49 crc kubenswrapper[4841]: I0313 09:34:49.117818 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 09:34:49 crc kubenswrapper[4841]: I0313 09:34:49.120067 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7666c55c-f424-4bfd-a143-e768e534b721","Type":"ContainerStarted","Data":"0477585e04b6d2bb244312ab0c34adbc8c8abf1270f1df507e372c680251119e"} Mar 13 09:34:49 crc kubenswrapper[4841]: I0313 09:34:49.120094 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7666c55c-f424-4bfd-a143-e768e534b721","Type":"ContainerStarted","Data":"f1c4357e9225727cb8af9eecf9c7960906cf5fe6f475e697c4354f345b5097b5"} Mar 13 09:34:49 crc kubenswrapper[4841]: I0313 09:34:49.120586 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7666c55c-f424-4bfd-a143-e768e534b721","Type":"ContainerStarted","Data":"8715331397dc98c83b3e668e5e1f8b42967367f4b9d99739f67f8da73e74f4cc"} Mar 13 09:34:49 crc kubenswrapper[4841]: I0313 
09:34:49.153458 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.689969171 podStartE2EDuration="8.153432865s" podCreationTimestamp="2026-03-13 09:34:41 +0000 UTC" firstStartedPulling="2026-03-13 09:34:42.893743329 +0000 UTC m=+1365.623643520" lastFinishedPulling="2026-03-13 09:34:48.357207023 +0000 UTC m=+1371.087107214" observedRunningTime="2026-03-13 09:34:49.1375727 +0000 UTC m=+1371.867472891" watchObservedRunningTime="2026-03-13 09:34:49.153432865 +0000 UTC m=+1371.883333056" Mar 13 09:34:49 crc kubenswrapper[4841]: I0313 09:34:49.174787 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.1747681500000002 podStartE2EDuration="2.17476815s" podCreationTimestamp="2026-03-13 09:34:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:34:49.162020923 +0000 UTC m=+1371.891921134" watchObservedRunningTime="2026-03-13 09:34:49.17476815 +0000 UTC m=+1371.904668341" Mar 13 09:34:49 crc kubenswrapper[4841]: I0313 09:34:49.534931 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": read tcp 10.217.0.2:36646->10.217.0.211:8775: read: connection reset by peer" Mar 13 09:34:49 crc kubenswrapper[4841]: I0313 09:34:49.534931 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": read tcp 10.217.0.2:36630->10.217.0.211:8775: read: connection reset by peer" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.054202 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.132397 4841 generic.go:334] "Generic (PLEG): container finished" podID="4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093" containerID="393320de975d567c3900e1c4c84401c42d8cb0431be77d46457b7092d50e146d" exitCode=0 Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.132455 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.132538 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093","Type":"ContainerDied","Data":"393320de975d567c3900e1c4c84401c42d8cb0431be77d46457b7092d50e146d"} Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.132607 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093","Type":"ContainerDied","Data":"6053b95fd0de2d3d9325fce52f3e6d9daac1e411625cadb686f997f6302481dc"} Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.132637 4841 scope.go:117] "RemoveContainer" containerID="393320de975d567c3900e1c4c84401c42d8cb0431be77d46457b7092d50e146d" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.154077 4841 scope.go:117] "RemoveContainer" containerID="cb85a0d9ddf7885c6c797ff30dba64a75afba066f50fe9270905427d4c30e9fe" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.174153 4841 scope.go:117] "RemoveContainer" containerID="393320de975d567c3900e1c4c84401c42d8cb0431be77d46457b7092d50e146d" Mar 13 09:34:50 crc kubenswrapper[4841]: E0313 09:34:50.174613 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"393320de975d567c3900e1c4c84401c42d8cb0431be77d46457b7092d50e146d\": container with ID starting with 393320de975d567c3900e1c4c84401c42d8cb0431be77d46457b7092d50e146d not found: ID does 
not exist" containerID="393320de975d567c3900e1c4c84401c42d8cb0431be77d46457b7092d50e146d" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.174649 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"393320de975d567c3900e1c4c84401c42d8cb0431be77d46457b7092d50e146d"} err="failed to get container status \"393320de975d567c3900e1c4c84401c42d8cb0431be77d46457b7092d50e146d\": rpc error: code = NotFound desc = could not find container \"393320de975d567c3900e1c4c84401c42d8cb0431be77d46457b7092d50e146d\": container with ID starting with 393320de975d567c3900e1c4c84401c42d8cb0431be77d46457b7092d50e146d not found: ID does not exist" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.174687 4841 scope.go:117] "RemoveContainer" containerID="cb85a0d9ddf7885c6c797ff30dba64a75afba066f50fe9270905427d4c30e9fe" Mar 13 09:34:50 crc kubenswrapper[4841]: E0313 09:34:50.174907 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb85a0d9ddf7885c6c797ff30dba64a75afba066f50fe9270905427d4c30e9fe\": container with ID starting with cb85a0d9ddf7885c6c797ff30dba64a75afba066f50fe9270905427d4c30e9fe not found: ID does not exist" containerID="cb85a0d9ddf7885c6c797ff30dba64a75afba066f50fe9270905427d4c30e9fe" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.174938 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb85a0d9ddf7885c6c797ff30dba64a75afba066f50fe9270905427d4c30e9fe"} err="failed to get container status \"cb85a0d9ddf7885c6c797ff30dba64a75afba066f50fe9270905427d4c30e9fe\": rpc error: code = NotFound desc = could not find container \"cb85a0d9ddf7885c6c797ff30dba64a75afba066f50fe9270905427d4c30e9fe\": container with ID starting with cb85a0d9ddf7885c6c797ff30dba64a75afba066f50fe9270905427d4c30e9fe not found: ID does not exist" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.236544 4841 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-config-data\") pod \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\" (UID: \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\") " Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.236638 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-logs\") pod \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\" (UID: \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\") " Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.236863 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-combined-ca-bundle\") pod \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\" (UID: \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\") " Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.237069 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-nova-metadata-tls-certs\") pod \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\" (UID: \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\") " Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.237132 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkmwb\" (UniqueName: \"kubernetes.io/projected/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-kube-api-access-zkmwb\") pod \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\" (UID: \"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093\") " Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.237216 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-logs" (OuterVolumeSpecName: "logs") pod "4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093" (UID: 
"4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.238253 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-logs\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.241728 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-kube-api-access-zkmwb" (OuterVolumeSpecName: "kube-api-access-zkmwb") pod "4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093" (UID: "4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093"). InnerVolumeSpecName "kube-api-access-zkmwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.262044 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093" (UID: "4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.278972 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-config-data" (OuterVolumeSpecName: "config-data") pod "4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093" (UID: "4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.283965 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093" (UID: "4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.341396 4841 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.341834 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkmwb\" (UniqueName: \"kubernetes.io/projected/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-kube-api-access-zkmwb\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.341845 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.341855 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.465691 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.481175 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.493315 4841 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 09:34:50 crc kubenswrapper[4841]: E0313 09:34:50.493910 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093" containerName="nova-metadata-log" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.493937 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093" containerName="nova-metadata-log" Mar 13 09:34:50 crc kubenswrapper[4841]: E0313 09:34:50.493964 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093" containerName="nova-metadata-metadata" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.493978 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093" containerName="nova-metadata-metadata" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.494378 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093" containerName="nova-metadata-metadata" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.494424 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093" containerName="nova-metadata-log" Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.496100 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0"
Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.498564 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.498704 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.513131 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.545402 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ede0b909-963d-4110-8d86-b09095cbd08c-logs\") pod \"nova-metadata-0\" (UID: \"ede0b909-963d-4110-8d86-b09095cbd08c\") " pod="openstack/nova-metadata-0"
Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.545466 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ede0b909-963d-4110-8d86-b09095cbd08c-config-data\") pod \"nova-metadata-0\" (UID: \"ede0b909-963d-4110-8d86-b09095cbd08c\") " pod="openstack/nova-metadata-0"
Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.545514 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ede0b909-963d-4110-8d86-b09095cbd08c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ede0b909-963d-4110-8d86-b09095cbd08c\") " pod="openstack/nova-metadata-0"
Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.545584 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ede0b909-963d-4110-8d86-b09095cbd08c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ede0b909-963d-4110-8d86-b09095cbd08c\") " pod="openstack/nova-metadata-0"
Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.545629 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2qsp\" (UniqueName: \"kubernetes.io/projected/ede0b909-963d-4110-8d86-b09095cbd08c-kube-api-access-t2qsp\") pod \"nova-metadata-0\" (UID: \"ede0b909-963d-4110-8d86-b09095cbd08c\") " pod="openstack/nova-metadata-0"
Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.646960 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ede0b909-963d-4110-8d86-b09095cbd08c-logs\") pod \"nova-metadata-0\" (UID: \"ede0b909-963d-4110-8d86-b09095cbd08c\") " pod="openstack/nova-metadata-0"
Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.647017 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ede0b909-963d-4110-8d86-b09095cbd08c-config-data\") pod \"nova-metadata-0\" (UID: \"ede0b909-963d-4110-8d86-b09095cbd08c\") " pod="openstack/nova-metadata-0"
Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.647065 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ede0b909-963d-4110-8d86-b09095cbd08c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ede0b909-963d-4110-8d86-b09095cbd08c\") " pod="openstack/nova-metadata-0"
Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.647128 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ede0b909-963d-4110-8d86-b09095cbd08c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ede0b909-963d-4110-8d86-b09095cbd08c\") " pod="openstack/nova-metadata-0"
Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.647165 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2qsp\" (UniqueName: \"kubernetes.io/projected/ede0b909-963d-4110-8d86-b09095cbd08c-kube-api-access-t2qsp\") pod \"nova-metadata-0\" (UID: \"ede0b909-963d-4110-8d86-b09095cbd08c\") " pod="openstack/nova-metadata-0"
Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.647455 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ede0b909-963d-4110-8d86-b09095cbd08c-logs\") pod \"nova-metadata-0\" (UID: \"ede0b909-963d-4110-8d86-b09095cbd08c\") " pod="openstack/nova-metadata-0"
Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.652412 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ede0b909-963d-4110-8d86-b09095cbd08c-config-data\") pod \"nova-metadata-0\" (UID: \"ede0b909-963d-4110-8d86-b09095cbd08c\") " pod="openstack/nova-metadata-0"
Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.652787 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ede0b909-963d-4110-8d86-b09095cbd08c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ede0b909-963d-4110-8d86-b09095cbd08c\") " pod="openstack/nova-metadata-0"
Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.661894 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ede0b909-963d-4110-8d86-b09095cbd08c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ede0b909-963d-4110-8d86-b09095cbd08c\") " pod="openstack/nova-metadata-0"
Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.665634 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2qsp\" (UniqueName: \"kubernetes.io/projected/ede0b909-963d-4110-8d86-b09095cbd08c-kube-api-access-t2qsp\") pod \"nova-metadata-0\" (UID: \"ede0b909-963d-4110-8d86-b09095cbd08c\") " pod="openstack/nova-metadata-0"
Mar 13 09:34:50 crc kubenswrapper[4841]: I0313 09:34:50.820784 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 13 09:34:51 crc kubenswrapper[4841]: I0313 09:34:51.146407 4841 generic.go:334] "Generic (PLEG): container finished" podID="903d509d-35b9-43f6-9e55-0da3e0d628cb" containerID="79f12ce641025e7db8a65f88336b638d0da83efa2e8fdd5d01bf8e8d366948dd" exitCode=0
Mar 13 09:34:51 crc kubenswrapper[4841]: I0313 09:34:51.147062 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"903d509d-35b9-43f6-9e55-0da3e0d628cb","Type":"ContainerDied","Data":"79f12ce641025e7db8a65f88336b638d0da83efa2e8fdd5d01bf8e8d366948dd"}
Mar 13 09:34:51 crc kubenswrapper[4841]: I0313 09:34:51.167651 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 13 09:34:51 crc kubenswrapper[4841]: I0313 09:34:51.359309 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwx7f\" (UniqueName: \"kubernetes.io/projected/903d509d-35b9-43f6-9e55-0da3e0d628cb-kube-api-access-hwx7f\") pod \"903d509d-35b9-43f6-9e55-0da3e0d628cb\" (UID: \"903d509d-35b9-43f6-9e55-0da3e0d628cb\") "
Mar 13 09:34:51 crc kubenswrapper[4841]: I0313 09:34:51.359381 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/903d509d-35b9-43f6-9e55-0da3e0d628cb-combined-ca-bundle\") pod \"903d509d-35b9-43f6-9e55-0da3e0d628cb\" (UID: \"903d509d-35b9-43f6-9e55-0da3e0d628cb\") "
Mar 13 09:34:51 crc kubenswrapper[4841]: I0313 09:34:51.359639 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/903d509d-35b9-43f6-9e55-0da3e0d628cb-config-data\") pod \"903d509d-35b9-43f6-9e55-0da3e0d628cb\" (UID: \"903d509d-35b9-43f6-9e55-0da3e0d628cb\") "
Mar 13 09:34:51 crc kubenswrapper[4841]: I0313 09:34:51.367705 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/903d509d-35b9-43f6-9e55-0da3e0d628cb-kube-api-access-hwx7f" (OuterVolumeSpecName: "kube-api-access-hwx7f") pod "903d509d-35b9-43f6-9e55-0da3e0d628cb" (UID: "903d509d-35b9-43f6-9e55-0da3e0d628cb"). InnerVolumeSpecName "kube-api-access-hwx7f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 09:34:51 crc kubenswrapper[4841]: I0313 09:34:51.383482 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 13 09:34:51 crc kubenswrapper[4841]: I0313 09:34:51.407435 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/903d509d-35b9-43f6-9e55-0da3e0d628cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "903d509d-35b9-43f6-9e55-0da3e0d628cb" (UID: "903d509d-35b9-43f6-9e55-0da3e0d628cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:34:51 crc kubenswrapper[4841]: I0313 09:34:51.409100 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/903d509d-35b9-43f6-9e55-0da3e0d628cb-config-data" (OuterVolumeSpecName: "config-data") pod "903d509d-35b9-43f6-9e55-0da3e0d628cb" (UID: "903d509d-35b9-43f6-9e55-0da3e0d628cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:34:51 crc kubenswrapper[4841]: I0313 09:34:51.462010 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwx7f\" (UniqueName: \"kubernetes.io/projected/903d509d-35b9-43f6-9e55-0da3e0d628cb-kube-api-access-hwx7f\") on node \"crc\" DevicePath \"\""
Mar 13 09:34:51 crc kubenswrapper[4841]: I0313 09:34:51.462045 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/903d509d-35b9-43f6-9e55-0da3e0d628cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 09:34:51 crc kubenswrapper[4841]: I0313 09:34:51.462054 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/903d509d-35b9-43f6-9e55-0da3e0d628cb-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.007839 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093" path="/var/lib/kubelet/pods/4cf6e9ec-e2c2-4bbf-bb4b-8e905c964093/volumes"
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.157746 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ede0b909-963d-4110-8d86-b09095cbd08c","Type":"ContainerStarted","Data":"00347fce616a2dbea9c4bd0c2fbad1c49c8944f671f700e7bc16b388ba191f4c"}
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.157797 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ede0b909-963d-4110-8d86-b09095cbd08c","Type":"ContainerStarted","Data":"809dad49771351edd8375ec4409692adc6ea68fd2bc390e813e993a09c464c06"}
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.157812 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ede0b909-963d-4110-8d86-b09095cbd08c","Type":"ContainerStarted","Data":"84386e657b26cd382000289b3ba54838dfed92407676bf29dc598f281e1948ae"}
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.159751 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"903d509d-35b9-43f6-9e55-0da3e0d628cb","Type":"ContainerDied","Data":"c5a27e4894c976e148727dce446bb05d552aec1a526b6f09d20ffc2a98cef5f0"}
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.159784 4841 scope.go:117] "RemoveContainer" containerID="79f12ce641025e7db8a65f88336b638d0da83efa2e8fdd5d01bf8e8d366948dd"
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.159819 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.193830 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.193805646 podStartE2EDuration="2.193805646s" podCreationTimestamp="2026-03-13 09:34:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:34:52.179075936 +0000 UTC m=+1374.908976137" watchObservedRunningTime="2026-03-13 09:34:52.193805646 +0000 UTC m=+1374.923705837"
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.247752 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.256117 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.266228 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 09:34:52 crc kubenswrapper[4841]: E0313 09:34:52.266619 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="903d509d-35b9-43f6-9e55-0da3e0d628cb" containerName="nova-scheduler-scheduler"
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.266636 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="903d509d-35b9-43f6-9e55-0da3e0d628cb" containerName="nova-scheduler-scheduler"
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.267549 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="903d509d-35b9-43f6-9e55-0da3e0d628cb" containerName="nova-scheduler-scheduler"
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.268297 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.278306 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.278465 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.380369 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fdad05-7998-4fe3-a774-61cdaa01e27f-config-data\") pod \"nova-scheduler-0\" (UID: \"d3fdad05-7998-4fe3-a774-61cdaa01e27f\") " pod="openstack/nova-scheduler-0"
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.380441 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wprk\" (UniqueName: \"kubernetes.io/projected/d3fdad05-7998-4fe3-a774-61cdaa01e27f-kube-api-access-8wprk\") pod \"nova-scheduler-0\" (UID: \"d3fdad05-7998-4fe3-a774-61cdaa01e27f\") " pod="openstack/nova-scheduler-0"
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.380475 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fdad05-7998-4fe3-a774-61cdaa01e27f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d3fdad05-7998-4fe3-a774-61cdaa01e27f\") " pod="openstack/nova-scheduler-0"
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.483482 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fdad05-7998-4fe3-a774-61cdaa01e27f-config-data\") pod \"nova-scheduler-0\" (UID: \"d3fdad05-7998-4fe3-a774-61cdaa01e27f\") " pod="openstack/nova-scheduler-0"
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.483986 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wprk\" (UniqueName: \"kubernetes.io/projected/d3fdad05-7998-4fe3-a774-61cdaa01e27f-kube-api-access-8wprk\") pod \"nova-scheduler-0\" (UID: \"d3fdad05-7998-4fe3-a774-61cdaa01e27f\") " pod="openstack/nova-scheduler-0"
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.484196 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fdad05-7998-4fe3-a774-61cdaa01e27f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d3fdad05-7998-4fe3-a774-61cdaa01e27f\") " pod="openstack/nova-scheduler-0"
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.495426 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fdad05-7998-4fe3-a774-61cdaa01e27f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d3fdad05-7998-4fe3-a774-61cdaa01e27f\") " pod="openstack/nova-scheduler-0"
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.495442 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fdad05-7998-4fe3-a774-61cdaa01e27f-config-data\") pod \"nova-scheduler-0\" (UID: \"d3fdad05-7998-4fe3-a774-61cdaa01e27f\") " pod="openstack/nova-scheduler-0"
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.509740 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wprk\" (UniqueName: \"kubernetes.io/projected/d3fdad05-7998-4fe3-a774-61cdaa01e27f-kube-api-access-8wprk\") pod \"nova-scheduler-0\" (UID: \"d3fdad05-7998-4fe3-a774-61cdaa01e27f\") " pod="openstack/nova-scheduler-0"
Mar 13 09:34:52 crc kubenswrapper[4841]: I0313 09:34:52.600929 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 13 09:34:53 crc kubenswrapper[4841]: I0313 09:34:53.106359 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 09:34:53 crc kubenswrapper[4841]: I0313 09:34:53.171426 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d3fdad05-7998-4fe3-a774-61cdaa01e27f","Type":"ContainerStarted","Data":"a46d95cc1b8bc09abdcfe9671a941eda7eaeea8ea521b551e7c1554cd88330a7"}
Mar 13 09:34:54 crc kubenswrapper[4841]: I0313 09:34:54.014131 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="903d509d-35b9-43f6-9e55-0da3e0d628cb" path="/var/lib/kubelet/pods/903d509d-35b9-43f6-9e55-0da3e0d628cb/volumes"
Mar 13 09:34:54 crc kubenswrapper[4841]: I0313 09:34:54.199449 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d3fdad05-7998-4fe3-a774-61cdaa01e27f","Type":"ContainerStarted","Data":"1c6af7ab5cac5a744472212e081ca567e42e80a6b6ce2c77f0609f02f6b8343d"}
Mar 13 09:34:54 crc kubenswrapper[4841]: I0313 09:34:54.225374 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.225351871 podStartE2EDuration="2.225351871s" podCreationTimestamp="2026-03-13 09:34:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:34:54.220127808 +0000 UTC m=+1376.950028009" watchObservedRunningTime="2026-03-13 09:34:54.225351871 +0000 UTC m=+1376.955252072"
Mar 13 09:34:55 crc kubenswrapper[4841]: I0313 09:34:55.820905 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 13 09:34:55 crc kubenswrapper[4841]: I0313 09:34:55.821208 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 13 09:34:57 crc kubenswrapper[4841]: I0313 09:34:57.602188 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 13 09:34:57 crc kubenswrapper[4841]: I0313 09:34:57.763343 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 13 09:34:57 crc kubenswrapper[4841]: I0313 09:34:57.763766 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 13 09:34:58 crc kubenswrapper[4841]: I0313 09:34:58.778413 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7666c55c-f424-4bfd-a143-e768e534b721" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.222:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 09:34:58 crc kubenswrapper[4841]: I0313 09:34:58.778465 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7666c55c-f424-4bfd-a143-e768e534b721" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.222:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 09:34:59 crc kubenswrapper[4841]: I0313 09:34:59.980019 4841 scope.go:117] "RemoveContainer" containerID="b93289622c1c831313f1861efcd99f106369f240a30eeb9fc573964c93ecc6b3"
Mar 13 09:35:00 crc kubenswrapper[4841]: I0313 09:35:00.821459 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 13 09:35:00 crc kubenswrapper[4841]: I0313 09:35:00.822317 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 13 09:35:01 crc kubenswrapper[4841]: I0313 09:35:01.837683 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ede0b909-963d-4110-8d86-b09095cbd08c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.223:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 09:35:01 crc kubenswrapper[4841]: I0313 09:35:01.837744 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ede0b909-963d-4110-8d86-b09095cbd08c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.223:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 09:35:02 crc kubenswrapper[4841]: I0313 09:35:02.601564 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 13 09:35:02 crc kubenswrapper[4841]: I0313 09:35:02.626599 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 13 09:35:03 crc kubenswrapper[4841]: I0313 09:35:03.369319 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 13 09:35:04 crc kubenswrapper[4841]: I0313 09:35:04.407565 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 09:35:04 crc kubenswrapper[4841]: I0313 09:35:04.407643 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 09:35:07 crc kubenswrapper[4841]: I0313 09:35:07.774946 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 13 09:35:07 crc kubenswrapper[4841]: I0313 09:35:07.775980 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 13 09:35:07 crc kubenswrapper[4841]: I0313 09:35:07.776690 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 13 09:35:07 crc kubenswrapper[4841]: I0313 09:35:07.784571 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 13 09:35:08 crc kubenswrapper[4841]: I0313 09:35:08.391542 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 13 09:35:08 crc kubenswrapper[4841]: I0313 09:35:08.410255 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 13 09:35:10 crc kubenswrapper[4841]: I0313 09:35:10.826482 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 13 09:35:10 crc kubenswrapper[4841]: I0313 09:35:10.830355 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 13 09:35:10 crc kubenswrapper[4841]: I0313 09:35:10.832434 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 13 09:35:11 crc kubenswrapper[4841]: I0313 09:35:11.424916 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 13 09:35:12 crc kubenswrapper[4841]: I0313 09:35:12.393881 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 13 09:35:23 crc kubenswrapper[4841]: I0313 09:35:23.455797 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 13 09:35:24 crc kubenswrapper[4841]: I0313 09:35:24.458450 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 13 09:35:27 crc kubenswrapper[4841]: I0313 09:35:27.923586 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="6f270332-4a01-403b-8c06-0f8c0bff6527" containerName="rabbitmq" containerID="cri-o://8eb6ff886c84daefc19ad53048a04087959845a3c06e98ed0685d2cf5ed764ab" gracePeriod=604796
Mar 13 09:35:28 crc kubenswrapper[4841]: I0313 09:35:28.772816 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="ea6882c8-841d-4ca7-90a9-3d16c4303a58" containerName="rabbitmq" containerID="cri-o://d484f76415e52edb8485e15e2d614b7c52bd0a121cded2bea9a89604a513cb21" gracePeriod=604796
Mar 13 09:35:30 crc kubenswrapper[4841]: I0313 09:35:30.562341 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="ea6882c8-841d-4ca7-90a9-3d16c4303a58" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused"
Mar 13 09:35:30 crc kubenswrapper[4841]: I0313 09:35:30.821950 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="6f270332-4a01-403b-8c06-0f8c0bff6527" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused"
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.407508 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.408157 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.408219 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h227v"
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.409153 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"775ebab3cf7b982d36c777cc0cdaea2069ca71dd3ee3f41b99a1b2505417aae0"} pod="openshift-machine-config-operator/machine-config-daemon-h227v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.409210 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" containerID="cri-o://775ebab3cf7b982d36c777cc0cdaea2069ca71dd3ee3f41b99a1b2505417aae0" gracePeriod=600
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.707408 4841 generic.go:334] "Generic (PLEG): container finished" podID="6f270332-4a01-403b-8c06-0f8c0bff6527" containerID="8eb6ff886c84daefc19ad53048a04087959845a3c06e98ed0685d2cf5ed764ab" exitCode=0
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.707476 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6f270332-4a01-403b-8c06-0f8c0bff6527","Type":"ContainerDied","Data":"8eb6ff886c84daefc19ad53048a04087959845a3c06e98ed0685d2cf5ed764ab"}
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.707759 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6f270332-4a01-403b-8c06-0f8c0bff6527","Type":"ContainerDied","Data":"1190f6327df593c8b43d5c515c3a5002bbe6ab8ca9a6f88326c69354f0aefef9"}
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.707777 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1190f6327df593c8b43d5c515c3a5002bbe6ab8ca9a6f88326c69354f0aefef9"
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.710342 4841 generic.go:334] "Generic (PLEG): container finished" podID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerID="775ebab3cf7b982d36c777cc0cdaea2069ca71dd3ee3f41b99a1b2505417aae0" exitCode=0
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.710384 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerDied","Data":"775ebab3cf7b982d36c777cc0cdaea2069ca71dd3ee3f41b99a1b2505417aae0"}
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.710424 4841 scope.go:117] "RemoveContainer" containerID="8dc018ca0f90a95ed2ddf32ba76ace2f8d1b621b17b9ee14fcc045e6a5af19f7"
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.737904 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.890052 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-tls\") pod \"6f270332-4a01-403b-8c06-0f8c0bff6527\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") "
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.890100 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f270332-4a01-403b-8c06-0f8c0bff6527-server-conf\") pod \"6f270332-4a01-403b-8c06-0f8c0bff6527\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") "
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.890157 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f270332-4a01-403b-8c06-0f8c0bff6527-plugins-conf\") pod \"6f270332-4a01-403b-8c06-0f8c0bff6527\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") "
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.890193 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f270332-4a01-403b-8c06-0f8c0bff6527-config-data\") pod \"6f270332-4a01-403b-8c06-0f8c0bff6527\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") "
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.890305 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f270332-4a01-403b-8c06-0f8c0bff6527-pod-info\") pod \"6f270332-4a01-403b-8c06-0f8c0bff6527\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") "
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.891000 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-confd\") pod \"6f270332-4a01-403b-8c06-0f8c0bff6527\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") "
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.891036 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-plugins\") pod \"6f270332-4a01-403b-8c06-0f8c0bff6527\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") "
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.891124 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26lqr\" (UniqueName: \"kubernetes.io/projected/6f270332-4a01-403b-8c06-0f8c0bff6527-kube-api-access-26lqr\") pod \"6f270332-4a01-403b-8c06-0f8c0bff6527\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") "
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.891145 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-erlang-cookie\") pod \"6f270332-4a01-403b-8c06-0f8c0bff6527\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") "
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.891166 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f270332-4a01-403b-8c06-0f8c0bff6527-erlang-cookie-secret\") pod \"6f270332-4a01-403b-8c06-0f8c0bff6527\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") "
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.891235 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"6f270332-4a01-403b-8c06-0f8c0bff6527\" (UID: \"6f270332-4a01-403b-8c06-0f8c0bff6527\") "
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.890861 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f270332-4a01-403b-8c06-0f8c0bff6527-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6f270332-4a01-403b-8c06-0f8c0bff6527" (UID: "6f270332-4a01-403b-8c06-0f8c0bff6527"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.891410 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6f270332-4a01-403b-8c06-0f8c0bff6527" (UID: "6f270332-4a01-403b-8c06-0f8c0bff6527"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.891746 4841 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f270332-4a01-403b-8c06-0f8c0bff6527-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.891764 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.891787 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6f270332-4a01-403b-8c06-0f8c0bff6527" (UID: "6f270332-4a01-403b-8c06-0f8c0bff6527"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.895882 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "6f270332-4a01-403b-8c06-0f8c0bff6527" (UID: "6f270332-4a01-403b-8c06-0f8c0bff6527"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.898430 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f270332-4a01-403b-8c06-0f8c0bff6527-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6f270332-4a01-403b-8c06-0f8c0bff6527" (UID: "6f270332-4a01-403b-8c06-0f8c0bff6527"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.898474 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f270332-4a01-403b-8c06-0f8c0bff6527-kube-api-access-26lqr" (OuterVolumeSpecName: "kube-api-access-26lqr") pod "6f270332-4a01-403b-8c06-0f8c0bff6527" (UID: "6f270332-4a01-403b-8c06-0f8c0bff6527"). InnerVolumeSpecName "kube-api-access-26lqr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.898640 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6f270332-4a01-403b-8c06-0f8c0bff6527-pod-info" (OuterVolumeSpecName: "pod-info") pod "6f270332-4a01-403b-8c06-0f8c0bff6527" (UID: "6f270332-4a01-403b-8c06-0f8c0bff6527"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.902487 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6f270332-4a01-403b-8c06-0f8c0bff6527" (UID: "6f270332-4a01-403b-8c06-0f8c0bff6527"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.928939 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f270332-4a01-403b-8c06-0f8c0bff6527-config-data" (OuterVolumeSpecName: "config-data") pod "6f270332-4a01-403b-8c06-0f8c0bff6527" (UID: "6f270332-4a01-403b-8c06-0f8c0bff6527"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.987828 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f270332-4a01-403b-8c06-0f8c0bff6527-server-conf" (OuterVolumeSpecName: "server-conf") pod "6f270332-4a01-403b-8c06-0f8c0bff6527" (UID: "6f270332-4a01-403b-8c06-0f8c0bff6527"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.993345 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.993461 4841 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f270332-4a01-403b-8c06-0f8c0bff6527-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.993550 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.993605 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.993654 4841 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f270332-4a01-403b-8c06-0f8c0bff6527-server-conf\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.993702 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f270332-4a01-403b-8c06-0f8c0bff6527-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.993758 4841 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f270332-4a01-403b-8c06-0f8c0bff6527-pod-info\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:34 crc kubenswrapper[4841]: I0313 09:35:34.993808 4841 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26lqr\" (UniqueName: \"kubernetes.io/projected/6f270332-4a01-403b-8c06-0f8c0bff6527-kube-api-access-26lqr\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.012936 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.037832 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6f270332-4a01-403b-8c06-0f8c0bff6527" (UID: "6f270332-4a01-403b-8c06-0f8c0bff6527"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.138690 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f270332-4a01-403b-8c06-0f8c0bff6527-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.138902 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.383749 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.547254 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-tls\") pod \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.547359 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-plugins\") pod \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.547416 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ea6882c8-841d-4ca7-90a9-3d16c4303a58-erlang-cookie-secret\") pod \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.547452 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jksrs\" (UniqueName: \"kubernetes.io/projected/ea6882c8-841d-4ca7-90a9-3d16c4303a58-kube-api-access-jksrs\") pod \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.547485 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-erlang-cookie\") pod \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.547513 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-confd\") pod \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.547537 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ea6882c8-841d-4ca7-90a9-3d16c4303a58-server-conf\") pod \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.547563 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.547586 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ea6882c8-841d-4ca7-90a9-3d16c4303a58-plugins-conf\") pod \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.547654 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ea6882c8-841d-4ca7-90a9-3d16c4303a58-pod-info\") pod \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.547668 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea6882c8-841d-4ca7-90a9-3d16c4303a58-config-data\") pod \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\" (UID: \"ea6882c8-841d-4ca7-90a9-3d16c4303a58\") " Mar 13 09:35:35 crc 
kubenswrapper[4841]: I0313 09:35:35.548053 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ea6882c8-841d-4ca7-90a9-3d16c4303a58" (UID: "ea6882c8-841d-4ca7-90a9-3d16c4303a58"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.548496 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ea6882c8-841d-4ca7-90a9-3d16c4303a58" (UID: "ea6882c8-841d-4ca7-90a9-3d16c4303a58"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.548801 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea6882c8-841d-4ca7-90a9-3d16c4303a58-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ea6882c8-841d-4ca7-90a9-3d16c4303a58" (UID: "ea6882c8-841d-4ca7-90a9-3d16c4303a58"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.555520 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ea6882c8-841d-4ca7-90a9-3d16c4303a58" (UID: "ea6882c8-841d-4ca7-90a9-3d16c4303a58"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.555662 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "ea6882c8-841d-4ca7-90a9-3d16c4303a58" (UID: "ea6882c8-841d-4ca7-90a9-3d16c4303a58"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.560455 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea6882c8-841d-4ca7-90a9-3d16c4303a58-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ea6882c8-841d-4ca7-90a9-3d16c4303a58" (UID: "ea6882c8-841d-4ca7-90a9-3d16c4303a58"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.560482 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea6882c8-841d-4ca7-90a9-3d16c4303a58-kube-api-access-jksrs" (OuterVolumeSpecName: "kube-api-access-jksrs") pod "ea6882c8-841d-4ca7-90a9-3d16c4303a58" (UID: "ea6882c8-841d-4ca7-90a9-3d16c4303a58"). InnerVolumeSpecName "kube-api-access-jksrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.592382 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ea6882c8-841d-4ca7-90a9-3d16c4303a58-pod-info" (OuterVolumeSpecName: "pod-info") pod "ea6882c8-841d-4ca7-90a9-3d16c4303a58" (UID: "ea6882c8-841d-4ca7-90a9-3d16c4303a58"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.609957 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea6882c8-841d-4ca7-90a9-3d16c4303a58-config-data" (OuterVolumeSpecName: "config-data") pod "ea6882c8-841d-4ca7-90a9-3d16c4303a58" (UID: "ea6882c8-841d-4ca7-90a9-3d16c4303a58"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.643909 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea6882c8-841d-4ca7-90a9-3d16c4303a58-server-conf" (OuterVolumeSpecName: "server-conf") pod "ea6882c8-841d-4ca7-90a9-3d16c4303a58" (UID: "ea6882c8-841d-4ca7-90a9-3d16c4303a58"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.649540 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.649578 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.649589 4841 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ea6882c8-841d-4ca7-90a9-3d16c4303a58-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.649602 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jksrs\" (UniqueName: \"kubernetes.io/projected/ea6882c8-841d-4ca7-90a9-3d16c4303a58-kube-api-access-jksrs\") on node 
\"crc\" DevicePath \"\"" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.649615 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.649626 4841 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ea6882c8-841d-4ca7-90a9-3d16c4303a58-server-conf\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.649656 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.649666 4841 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ea6882c8-841d-4ca7-90a9-3d16c4303a58-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.649675 4841 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ea6882c8-841d-4ca7-90a9-3d16c4303a58-pod-info\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.649684 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea6882c8-841d-4ca7-90a9-3d16c4303a58-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.717873 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.749815 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerStarted","Data":"61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab"} Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.751936 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.763414 4841 generic.go:334] "Generic (PLEG): container finished" podID="ea6882c8-841d-4ca7-90a9-3d16c4303a58" containerID="d484f76415e52edb8485e15e2d614b7c52bd0a121cded2bea9a89604a513cb21" exitCode=0 Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.763502 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.765547 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.766078 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ea6882c8-841d-4ca7-90a9-3d16c4303a58","Type":"ContainerDied","Data":"d484f76415e52edb8485e15e2d614b7c52bd0a121cded2bea9a89604a513cb21"} Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.766120 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ea6882c8-841d-4ca7-90a9-3d16c4303a58","Type":"ContainerDied","Data":"a8bf9f4a1d7c5ce6f2b2f79889b5a705194970fffad32e026ce632065f9bc6f3"} Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.766151 4841 scope.go:117] "RemoveContainer" containerID="d484f76415e52edb8485e15e2d614b7c52bd0a121cded2bea9a89604a513cb21" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.786321 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ea6882c8-841d-4ca7-90a9-3d16c4303a58" (UID: "ea6882c8-841d-4ca7-90a9-3d16c4303a58"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.810032 4841 scope.go:117] "RemoveContainer" containerID="723c75ed37f7f534e4700be3916499abdbad4a5a30193aa4fdacace8cd9cf2e3" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.832324 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.843086 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.854574 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ea6882c8-841d-4ca7-90a9-3d16c4303a58-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.860710 4841 scope.go:117] "RemoveContainer" containerID="d484f76415e52edb8485e15e2d614b7c52bd0a121cded2bea9a89604a513cb21" Mar 13 09:35:35 crc kubenswrapper[4841]: E0313 09:35:35.861921 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d484f76415e52edb8485e15e2d614b7c52bd0a121cded2bea9a89604a513cb21\": container with ID starting with d484f76415e52edb8485e15e2d614b7c52bd0a121cded2bea9a89604a513cb21 not found: ID does not exist" containerID="d484f76415e52edb8485e15e2d614b7c52bd0a121cded2bea9a89604a513cb21" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.861948 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d484f76415e52edb8485e15e2d614b7c52bd0a121cded2bea9a89604a513cb21"} err="failed to get container status \"d484f76415e52edb8485e15e2d614b7c52bd0a121cded2bea9a89604a513cb21\": rpc error: code = NotFound desc = could not find container \"d484f76415e52edb8485e15e2d614b7c52bd0a121cded2bea9a89604a513cb21\": container with ID starting with 
d484f76415e52edb8485e15e2d614b7c52bd0a121cded2bea9a89604a513cb21 not found: ID does not exist" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.861969 4841 scope.go:117] "RemoveContainer" containerID="723c75ed37f7f534e4700be3916499abdbad4a5a30193aa4fdacace8cd9cf2e3" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.863008 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 09:35:35 crc kubenswrapper[4841]: E0313 09:35:35.865385 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f270332-4a01-403b-8c06-0f8c0bff6527" containerName="rabbitmq" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.865410 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f270332-4a01-403b-8c06-0f8c0bff6527" containerName="rabbitmq" Mar 13 09:35:35 crc kubenswrapper[4841]: E0313 09:35:35.865439 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6882c8-841d-4ca7-90a9-3d16c4303a58" containerName="setup-container" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.865445 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6882c8-841d-4ca7-90a9-3d16c4303a58" containerName="setup-container" Mar 13 09:35:35 crc kubenswrapper[4841]: E0313 09:35:35.865459 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f270332-4a01-403b-8c06-0f8c0bff6527" containerName="setup-container" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.865465 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f270332-4a01-403b-8c06-0f8c0bff6527" containerName="setup-container" Mar 13 09:35:35 crc kubenswrapper[4841]: E0313 09:35:35.865479 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6882c8-841d-4ca7-90a9-3d16c4303a58" containerName="rabbitmq" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.865485 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6882c8-841d-4ca7-90a9-3d16c4303a58" containerName="rabbitmq" Mar 13 09:35:35 
crc kubenswrapper[4841]: E0313 09:35:35.865605 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"723c75ed37f7f534e4700be3916499abdbad4a5a30193aa4fdacace8cd9cf2e3\": container with ID starting with 723c75ed37f7f534e4700be3916499abdbad4a5a30193aa4fdacace8cd9cf2e3 not found: ID does not exist" containerID="723c75ed37f7f534e4700be3916499abdbad4a5a30193aa4fdacace8cd9cf2e3" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.865644 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"723c75ed37f7f534e4700be3916499abdbad4a5a30193aa4fdacace8cd9cf2e3"} err="failed to get container status \"723c75ed37f7f534e4700be3916499abdbad4a5a30193aa4fdacace8cd9cf2e3\": rpc error: code = NotFound desc = could not find container \"723c75ed37f7f534e4700be3916499abdbad4a5a30193aa4fdacace8cd9cf2e3\": container with ID starting with 723c75ed37f7f534e4700be3916499abdbad4a5a30193aa4fdacace8cd9cf2e3 not found: ID does not exist" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.865702 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f270332-4a01-403b-8c06-0f8c0bff6527" containerName="rabbitmq" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.865725 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea6882c8-841d-4ca7-90a9-3d16c4303a58" containerName="rabbitmq" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.866643 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.868751 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.868930 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.869492 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.869656 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-l6rdh" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.869811 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.869940 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.870161 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.901924 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.955643 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.955703 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.955934 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.956020 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-config-data\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.956092 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.956185 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.956237 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfbcj\" (UniqueName: 
\"kubernetes.io/projected/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-kube-api-access-mfbcj\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.956360 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.956463 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.956521 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:35 crc kubenswrapper[4841]: I0313 09:35:35.956570 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.006539 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f270332-4a01-403b-8c06-0f8c0bff6527" path="/var/lib/kubelet/pods/6f270332-4a01-403b-8c06-0f8c0bff6527/volumes" Mar 13 09:35:36 crc 
kubenswrapper[4841]: I0313 09:35:36.058361 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.058429 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.058454 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.058488 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.058539 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.058574 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.058615 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.058640 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-config-data\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.058657 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.058683 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.058701 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfbcj\" (UniqueName: \"kubernetes.io/projected/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-kube-api-access-mfbcj\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " 
pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.059097 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.059915 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.060171 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.060253 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-config-data\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.060616 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.060708 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.064865 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.065660 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.065990 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.067357 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.085468 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfbcj\" (UniqueName: \"kubernetes.io/projected/b5efe6ff-d5eb-4fa9-9496-1838d05f625a-kube-api-access-mfbcj\") pod \"rabbitmq-server-0\" (UID: 
\"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.098114 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.105298 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.119283 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"b5efe6ff-d5eb-4fa9-9496-1838d05f625a\") " pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.140425 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.143350 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.147044 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.147255 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-s4lsp" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.147443 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.147585 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.147698 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 
09:35:36.147818 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.147996 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.167486 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.191484 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.263740 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.263790 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8lzz\" (UniqueName: \"kubernetes.io/projected/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-kube-api-access-p8lzz\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.263851 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.263874 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.263892 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.263921 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.263945 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.263992 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.264035 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.264059 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.264087 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.365442 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.365737 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.365764 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.365796 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.365839 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.365858 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8lzz\" (UniqueName: \"kubernetes.io/projected/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-kube-api-access-p8lzz\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.365905 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.365924 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.365939 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.365964 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.365985 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.366299 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.366480 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.367986 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.368728 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.369076 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.370812 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.371475 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.372686 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.376841 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.384388 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8lzz\" (UniqueName: \"kubernetes.io/projected/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-kube-api-access-p8lzz\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.402987 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/469aec79-a7a3-4ae1-b00a-94f47a6d4df9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.407434 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"469aec79-a7a3-4ae1-b00a-94f47a6d4df9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.462811 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.701797 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.787813 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b5efe6ff-d5eb-4fa9-9496-1838d05f625a","Type":"ContainerStarted","Data":"5941e73e0d35047131e99ec8adc30c2872e393ec6df68cdc06a48740dd06c765"} Mar 13 09:35:36 crc kubenswrapper[4841]: W0313 09:35:36.884896 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod469aec79_a7a3_4ae1_b00a_94f47a6d4df9.slice/crio-fd666382841ee6d7c8e8131617ca898d53d004fc6955a061c6d808bb51a0ffad WatchSource:0}: Error finding container fd666382841ee6d7c8e8131617ca898d53d004fc6955a061c6d808bb51a0ffad: Status 404 returned error can't find the container with id fd666382841ee6d7c8e8131617ca898d53d004fc6955a061c6d808bb51a0ffad Mar 13 09:35:36 crc kubenswrapper[4841]: I0313 09:35:36.886858 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 09:35:37 crc kubenswrapper[4841]: I0313 09:35:37.797362 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"469aec79-a7a3-4ae1-b00a-94f47a6d4df9","Type":"ContainerStarted","Data":"fd666382841ee6d7c8e8131617ca898d53d004fc6955a061c6d808bb51a0ffad"} Mar 13 09:35:38 crc kubenswrapper[4841]: I0313 09:35:38.012841 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea6882c8-841d-4ca7-90a9-3d16c4303a58" path="/var/lib/kubelet/pods/ea6882c8-841d-4ca7-90a9-3d16c4303a58/volumes" Mar 13 09:35:38 crc kubenswrapper[4841]: I0313 09:35:38.810905 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"b5efe6ff-d5eb-4fa9-9496-1838d05f625a","Type":"ContainerStarted","Data":"ab1859a05716e393a13502a6db4e331fd2f27aecab80e3ed29795da8eebbfbec"} Mar 13 09:35:38 crc kubenswrapper[4841]: I0313 09:35:38.815561 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"469aec79-a7a3-4ae1-b00a-94f47a6d4df9","Type":"ContainerStarted","Data":"3be35f73422e347e4521efbc6546a1e15efdd364bef88204fa8b32a4ab56c400"} Mar 13 09:35:40 crc kubenswrapper[4841]: I0313 09:35:40.958574 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-wzvkv"] Mar 13 09:35:40 crc kubenswrapper[4841]: I0313 09:35:40.961258 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:40 crc kubenswrapper[4841]: I0313 09:35:40.963719 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 13 09:35:40 crc kubenswrapper[4841]: I0313 09:35:40.969964 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-wzvkv"] Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.059291 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44phh\" (UniqueName: \"kubernetes.io/projected/d622a2f4-f5a6-4999-9395-329efdeabb2d-kube-api-access-44phh\") pod \"dnsmasq-dns-7d84b4d45c-wzvkv\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.059350 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-wzvkv\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:41 crc 
kubenswrapper[4841]: I0313 09:35:41.059392 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-wzvkv\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.059668 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-wzvkv\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.059839 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-config\") pod \"dnsmasq-dns-7d84b4d45c-wzvkv\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.059984 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-wzvkv\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.060181 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-wzvkv\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " 
pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.161680 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-wzvkv\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.162056 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44phh\" (UniqueName: \"kubernetes.io/projected/d622a2f4-f5a6-4999-9395-329efdeabb2d-kube-api-access-44phh\") pod \"dnsmasq-dns-7d84b4d45c-wzvkv\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.162095 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-wzvkv\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.162136 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-wzvkv\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.162192 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-wzvkv\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" 
Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.162231 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-config\") pod \"dnsmasq-dns-7d84b4d45c-wzvkv\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.162316 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-wzvkv\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.163447 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-wzvkv\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.163487 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-wzvkv\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.163561 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-wzvkv\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.163579 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-config\") pod \"dnsmasq-dns-7d84b4d45c-wzvkv\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.164036 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-wzvkv\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.164093 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-wzvkv\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.182558 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44phh\" (UniqueName: \"kubernetes.io/projected/d622a2f4-f5a6-4999-9395-329efdeabb2d-kube-api-access-44phh\") pod \"dnsmasq-dns-7d84b4d45c-wzvkv\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.301220 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:41 crc kubenswrapper[4841]: W0313 09:35:41.789234 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd622a2f4_f5a6_4999_9395_329efdeabb2d.slice/crio-253f0c157b4085ba092af452101f228d3584f712c5d58d0a980ec3fdd9c4266b WatchSource:0}: Error finding container 253f0c157b4085ba092af452101f228d3584f712c5d58d0a980ec3fdd9c4266b: Status 404 returned error can't find the container with id 253f0c157b4085ba092af452101f228d3584f712c5d58d0a980ec3fdd9c4266b Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.791513 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-wzvkv"] Mar 13 09:35:41 crc kubenswrapper[4841]: I0313 09:35:41.855642 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" event={"ID":"d622a2f4-f5a6-4999-9395-329efdeabb2d","Type":"ContainerStarted","Data":"253f0c157b4085ba092af452101f228d3584f712c5d58d0a980ec3fdd9c4266b"} Mar 13 09:35:42 crc kubenswrapper[4841]: I0313 09:35:42.868875 4841 generic.go:334] "Generic (PLEG): container finished" podID="d622a2f4-f5a6-4999-9395-329efdeabb2d" containerID="008416c013d24f9cecd11dcbc34bb2d7aef651fc5fadd9258ba36a9599104389" exitCode=0 Mar 13 09:35:42 crc kubenswrapper[4841]: I0313 09:35:42.868924 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" event={"ID":"d622a2f4-f5a6-4999-9395-329efdeabb2d","Type":"ContainerDied","Data":"008416c013d24f9cecd11dcbc34bb2d7aef651fc5fadd9258ba36a9599104389"} Mar 13 09:35:43 crc kubenswrapper[4841]: I0313 09:35:43.883654 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" event={"ID":"d622a2f4-f5a6-4999-9395-329efdeabb2d","Type":"ContainerStarted","Data":"7385fda1e7040574f56bf3d6013ddb93e411c159cd890703c7d93ceccc0d7307"} Mar 13 09:35:43 crc 
kubenswrapper[4841]: I0313 09:35:43.884167 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:43 crc kubenswrapper[4841]: I0313 09:35:43.914761 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" podStartSLOduration=3.914729223 podStartE2EDuration="3.914729223s" podCreationTimestamp="2026-03-13 09:35:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:35:43.90534055 +0000 UTC m=+1426.635240741" watchObservedRunningTime="2026-03-13 09:35:43.914729223 +0000 UTC m=+1426.644629414" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.303343 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.383588 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7"] Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.383884 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" podUID="f0ee1a9a-154c-4c27-b964-94a9a13761e6" containerName="dnsmasq-dns" containerID="cri-o://02891e9fbf9cbcc1dc7c603e4f651eaa523df9e408523cfa4d91084e62782e9e" gracePeriod=10 Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.548412 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-6rf6l"] Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.552955 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.592870 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-6rf6l"] Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.645540 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c632011-0a35-4eaa-a7f5-8d86466858ca-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-6rf6l\" (UID: \"2c632011-0a35-4eaa-a7f5-8d86466858ca\") " pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.645778 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c632011-0a35-4eaa-a7f5-8d86466858ca-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-6rf6l\" (UID: \"2c632011-0a35-4eaa-a7f5-8d86466858ca\") " pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.646056 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bn7q\" (UniqueName: \"kubernetes.io/projected/2c632011-0a35-4eaa-a7f5-8d86466858ca-kube-api-access-7bn7q\") pod \"dnsmasq-dns-6f6df4f56c-6rf6l\" (UID: \"2c632011-0a35-4eaa-a7f5-8d86466858ca\") " pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.646080 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c632011-0a35-4eaa-a7f5-8d86466858ca-config\") pod \"dnsmasq-dns-6f6df4f56c-6rf6l\" (UID: \"2c632011-0a35-4eaa-a7f5-8d86466858ca\") " pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.646142 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2c632011-0a35-4eaa-a7f5-8d86466858ca-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-6rf6l\" (UID: \"2c632011-0a35-4eaa-a7f5-8d86466858ca\") " pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.646184 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c632011-0a35-4eaa-a7f5-8d86466858ca-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-6rf6l\" (UID: \"2c632011-0a35-4eaa-a7f5-8d86466858ca\") " pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.646326 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c632011-0a35-4eaa-a7f5-8d86466858ca-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-6rf6l\" (UID: \"2c632011-0a35-4eaa-a7f5-8d86466858ca\") " pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.747868 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c632011-0a35-4eaa-a7f5-8d86466858ca-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-6rf6l\" (UID: \"2c632011-0a35-4eaa-a7f5-8d86466858ca\") " pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.747967 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c632011-0a35-4eaa-a7f5-8d86466858ca-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-6rf6l\" (UID: \"2c632011-0a35-4eaa-a7f5-8d86466858ca\") " pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.748032 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c632011-0a35-4eaa-a7f5-8d86466858ca-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-6rf6l\" (UID: \"2c632011-0a35-4eaa-a7f5-8d86466858ca\") " pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.748123 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bn7q\" (UniqueName: \"kubernetes.io/projected/2c632011-0a35-4eaa-a7f5-8d86466858ca-kube-api-access-7bn7q\") pod \"dnsmasq-dns-6f6df4f56c-6rf6l\" (UID: \"2c632011-0a35-4eaa-a7f5-8d86466858ca\") " pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.748154 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c632011-0a35-4eaa-a7f5-8d86466858ca-config\") pod \"dnsmasq-dns-6f6df4f56c-6rf6l\" (UID: \"2c632011-0a35-4eaa-a7f5-8d86466858ca\") " pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.748182 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2c632011-0a35-4eaa-a7f5-8d86466858ca-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-6rf6l\" (UID: \"2c632011-0a35-4eaa-a7f5-8d86466858ca\") " pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.748210 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c632011-0a35-4eaa-a7f5-8d86466858ca-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-6rf6l\" (UID: \"2c632011-0a35-4eaa-a7f5-8d86466858ca\") " pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.749069 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c632011-0a35-4eaa-a7f5-8d86466858ca-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-6rf6l\" (UID: \"2c632011-0a35-4eaa-a7f5-8d86466858ca\") " pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.749303 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c632011-0a35-4eaa-a7f5-8d86466858ca-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-6rf6l\" (UID: \"2c632011-0a35-4eaa-a7f5-8d86466858ca\") " pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.749406 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c632011-0a35-4eaa-a7f5-8d86466858ca-config\") pod \"dnsmasq-dns-6f6df4f56c-6rf6l\" (UID: \"2c632011-0a35-4eaa-a7f5-8d86466858ca\") " pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.749448 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c632011-0a35-4eaa-a7f5-8d86466858ca-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-6rf6l\" (UID: \"2c632011-0a35-4eaa-a7f5-8d86466858ca\") " pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.749461 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c632011-0a35-4eaa-a7f5-8d86466858ca-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-6rf6l\" (UID: \"2c632011-0a35-4eaa-a7f5-8d86466858ca\") " pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.749664 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/2c632011-0a35-4eaa-a7f5-8d86466858ca-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-6rf6l\" (UID: \"2c632011-0a35-4eaa-a7f5-8d86466858ca\") " pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.770043 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bn7q\" (UniqueName: \"kubernetes.io/projected/2c632011-0a35-4eaa-a7f5-8d86466858ca-kube-api-access-7bn7q\") pod \"dnsmasq-dns-6f6df4f56c-6rf6l\" (UID: \"2c632011-0a35-4eaa-a7f5-8d86466858ca\") " pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.876738 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.896191 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.950945 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-dns-svc\") pod \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.951212 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-ovsdbserver-sb\") pod \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.951354 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bnmq\" (UniqueName: \"kubernetes.io/projected/f0ee1a9a-154c-4c27-b964-94a9a13761e6-kube-api-access-8bnmq\") pod \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\" 
(UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.951396 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-config\") pod \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.951451 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-ovsdbserver-nb\") pod \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.951514 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-dns-swift-storage-0\") pod \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\" (UID: \"f0ee1a9a-154c-4c27-b964-94a9a13761e6\") " Mar 13 09:35:51 crc kubenswrapper[4841]: I0313 09:35:51.957779 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0ee1a9a-154c-4c27-b964-94a9a13761e6-kube-api-access-8bnmq" (OuterVolumeSpecName: "kube-api-access-8bnmq") pod "f0ee1a9a-154c-4c27-b964-94a9a13761e6" (UID: "f0ee1a9a-154c-4c27-b964-94a9a13761e6"). InnerVolumeSpecName "kube-api-access-8bnmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.016203 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f0ee1a9a-154c-4c27-b964-94a9a13761e6" (UID: "f0ee1a9a-154c-4c27-b964-94a9a13761e6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.018668 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f0ee1a9a-154c-4c27-b964-94a9a13761e6" (UID: "f0ee1a9a-154c-4c27-b964-94a9a13761e6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.028750 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f0ee1a9a-154c-4c27-b964-94a9a13761e6" (UID: "f0ee1a9a-154c-4c27-b964-94a9a13761e6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.029193 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f0ee1a9a-154c-4c27-b964-94a9a13761e6" (UID: "f0ee1a9a-154c-4c27-b964-94a9a13761e6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.054626 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.054658 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.054667 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.054676 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bnmq\" (UniqueName: \"kubernetes.io/projected/f0ee1a9a-154c-4c27-b964-94a9a13761e6-kube-api-access-8bnmq\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.054686 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.063340 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-config" (OuterVolumeSpecName: "config") pod "f0ee1a9a-154c-4c27-b964-94a9a13761e6" (UID: "f0ee1a9a-154c-4c27-b964-94a9a13761e6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.107367 4841 generic.go:334] "Generic (PLEG): container finished" podID="f0ee1a9a-154c-4c27-b964-94a9a13761e6" containerID="02891e9fbf9cbcc1dc7c603e4f651eaa523df9e408523cfa4d91084e62782e9e" exitCode=0 Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.107414 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" event={"ID":"f0ee1a9a-154c-4c27-b964-94a9a13761e6","Type":"ContainerDied","Data":"02891e9fbf9cbcc1dc7c603e4f651eaa523df9e408523cfa4d91084e62782e9e"} Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.107442 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" event={"ID":"f0ee1a9a-154c-4c27-b964-94a9a13761e6","Type":"ContainerDied","Data":"9d7ce585160cd440cbabcbe12d43876f016a0da5ee659a031293e9a2e5298dc3"} Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.107459 4841 scope.go:117] "RemoveContainer" containerID="02891e9fbf9cbcc1dc7c603e4f651eaa523df9e408523cfa4d91084e62782e9e" Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.107595 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7" Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.147075 4841 scope.go:117] "RemoveContainer" containerID="ed3d5bc50518252c2ee01df8344e1617eff8e0bb9a7ec2dd76286af196a5d4c8" Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.160503 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0ee1a9a-154c-4c27-b964-94a9a13761e6-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.166391 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7"] Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.177289 4841 scope.go:117] "RemoveContainer" containerID="02891e9fbf9cbcc1dc7c603e4f651eaa523df9e408523cfa4d91084e62782e9e" Mar 13 09:35:52 crc kubenswrapper[4841]: E0313 09:35:52.177683 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02891e9fbf9cbcc1dc7c603e4f651eaa523df9e408523cfa4d91084e62782e9e\": container with ID starting with 02891e9fbf9cbcc1dc7c603e4f651eaa523df9e408523cfa4d91084e62782e9e not found: ID does not exist" containerID="02891e9fbf9cbcc1dc7c603e4f651eaa523df9e408523cfa4d91084e62782e9e" Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.177710 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02891e9fbf9cbcc1dc7c603e4f651eaa523df9e408523cfa4d91084e62782e9e"} err="failed to get container status \"02891e9fbf9cbcc1dc7c603e4f651eaa523df9e408523cfa4d91084e62782e9e\": rpc error: code = NotFound desc = could not find container \"02891e9fbf9cbcc1dc7c603e4f651eaa523df9e408523cfa4d91084e62782e9e\": container with ID starting with 02891e9fbf9cbcc1dc7c603e4f651eaa523df9e408523cfa4d91084e62782e9e not found: ID does not exist" Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.177728 4841 scope.go:117] 
"RemoveContainer" containerID="ed3d5bc50518252c2ee01df8344e1617eff8e0bb9a7ec2dd76286af196a5d4c8" Mar 13 09:35:52 crc kubenswrapper[4841]: E0313 09:35:52.178129 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed3d5bc50518252c2ee01df8344e1617eff8e0bb9a7ec2dd76286af196a5d4c8\": container with ID starting with ed3d5bc50518252c2ee01df8344e1617eff8e0bb9a7ec2dd76286af196a5d4c8 not found: ID does not exist" containerID="ed3d5bc50518252c2ee01df8344e1617eff8e0bb9a7ec2dd76286af196a5d4c8" Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.178152 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed3d5bc50518252c2ee01df8344e1617eff8e0bb9a7ec2dd76286af196a5d4c8"} err="failed to get container status \"ed3d5bc50518252c2ee01df8344e1617eff8e0bb9a7ec2dd76286af196a5d4c8\": rpc error: code = NotFound desc = could not find container \"ed3d5bc50518252c2ee01df8344e1617eff8e0bb9a7ec2dd76286af196a5d4c8\": container with ID starting with ed3d5bc50518252c2ee01df8344e1617eff8e0bb9a7ec2dd76286af196a5d4c8 not found: ID does not exist" Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.178786 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-m6xj7"] Mar 13 09:35:52 crc kubenswrapper[4841]: I0313 09:35:52.382680 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-6rf6l"] Mar 13 09:35:52 crc kubenswrapper[4841]: W0313 09:35:52.383184 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c632011_0a35_4eaa_a7f5_8d86466858ca.slice/crio-ff2d00be32c54dcf2b1df6c553bc90087b23a0712847f0715bde2c0faddde939 WatchSource:0}: Error finding container ff2d00be32c54dcf2b1df6c553bc90087b23a0712847f0715bde2c0faddde939: Status 404 returned error can't find the container with id 
ff2d00be32c54dcf2b1df6c553bc90087b23a0712847f0715bde2c0faddde939 Mar 13 09:35:53 crc kubenswrapper[4841]: I0313 09:35:53.120829 4841 generic.go:334] "Generic (PLEG): container finished" podID="2c632011-0a35-4eaa-a7f5-8d86466858ca" containerID="1752be021d1eb84de23936c71690ec694673ae40995319ab183b594f3d9e1668" exitCode=0 Mar 13 09:35:53 crc kubenswrapper[4841]: I0313 09:35:53.120903 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" event={"ID":"2c632011-0a35-4eaa-a7f5-8d86466858ca","Type":"ContainerDied","Data":"1752be021d1eb84de23936c71690ec694673ae40995319ab183b594f3d9e1668"} Mar 13 09:35:53 crc kubenswrapper[4841]: I0313 09:35:53.121235 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" event={"ID":"2c632011-0a35-4eaa-a7f5-8d86466858ca","Type":"ContainerStarted","Data":"ff2d00be32c54dcf2b1df6c553bc90087b23a0712847f0715bde2c0faddde939"} Mar 13 09:35:54 crc kubenswrapper[4841]: I0313 09:35:54.007090 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0ee1a9a-154c-4c27-b964-94a9a13761e6" path="/var/lib/kubelet/pods/f0ee1a9a-154c-4c27-b964-94a9a13761e6/volumes" Mar 13 09:35:54 crc kubenswrapper[4841]: I0313 09:35:54.137922 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" event={"ID":"2c632011-0a35-4eaa-a7f5-8d86466858ca","Type":"ContainerStarted","Data":"b773d01805386fe3c8fb6f458553c45ee399e0e97021cdb839e8b81a93e897d9"} Mar 13 09:35:54 crc kubenswrapper[4841]: I0313 09:35:54.139315 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:35:54 crc kubenswrapper[4841]: I0313 09:35:54.166394 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" podStartSLOduration=3.166372049 podStartE2EDuration="3.166372049s" podCreationTimestamp="2026-03-13 09:35:51 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:35:54.15648567 +0000 UTC m=+1436.886385861" watchObservedRunningTime="2026-03-13 09:35:54.166372049 +0000 UTC m=+1436.896272230" Mar 13 09:36:00 crc kubenswrapper[4841]: I0313 09:36:00.153637 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556576-jbxhq"] Mar 13 09:36:00 crc kubenswrapper[4841]: E0313 09:36:00.154714 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ee1a9a-154c-4c27-b964-94a9a13761e6" containerName="dnsmasq-dns" Mar 13 09:36:00 crc kubenswrapper[4841]: I0313 09:36:00.154732 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ee1a9a-154c-4c27-b964-94a9a13761e6" containerName="dnsmasq-dns" Mar 13 09:36:00 crc kubenswrapper[4841]: E0313 09:36:00.154752 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ee1a9a-154c-4c27-b964-94a9a13761e6" containerName="init" Mar 13 09:36:00 crc kubenswrapper[4841]: I0313 09:36:00.154761 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ee1a9a-154c-4c27-b964-94a9a13761e6" containerName="init" Mar 13 09:36:00 crc kubenswrapper[4841]: I0313 09:36:00.155048 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0ee1a9a-154c-4c27-b964-94a9a13761e6" containerName="dnsmasq-dns" Mar 13 09:36:00 crc kubenswrapper[4841]: I0313 09:36:00.155819 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556576-jbxhq" Mar 13 09:36:00 crc kubenswrapper[4841]: I0313 09:36:00.158961 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 09:36:00 crc kubenswrapper[4841]: I0313 09:36:00.159238 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 09:36:00 crc kubenswrapper[4841]: I0313 09:36:00.159650 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 09:36:00 crc kubenswrapper[4841]: I0313 09:36:00.172719 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556576-jbxhq"] Mar 13 09:36:00 crc kubenswrapper[4841]: I0313 09:36:00.258815 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm95q\" (UniqueName: \"kubernetes.io/projected/7007e603-335c-42b3-af69-03ec5159c667-kube-api-access-rm95q\") pod \"auto-csr-approver-29556576-jbxhq\" (UID: \"7007e603-335c-42b3-af69-03ec5159c667\") " pod="openshift-infra/auto-csr-approver-29556576-jbxhq" Mar 13 09:36:00 crc kubenswrapper[4841]: I0313 09:36:00.361280 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm95q\" (UniqueName: \"kubernetes.io/projected/7007e603-335c-42b3-af69-03ec5159c667-kube-api-access-rm95q\") pod \"auto-csr-approver-29556576-jbxhq\" (UID: \"7007e603-335c-42b3-af69-03ec5159c667\") " pod="openshift-infra/auto-csr-approver-29556576-jbxhq" Mar 13 09:36:00 crc kubenswrapper[4841]: I0313 09:36:00.388556 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm95q\" (UniqueName: \"kubernetes.io/projected/7007e603-335c-42b3-af69-03ec5159c667-kube-api-access-rm95q\") pod \"auto-csr-approver-29556576-jbxhq\" (UID: \"7007e603-335c-42b3-af69-03ec5159c667\") " 
pod="openshift-infra/auto-csr-approver-29556576-jbxhq" Mar 13 09:36:00 crc kubenswrapper[4841]: I0313 09:36:00.483912 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556576-jbxhq" Mar 13 09:36:00 crc kubenswrapper[4841]: I0313 09:36:00.973588 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556576-jbxhq"] Mar 13 09:36:00 crc kubenswrapper[4841]: W0313 09:36:00.979628 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7007e603_335c_42b3_af69_03ec5159c667.slice/crio-89aa1b6bb7788fe863e187e2fc8249d79bd06bf5661d1ad39f83f196e43e3581 WatchSource:0}: Error finding container 89aa1b6bb7788fe863e187e2fc8249d79bd06bf5661d1ad39f83f196e43e3581: Status 404 returned error can't find the container with id 89aa1b6bb7788fe863e187e2fc8249d79bd06bf5661d1ad39f83f196e43e3581 Mar 13 09:36:01 crc kubenswrapper[4841]: I0313 09:36:01.213716 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556576-jbxhq" event={"ID":"7007e603-335c-42b3-af69-03ec5159c667","Type":"ContainerStarted","Data":"89aa1b6bb7788fe863e187e2fc8249d79bd06bf5661d1ad39f83f196e43e3581"} Mar 13 09:36:01 crc kubenswrapper[4841]: I0313 09:36:01.897652 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f6df4f56c-6rf6l" Mar 13 09:36:01 crc kubenswrapper[4841]: I0313 09:36:01.970640 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-wzvkv"] Mar 13 09:36:01 crc kubenswrapper[4841]: I0313 09:36:01.970884 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" podUID="d622a2f4-f5a6-4999-9395-329efdeabb2d" containerName="dnsmasq-dns" containerID="cri-o://7385fda1e7040574f56bf3d6013ddb93e411c159cd890703c7d93ceccc0d7307" gracePeriod=10 Mar 
13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.223689 4841 generic.go:334] "Generic (PLEG): container finished" podID="d622a2f4-f5a6-4999-9395-329efdeabb2d" containerID="7385fda1e7040574f56bf3d6013ddb93e411c159cd890703c7d93ceccc0d7307" exitCode=0 Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.223915 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" event={"ID":"d622a2f4-f5a6-4999-9395-329efdeabb2d","Type":"ContainerDied","Data":"7385fda1e7040574f56bf3d6013ddb93e411c159cd890703c7d93ceccc0d7307"} Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.508670 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.603756 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-dns-svc\") pod \"d622a2f4-f5a6-4999-9395-329efdeabb2d\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.603973 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-dns-swift-storage-0\") pod \"d622a2f4-f5a6-4999-9395-329efdeabb2d\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.604054 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-ovsdbserver-sb\") pod \"d622a2f4-f5a6-4999-9395-329efdeabb2d\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.604189 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-config\") pod \"d622a2f4-f5a6-4999-9395-329efdeabb2d\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.604377 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-openstack-edpm-ipam\") pod \"d622a2f4-f5a6-4999-9395-329efdeabb2d\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.604571 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-ovsdbserver-nb\") pod \"d622a2f4-f5a6-4999-9395-329efdeabb2d\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.604687 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44phh\" (UniqueName: \"kubernetes.io/projected/d622a2f4-f5a6-4999-9395-329efdeabb2d-kube-api-access-44phh\") pod \"d622a2f4-f5a6-4999-9395-329efdeabb2d\" (UID: \"d622a2f4-f5a6-4999-9395-329efdeabb2d\") " Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.632738 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d622a2f4-f5a6-4999-9395-329efdeabb2d-kube-api-access-44phh" (OuterVolumeSpecName: "kube-api-access-44phh") pod "d622a2f4-f5a6-4999-9395-329efdeabb2d" (UID: "d622a2f4-f5a6-4999-9395-329efdeabb2d"). InnerVolumeSpecName "kube-api-access-44phh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.663145 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-config" (OuterVolumeSpecName: "config") pod "d622a2f4-f5a6-4999-9395-329efdeabb2d" (UID: "d622a2f4-f5a6-4999-9395-329efdeabb2d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.666640 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d622a2f4-f5a6-4999-9395-329efdeabb2d" (UID: "d622a2f4-f5a6-4999-9395-329efdeabb2d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.670340 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "d622a2f4-f5a6-4999-9395-329efdeabb2d" (UID: "d622a2f4-f5a6-4999-9395-329efdeabb2d"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.671445 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d622a2f4-f5a6-4999-9395-329efdeabb2d" (UID: "d622a2f4-f5a6-4999-9395-329efdeabb2d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.688218 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d622a2f4-f5a6-4999-9395-329efdeabb2d" (UID: "d622a2f4-f5a6-4999-9395-329efdeabb2d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.688232 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d622a2f4-f5a6-4999-9395-329efdeabb2d" (UID: "d622a2f4-f5a6-4999-9395-329efdeabb2d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.707177 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.707214 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.707229 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44phh\" (UniqueName: \"kubernetes.io/projected/d622a2f4-f5a6-4999-9395-329efdeabb2d-kube-api-access-44phh\") on node \"crc\" DevicePath \"\"" Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.707242 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.707252 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.707279 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 09:36:02 crc kubenswrapper[4841]: I0313 09:36:02.707291 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d622a2f4-f5a6-4999-9395-329efdeabb2d-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:36:03 crc kubenswrapper[4841]: I0313 09:36:03.240124 4841 generic.go:334] "Generic (PLEG): container finished" podID="7007e603-335c-42b3-af69-03ec5159c667" containerID="533ad17ad7245e2e5e9cf12281353640ebedf1c55eb5e83ded41729cecc683ed" exitCode=0 Mar 13 09:36:03 crc kubenswrapper[4841]: I0313 09:36:03.240243 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556576-jbxhq" event={"ID":"7007e603-335c-42b3-af69-03ec5159c667","Type":"ContainerDied","Data":"533ad17ad7245e2e5e9cf12281353640ebedf1c55eb5e83ded41729cecc683ed"} Mar 13 09:36:03 crc kubenswrapper[4841]: I0313 09:36:03.247382 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" event={"ID":"d622a2f4-f5a6-4999-9395-329efdeabb2d","Type":"ContainerDied","Data":"253f0c157b4085ba092af452101f228d3584f712c5d58d0a980ec3fdd9c4266b"} Mar 13 09:36:03 crc kubenswrapper[4841]: I0313 09:36:03.247442 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-wzvkv" Mar 13 09:36:03 crc kubenswrapper[4841]: I0313 09:36:03.247491 4841 scope.go:117] "RemoveContainer" containerID="7385fda1e7040574f56bf3d6013ddb93e411c159cd890703c7d93ceccc0d7307" Mar 13 09:36:03 crc kubenswrapper[4841]: I0313 09:36:03.271621 4841 scope.go:117] "RemoveContainer" containerID="008416c013d24f9cecd11dcbc34bb2d7aef651fc5fadd9258ba36a9599104389" Mar 13 09:36:03 crc kubenswrapper[4841]: I0313 09:36:03.306248 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-wzvkv"] Mar 13 09:36:03 crc kubenswrapper[4841]: I0313 09:36:03.316039 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-wzvkv"] Mar 13 09:36:04 crc kubenswrapper[4841]: I0313 09:36:04.008521 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d622a2f4-f5a6-4999-9395-329efdeabb2d" path="/var/lib/kubelet/pods/d622a2f4-f5a6-4999-9395-329efdeabb2d/volumes" Mar 13 09:36:04 crc kubenswrapper[4841]: I0313 09:36:04.576239 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556576-jbxhq" Mar 13 09:36:04 crc kubenswrapper[4841]: I0313 09:36:04.642489 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm95q\" (UniqueName: \"kubernetes.io/projected/7007e603-335c-42b3-af69-03ec5159c667-kube-api-access-rm95q\") pod \"7007e603-335c-42b3-af69-03ec5159c667\" (UID: \"7007e603-335c-42b3-af69-03ec5159c667\") " Mar 13 09:36:04 crc kubenswrapper[4841]: I0313 09:36:04.650815 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7007e603-335c-42b3-af69-03ec5159c667-kube-api-access-rm95q" (OuterVolumeSpecName: "kube-api-access-rm95q") pod "7007e603-335c-42b3-af69-03ec5159c667" (UID: "7007e603-335c-42b3-af69-03ec5159c667"). InnerVolumeSpecName "kube-api-access-rm95q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:36:04 crc kubenswrapper[4841]: I0313 09:36:04.745296 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm95q\" (UniqueName: \"kubernetes.io/projected/7007e603-335c-42b3-af69-03ec5159c667-kube-api-access-rm95q\") on node \"crc\" DevicePath \"\"" Mar 13 09:36:05 crc kubenswrapper[4841]: I0313 09:36:05.275512 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556576-jbxhq" event={"ID":"7007e603-335c-42b3-af69-03ec5159c667","Type":"ContainerDied","Data":"89aa1b6bb7788fe863e187e2fc8249d79bd06bf5661d1ad39f83f196e43e3581"} Mar 13 09:36:05 crc kubenswrapper[4841]: I0313 09:36:05.275813 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89aa1b6bb7788fe863e187e2fc8249d79bd06bf5661d1ad39f83f196e43e3581" Mar 13 09:36:05 crc kubenswrapper[4841]: I0313 09:36:05.275615 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556576-jbxhq" Mar 13 09:36:05 crc kubenswrapper[4841]: I0313 09:36:05.650783 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556570-cc6b9"] Mar 13 09:36:05 crc kubenswrapper[4841]: I0313 09:36:05.660844 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556570-cc6b9"] Mar 13 09:36:06 crc kubenswrapper[4841]: I0313 09:36:06.026090 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72dabbac-c073-43b6-a4fc-7c49b98138c3" path="/var/lib/kubelet/pods/72dabbac-c073-43b6-a4fc-7c49b98138c3/volumes" Mar 13 09:36:10 crc kubenswrapper[4841]: I0313 09:36:10.329409 4841 generic.go:334] "Generic (PLEG): container finished" podID="b5efe6ff-d5eb-4fa9-9496-1838d05f625a" containerID="ab1859a05716e393a13502a6db4e331fd2f27aecab80e3ed29795da8eebbfbec" exitCode=0 Mar 13 09:36:10 crc kubenswrapper[4841]: I0313 09:36:10.329495 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b5efe6ff-d5eb-4fa9-9496-1838d05f625a","Type":"ContainerDied","Data":"ab1859a05716e393a13502a6db4e331fd2f27aecab80e3ed29795da8eebbfbec"} Mar 13 09:36:11 crc kubenswrapper[4841]: I0313 09:36:11.341306 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b5efe6ff-d5eb-4fa9-9496-1838d05f625a","Type":"ContainerStarted","Data":"2094283b56a10f421c5aeb5fbbf9acf8de20184eb986c36ef46344d1dc27ac50"} Mar 13 09:36:11 crc kubenswrapper[4841]: I0313 09:36:11.342595 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 13 09:36:11 crc kubenswrapper[4841]: I0313 09:36:11.343196 4841 generic.go:334] "Generic (PLEG): container finished" podID="469aec79-a7a3-4ae1-b00a-94f47a6d4df9" containerID="3be35f73422e347e4521efbc6546a1e15efdd364bef88204fa8b32a4ab56c400" exitCode=0 Mar 13 09:36:11 crc kubenswrapper[4841]: I0313 09:36:11.343235 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"469aec79-a7a3-4ae1-b00a-94f47a6d4df9","Type":"ContainerDied","Data":"3be35f73422e347e4521efbc6546a1e15efdd364bef88204fa8b32a4ab56c400"} Mar 13 09:36:11 crc kubenswrapper[4841]: I0313 09:36:11.377600 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.377580666 podStartE2EDuration="36.377580666s" podCreationTimestamp="2026-03-13 09:35:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:36:11.369599677 +0000 UTC m=+1454.099499888" watchObservedRunningTime="2026-03-13 09:36:11.377580666 +0000 UTC m=+1454.107480867" Mar 13 09:36:12 crc kubenswrapper[4841]: I0313 09:36:12.352670 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"469aec79-a7a3-4ae1-b00a-94f47a6d4df9","Type":"ContainerStarted","Data":"70dfbac8e0cf782c49a9e3efe06f980acad289993333dc19dbbefa45d41a38d2"} Mar 13 09:36:12 crc kubenswrapper[4841]: I0313 09:36:12.353123 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:36:12 crc kubenswrapper[4841]: I0313 09:36:12.376214 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.376197373 podStartE2EDuration="36.376197373s" podCreationTimestamp="2026-03-13 09:35:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:36:12.372559079 +0000 UTC m=+1455.102459270" watchObservedRunningTime="2026-03-13 09:36:12.376197373 +0000 UTC m=+1455.106097564" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.149743 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96"] Mar 13 09:36:15 crc kubenswrapper[4841]: E0313 09:36:15.151020 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7007e603-335c-42b3-af69-03ec5159c667" containerName="oc" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.151044 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7007e603-335c-42b3-af69-03ec5159c667" containerName="oc" Mar 13 09:36:15 crc kubenswrapper[4841]: E0313 09:36:15.151071 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d622a2f4-f5a6-4999-9395-329efdeabb2d" containerName="init" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.151083 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d622a2f4-f5a6-4999-9395-329efdeabb2d" containerName="init" Mar 13 09:36:15 crc kubenswrapper[4841]: E0313 09:36:15.151141 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d622a2f4-f5a6-4999-9395-329efdeabb2d" 
containerName="dnsmasq-dns" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.151155 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d622a2f4-f5a6-4999-9395-329efdeabb2d" containerName="dnsmasq-dns" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.151498 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d622a2f4-f5a6-4999-9395-329efdeabb2d" containerName="dnsmasq-dns" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.151531 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7007e603-335c-42b3-af69-03ec5159c667" containerName="oc" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.152583 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.156637 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.157038 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.157190 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rv6jt" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.158684 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.167449 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96"] Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.285240 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wz9b\" (UniqueName: 
\"kubernetes.io/projected/413c3ede-4bdb-444c-b90d-5b07c5507a52-kube-api-access-4wz9b\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96\" (UID: \"413c3ede-4bdb-444c-b90d-5b07c5507a52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.285498 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/413c3ede-4bdb-444c-b90d-5b07c5507a52-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96\" (UID: \"413c3ede-4bdb-444c-b90d-5b07c5507a52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.285737 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/413c3ede-4bdb-444c-b90d-5b07c5507a52-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96\" (UID: \"413c3ede-4bdb-444c-b90d-5b07c5507a52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.285848 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/413c3ede-4bdb-444c-b90d-5b07c5507a52-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96\" (UID: \"413c3ede-4bdb-444c-b90d-5b07c5507a52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.387748 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/413c3ede-4bdb-444c-b90d-5b07c5507a52-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96\" (UID: 
\"413c3ede-4bdb-444c-b90d-5b07c5507a52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.387904 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wz9b\" (UniqueName: \"kubernetes.io/projected/413c3ede-4bdb-444c-b90d-5b07c5507a52-kube-api-access-4wz9b\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96\" (UID: \"413c3ede-4bdb-444c-b90d-5b07c5507a52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.388001 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/413c3ede-4bdb-444c-b90d-5b07c5507a52-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96\" (UID: \"413c3ede-4bdb-444c-b90d-5b07c5507a52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.388108 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/413c3ede-4bdb-444c-b90d-5b07c5507a52-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96\" (UID: \"413c3ede-4bdb-444c-b90d-5b07c5507a52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.394439 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/413c3ede-4bdb-444c-b90d-5b07c5507a52-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96\" (UID: \"413c3ede-4bdb-444c-b90d-5b07c5507a52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.394697 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/413c3ede-4bdb-444c-b90d-5b07c5507a52-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96\" (UID: \"413c3ede-4bdb-444c-b90d-5b07c5507a52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.400366 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/413c3ede-4bdb-444c-b90d-5b07c5507a52-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96\" (UID: \"413c3ede-4bdb-444c-b90d-5b07c5507a52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.413439 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wz9b\" (UniqueName: \"kubernetes.io/projected/413c3ede-4bdb-444c-b90d-5b07c5507a52-kube-api-access-4wz9b\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96\" (UID: \"413c3ede-4bdb-444c-b90d-5b07c5507a52\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96" Mar 13 09:36:15 crc kubenswrapper[4841]: I0313 09:36:15.504748 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96" Mar 13 09:36:16 crc kubenswrapper[4841]: I0313 09:36:16.088953 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96"] Mar 13 09:36:16 crc kubenswrapper[4841]: W0313 09:36:16.094490 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod413c3ede_4bdb_444c_b90d_5b07c5507a52.slice/crio-905b92e58b9a5b4abe55b58f27666ed629eeea2430a4468d0a0812937fdaa61f WatchSource:0}: Error finding container 905b92e58b9a5b4abe55b58f27666ed629eeea2430a4468d0a0812937fdaa61f: Status 404 returned error can't find the container with id 905b92e58b9a5b4abe55b58f27666ed629eeea2430a4468d0a0812937fdaa61f Mar 13 09:36:16 crc kubenswrapper[4841]: I0313 09:36:16.388358 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96" event={"ID":"413c3ede-4bdb-444c-b90d-5b07c5507a52","Type":"ContainerStarted","Data":"905b92e58b9a5b4abe55b58f27666ed629eeea2430a4468d0a0812937fdaa61f"} Mar 13 09:36:24 crc kubenswrapper[4841]: I0313 09:36:24.230860 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q7b5k"] Mar 13 09:36:24 crc kubenswrapper[4841]: I0313 09:36:24.234954 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q7b5k" Mar 13 09:36:24 crc kubenswrapper[4841]: I0313 09:36:24.242466 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q7b5k"] Mar 13 09:36:24 crc kubenswrapper[4841]: I0313 09:36:24.283103 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/482d1d76-467c-4222-b6da-f8608636656e-catalog-content\") pod \"redhat-operators-q7b5k\" (UID: \"482d1d76-467c-4222-b6da-f8608636656e\") " pod="openshift-marketplace/redhat-operators-q7b5k" Mar 13 09:36:24 crc kubenswrapper[4841]: I0313 09:36:24.283307 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/482d1d76-467c-4222-b6da-f8608636656e-utilities\") pod \"redhat-operators-q7b5k\" (UID: \"482d1d76-467c-4222-b6da-f8608636656e\") " pod="openshift-marketplace/redhat-operators-q7b5k" Mar 13 09:36:24 crc kubenswrapper[4841]: I0313 09:36:24.283474 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvdv4\" (UniqueName: \"kubernetes.io/projected/482d1d76-467c-4222-b6da-f8608636656e-kube-api-access-nvdv4\") pod \"redhat-operators-q7b5k\" (UID: \"482d1d76-467c-4222-b6da-f8608636656e\") " pod="openshift-marketplace/redhat-operators-q7b5k" Mar 13 09:36:24 crc kubenswrapper[4841]: I0313 09:36:24.385771 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/482d1d76-467c-4222-b6da-f8608636656e-utilities\") pod \"redhat-operators-q7b5k\" (UID: \"482d1d76-467c-4222-b6da-f8608636656e\") " pod="openshift-marketplace/redhat-operators-q7b5k" Mar 13 09:36:24 crc kubenswrapper[4841]: I0313 09:36:24.385897 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nvdv4\" (UniqueName: \"kubernetes.io/projected/482d1d76-467c-4222-b6da-f8608636656e-kube-api-access-nvdv4\") pod \"redhat-operators-q7b5k\" (UID: \"482d1d76-467c-4222-b6da-f8608636656e\") " pod="openshift-marketplace/redhat-operators-q7b5k" Mar 13 09:36:24 crc kubenswrapper[4841]: I0313 09:36:24.386029 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/482d1d76-467c-4222-b6da-f8608636656e-catalog-content\") pod \"redhat-operators-q7b5k\" (UID: \"482d1d76-467c-4222-b6da-f8608636656e\") " pod="openshift-marketplace/redhat-operators-q7b5k" Mar 13 09:36:24 crc kubenswrapper[4841]: I0313 09:36:24.386746 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/482d1d76-467c-4222-b6da-f8608636656e-catalog-content\") pod \"redhat-operators-q7b5k\" (UID: \"482d1d76-467c-4222-b6da-f8608636656e\") " pod="openshift-marketplace/redhat-operators-q7b5k" Mar 13 09:36:24 crc kubenswrapper[4841]: I0313 09:36:24.388601 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/482d1d76-467c-4222-b6da-f8608636656e-utilities\") pod \"redhat-operators-q7b5k\" (UID: \"482d1d76-467c-4222-b6da-f8608636656e\") " pod="openshift-marketplace/redhat-operators-q7b5k" Mar 13 09:36:24 crc kubenswrapper[4841]: I0313 09:36:24.430817 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvdv4\" (UniqueName: \"kubernetes.io/projected/482d1d76-467c-4222-b6da-f8608636656e-kube-api-access-nvdv4\") pod \"redhat-operators-q7b5k\" (UID: \"482d1d76-467c-4222-b6da-f8608636656e\") " pod="openshift-marketplace/redhat-operators-q7b5k" Mar 13 09:36:24 crc kubenswrapper[4841]: I0313 09:36:24.562857 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q7b5k" Mar 13 09:36:26 crc kubenswrapper[4841]: I0313 09:36:26.210413 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 13 09:36:26 crc kubenswrapper[4841]: I0313 09:36:26.469450 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 13 09:36:26 crc kubenswrapper[4841]: I0313 09:36:26.503676 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96" event={"ID":"413c3ede-4bdb-444c-b90d-5b07c5507a52","Type":"ContainerStarted","Data":"5059133a1a2e6c3408aaf7aece636c1821297f38b57bfae052dc9dcd6b5a30a8"} Mar 13 09:36:26 crc kubenswrapper[4841]: I0313 09:36:26.536675 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96" podStartSLOduration=1.532902434 podStartE2EDuration="11.536655037s" podCreationTimestamp="2026-03-13 09:36:15 +0000 UTC" firstStartedPulling="2026-03-13 09:36:16.096819848 +0000 UTC m=+1458.826720039" lastFinishedPulling="2026-03-13 09:36:26.100572451 +0000 UTC m=+1468.830472642" observedRunningTime="2026-03-13 09:36:26.523870768 +0000 UTC m=+1469.253770959" watchObservedRunningTime="2026-03-13 09:36:26.536655037 +0000 UTC m=+1469.266555228" Mar 13 09:36:26 crc kubenswrapper[4841]: I0313 09:36:26.574844 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q7b5k"] Mar 13 09:36:27 crc kubenswrapper[4841]: I0313 09:36:27.514069 4841 generic.go:334] "Generic (PLEG): container finished" podID="482d1d76-467c-4222-b6da-f8608636656e" containerID="732df105d0cecc5af35a52a05d77005f8e578173e6075cc8e2d23f528da835cc" exitCode=0 Mar 13 09:36:27 crc kubenswrapper[4841]: I0313 09:36:27.514348 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-q7b5k" event={"ID":"482d1d76-467c-4222-b6da-f8608636656e","Type":"ContainerDied","Data":"732df105d0cecc5af35a52a05d77005f8e578173e6075cc8e2d23f528da835cc"} Mar 13 09:36:27 crc kubenswrapper[4841]: I0313 09:36:27.514457 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7b5k" event={"ID":"482d1d76-467c-4222-b6da-f8608636656e","Type":"ContainerStarted","Data":"338b10113a7dc6e90f7350ece75ba34b32dc4fdea711bb8e33f27c335dbb9153"} Mar 13 09:36:29 crc kubenswrapper[4841]: I0313 09:36:29.534025 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7b5k" event={"ID":"482d1d76-467c-4222-b6da-f8608636656e","Type":"ContainerStarted","Data":"733980a20db22f20f0cb0aacb87aa532babd0a89fe586e19c46774a91b264e60"} Mar 13 09:36:31 crc kubenswrapper[4841]: I0313 09:36:31.557621 4841 generic.go:334] "Generic (PLEG): container finished" podID="482d1d76-467c-4222-b6da-f8608636656e" containerID="733980a20db22f20f0cb0aacb87aa532babd0a89fe586e19c46774a91b264e60" exitCode=0 Mar 13 09:36:31 crc kubenswrapper[4841]: I0313 09:36:31.557710 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7b5k" event={"ID":"482d1d76-467c-4222-b6da-f8608636656e","Type":"ContainerDied","Data":"733980a20db22f20f0cb0aacb87aa532babd0a89fe586e19c46774a91b264e60"} Mar 13 09:36:32 crc kubenswrapper[4841]: I0313 09:36:32.574169 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7b5k" event={"ID":"482d1d76-467c-4222-b6da-f8608636656e","Type":"ContainerStarted","Data":"ac4eca77ae148d58e284ebc235a6d9b3b86844e0ea18fbfa1fc2b2b4474583b0"} Mar 13 09:36:32 crc kubenswrapper[4841]: I0313 09:36:32.600756 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q7b5k" podStartSLOduration=4.070143672 podStartE2EDuration="8.600719278s" 
podCreationTimestamp="2026-03-13 09:36:24 +0000 UTC" firstStartedPulling="2026-03-13 09:36:27.515538748 +0000 UTC m=+1470.245438939" lastFinishedPulling="2026-03-13 09:36:32.046114344 +0000 UTC m=+1474.776014545" observedRunningTime="2026-03-13 09:36:32.596107755 +0000 UTC m=+1475.326007956" watchObservedRunningTime="2026-03-13 09:36:32.600719278 +0000 UTC m=+1475.330619519" Mar 13 09:36:34 crc kubenswrapper[4841]: I0313 09:36:34.563298 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q7b5k" Mar 13 09:36:34 crc kubenswrapper[4841]: I0313 09:36:34.563762 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q7b5k" Mar 13 09:36:35 crc kubenswrapper[4841]: I0313 09:36:35.621809 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-q7b5k" podUID="482d1d76-467c-4222-b6da-f8608636656e" containerName="registry-server" probeResult="failure" output=< Mar 13 09:36:35 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s Mar 13 09:36:35 crc kubenswrapper[4841]: > Mar 13 09:36:36 crc kubenswrapper[4841]: I0313 09:36:36.647040 4841 generic.go:334] "Generic (PLEG): container finished" podID="413c3ede-4bdb-444c-b90d-5b07c5507a52" containerID="5059133a1a2e6c3408aaf7aece636c1821297f38b57bfae052dc9dcd6b5a30a8" exitCode=0 Mar 13 09:36:36 crc kubenswrapper[4841]: I0313 09:36:36.647090 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96" event={"ID":"413c3ede-4bdb-444c-b90d-5b07c5507a52","Type":"ContainerDied","Data":"5059133a1a2e6c3408aaf7aece636c1821297f38b57bfae052dc9dcd6b5a30a8"} Mar 13 09:36:37 crc kubenswrapper[4841]: I0313 09:36:37.750124 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gmhqg"] Mar 13 09:36:37 crc kubenswrapper[4841]: I0313 09:36:37.752292 4841 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gmhqg" Mar 13 09:36:37 crc kubenswrapper[4841]: I0313 09:36:37.778971 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gmhqg"] Mar 13 09:36:37 crc kubenswrapper[4841]: I0313 09:36:37.798256 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1e82bc6-797b-4b71-9413-014bfe7f7e3c-utilities\") pod \"certified-operators-gmhqg\" (UID: \"f1e82bc6-797b-4b71-9413-014bfe7f7e3c\") " pod="openshift-marketplace/certified-operators-gmhqg" Mar 13 09:36:37 crc kubenswrapper[4841]: I0313 09:36:37.798317 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qkxl\" (UniqueName: \"kubernetes.io/projected/f1e82bc6-797b-4b71-9413-014bfe7f7e3c-kube-api-access-4qkxl\") pod \"certified-operators-gmhqg\" (UID: \"f1e82bc6-797b-4b71-9413-014bfe7f7e3c\") " pod="openshift-marketplace/certified-operators-gmhqg" Mar 13 09:36:37 crc kubenswrapper[4841]: I0313 09:36:37.798419 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1e82bc6-797b-4b71-9413-014bfe7f7e3c-catalog-content\") pod \"certified-operators-gmhqg\" (UID: \"f1e82bc6-797b-4b71-9413-014bfe7f7e3c\") " pod="openshift-marketplace/certified-operators-gmhqg" Mar 13 09:36:37 crc kubenswrapper[4841]: I0313 09:36:37.899787 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1e82bc6-797b-4b71-9413-014bfe7f7e3c-utilities\") pod \"certified-operators-gmhqg\" (UID: \"f1e82bc6-797b-4b71-9413-014bfe7f7e3c\") " pod="openshift-marketplace/certified-operators-gmhqg" Mar 13 09:36:37 crc kubenswrapper[4841]: I0313 09:36:37.899863 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qkxl\" (UniqueName: \"kubernetes.io/projected/f1e82bc6-797b-4b71-9413-014bfe7f7e3c-kube-api-access-4qkxl\") pod \"certified-operators-gmhqg\" (UID: \"f1e82bc6-797b-4b71-9413-014bfe7f7e3c\") " pod="openshift-marketplace/certified-operators-gmhqg" Mar 13 09:36:37 crc kubenswrapper[4841]: I0313 09:36:37.900006 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1e82bc6-797b-4b71-9413-014bfe7f7e3c-catalog-content\") pod \"certified-operators-gmhqg\" (UID: \"f1e82bc6-797b-4b71-9413-014bfe7f7e3c\") " pod="openshift-marketplace/certified-operators-gmhqg" Mar 13 09:36:37 crc kubenswrapper[4841]: I0313 09:36:37.900345 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1e82bc6-797b-4b71-9413-014bfe7f7e3c-utilities\") pod \"certified-operators-gmhqg\" (UID: \"f1e82bc6-797b-4b71-9413-014bfe7f7e3c\") " pod="openshift-marketplace/certified-operators-gmhqg" Mar 13 09:36:37 crc kubenswrapper[4841]: I0313 09:36:37.900562 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1e82bc6-797b-4b71-9413-014bfe7f7e3c-catalog-content\") pod \"certified-operators-gmhqg\" (UID: \"f1e82bc6-797b-4b71-9413-014bfe7f7e3c\") " pod="openshift-marketplace/certified-operators-gmhqg" Mar 13 09:36:37 crc kubenswrapper[4841]: I0313 09:36:37.941646 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qkxl\" (UniqueName: \"kubernetes.io/projected/f1e82bc6-797b-4b71-9413-014bfe7f7e3c-kube-api-access-4qkxl\") pod \"certified-operators-gmhqg\" (UID: \"f1e82bc6-797b-4b71-9413-014bfe7f7e3c\") " pod="openshift-marketplace/certified-operators-gmhqg" Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.075424 4841 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gmhqg" Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.608820 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96" Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.686916 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96" event={"ID":"413c3ede-4bdb-444c-b90d-5b07c5507a52","Type":"ContainerDied","Data":"905b92e58b9a5b4abe55b58f27666ed629eeea2430a4468d0a0812937fdaa61f"} Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.686958 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="905b92e58b9a5b4abe55b58f27666ed629eeea2430a4468d0a0812937fdaa61f" Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.687018 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96" Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.719255 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/413c3ede-4bdb-444c-b90d-5b07c5507a52-repo-setup-combined-ca-bundle\") pod \"413c3ede-4bdb-444c-b90d-5b07c5507a52\" (UID: \"413c3ede-4bdb-444c-b90d-5b07c5507a52\") " Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.719329 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/413c3ede-4bdb-444c-b90d-5b07c5507a52-inventory\") pod \"413c3ede-4bdb-444c-b90d-5b07c5507a52\" (UID: \"413c3ede-4bdb-444c-b90d-5b07c5507a52\") " Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.719384 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wz9b\" (UniqueName: 
\"kubernetes.io/projected/413c3ede-4bdb-444c-b90d-5b07c5507a52-kube-api-access-4wz9b\") pod \"413c3ede-4bdb-444c-b90d-5b07c5507a52\" (UID: \"413c3ede-4bdb-444c-b90d-5b07c5507a52\") " Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.719415 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/413c3ede-4bdb-444c-b90d-5b07c5507a52-ssh-key-openstack-edpm-ipam\") pod \"413c3ede-4bdb-444c-b90d-5b07c5507a52\" (UID: \"413c3ede-4bdb-444c-b90d-5b07c5507a52\") " Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.729666 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/413c3ede-4bdb-444c-b90d-5b07c5507a52-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "413c3ede-4bdb-444c-b90d-5b07c5507a52" (UID: "413c3ede-4bdb-444c-b90d-5b07c5507a52"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.740430 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/413c3ede-4bdb-444c-b90d-5b07c5507a52-kube-api-access-4wz9b" (OuterVolumeSpecName: "kube-api-access-4wz9b") pod "413c3ede-4bdb-444c-b90d-5b07c5507a52" (UID: "413c3ede-4bdb-444c-b90d-5b07c5507a52"). InnerVolumeSpecName "kube-api-access-4wz9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.741673 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gmhqg"] Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.796026 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/413c3ede-4bdb-444c-b90d-5b07c5507a52-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "413c3ede-4bdb-444c-b90d-5b07c5507a52" (UID: "413c3ede-4bdb-444c-b90d-5b07c5507a52"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.822160 4841 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/413c3ede-4bdb-444c-b90d-5b07c5507a52-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.822212 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wz9b\" (UniqueName: \"kubernetes.io/projected/413c3ede-4bdb-444c-b90d-5b07c5507a52-kube-api-access-4wz9b\") on node \"crc\" DevicePath \"\"" Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.822225 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/413c3ede-4bdb-444c-b90d-5b07c5507a52-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.827154 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/413c3ede-4bdb-444c-b90d-5b07c5507a52-inventory" (OuterVolumeSpecName: "inventory") pod "413c3ede-4bdb-444c-b90d-5b07c5507a52" (UID: "413c3ede-4bdb-444c-b90d-5b07c5507a52"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.829043 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-t97xr"] Mar 13 09:36:38 crc kubenswrapper[4841]: E0313 09:36:38.829568 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="413c3ede-4bdb-444c-b90d-5b07c5507a52" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.829584 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="413c3ede-4bdb-444c-b90d-5b07c5507a52" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.829748 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="413c3ede-4bdb-444c-b90d-5b07c5507a52" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.830370 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-t97xr" Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.840433 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-t97xr"] Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.923150 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b538d385-dcf3-477e-b014-4b304c0be557-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-t97xr\" (UID: \"b538d385-dcf3-477e-b014-4b304c0be557\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-t97xr" Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.923197 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkcsg\" (UniqueName: \"kubernetes.io/projected/b538d385-dcf3-477e-b014-4b304c0be557-kube-api-access-dkcsg\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-t97xr\" (UID: \"b538d385-dcf3-477e-b014-4b304c0be557\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-t97xr" Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.923303 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b538d385-dcf3-477e-b014-4b304c0be557-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-t97xr\" (UID: \"b538d385-dcf3-477e-b014-4b304c0be557\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-t97xr" Mar 13 09:36:38 crc kubenswrapper[4841]: I0313 09:36:38.923448 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/413c3ede-4bdb-444c-b90d-5b07c5507a52-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 09:36:39 crc kubenswrapper[4841]: I0313 09:36:39.027641 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b538d385-dcf3-477e-b014-4b304c0be557-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-t97xr\" (UID: \"b538d385-dcf3-477e-b014-4b304c0be557\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-t97xr" Mar 13 09:36:39 crc kubenswrapper[4841]: I0313 09:36:39.028226 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b538d385-dcf3-477e-b014-4b304c0be557-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-t97xr\" (UID: \"b538d385-dcf3-477e-b014-4b304c0be557\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-t97xr" Mar 13 09:36:39 crc kubenswrapper[4841]: I0313 09:36:39.028373 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkcsg\" (UniqueName: \"kubernetes.io/projected/b538d385-dcf3-477e-b014-4b304c0be557-kube-api-access-dkcsg\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-t97xr\" (UID: \"b538d385-dcf3-477e-b014-4b304c0be557\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-t97xr" Mar 13 09:36:39 crc kubenswrapper[4841]: I0313 09:36:39.032154 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b538d385-dcf3-477e-b014-4b304c0be557-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-t97xr\" (UID: \"b538d385-dcf3-477e-b014-4b304c0be557\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-t97xr" Mar 13 09:36:39 crc kubenswrapper[4841]: I0313 09:36:39.041844 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b538d385-dcf3-477e-b014-4b304c0be557-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-t97xr\" (UID: 
\"b538d385-dcf3-477e-b014-4b304c0be557\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-t97xr" Mar 13 09:36:39 crc kubenswrapper[4841]: I0313 09:36:39.044985 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkcsg\" (UniqueName: \"kubernetes.io/projected/b538d385-dcf3-477e-b014-4b304c0be557-kube-api-access-dkcsg\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-t97xr\" (UID: \"b538d385-dcf3-477e-b014-4b304c0be557\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-t97xr" Mar 13 09:36:39 crc kubenswrapper[4841]: I0313 09:36:39.183162 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-t97xr" Mar 13 09:36:39 crc kubenswrapper[4841]: I0313 09:36:39.702533 4841 generic.go:334] "Generic (PLEG): container finished" podID="f1e82bc6-797b-4b71-9413-014bfe7f7e3c" containerID="32cdf49a38ca3e0aec210bcbd409088dea83e45e5e7d6bc55ede6e11fbcf0fef" exitCode=0 Mar 13 09:36:39 crc kubenswrapper[4841]: I0313 09:36:39.702651 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gmhqg" event={"ID":"f1e82bc6-797b-4b71-9413-014bfe7f7e3c","Type":"ContainerDied","Data":"32cdf49a38ca3e0aec210bcbd409088dea83e45e5e7d6bc55ede6e11fbcf0fef"} Mar 13 09:36:39 crc kubenswrapper[4841]: I0313 09:36:39.702907 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gmhqg" event={"ID":"f1e82bc6-797b-4b71-9413-014bfe7f7e3c","Type":"ContainerStarted","Data":"17575f98ec435da6afba123d4ce754b52e7013727d48e9c91449afcc0f263eec"} Mar 13 09:36:39 crc kubenswrapper[4841]: I0313 09:36:39.777768 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-t97xr"] Mar 13 09:36:39 crc kubenswrapper[4841]: W0313 09:36:39.780999 4841 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb538d385_dcf3_477e_b014_4b304c0be557.slice/crio-b665653457a60b0bb3961dd4b753af7104563401957c5ffdaa9eb43f51f8cf7b WatchSource:0}: Error finding container b665653457a60b0bb3961dd4b753af7104563401957c5ffdaa9eb43f51f8cf7b: Status 404 returned error can't find the container with id b665653457a60b0bb3961dd4b753af7104563401957c5ffdaa9eb43f51f8cf7b Mar 13 09:36:40 crc kubenswrapper[4841]: I0313 09:36:40.716898 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-t97xr" event={"ID":"b538d385-dcf3-477e-b014-4b304c0be557","Type":"ContainerStarted","Data":"a627f51f7f1973ae313cc61b2d315f415da218be942b825d42f8f6ce498cfa48"} Mar 13 09:36:40 crc kubenswrapper[4841]: I0313 09:36:40.717353 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-t97xr" event={"ID":"b538d385-dcf3-477e-b014-4b304c0be557","Type":"ContainerStarted","Data":"b665653457a60b0bb3961dd4b753af7104563401957c5ffdaa9eb43f51f8cf7b"} Mar 13 09:36:40 crc kubenswrapper[4841]: I0313 09:36:40.746470 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-t97xr" podStartSLOduration=2.333229576 podStartE2EDuration="2.746442589s" podCreationTimestamp="2026-03-13 09:36:38 +0000 UTC" firstStartedPulling="2026-03-13 09:36:39.783828775 +0000 UTC m=+1482.513728966" lastFinishedPulling="2026-03-13 09:36:40.197041748 +0000 UTC m=+1482.926941979" observedRunningTime="2026-03-13 09:36:40.739785861 +0000 UTC m=+1483.469686052" watchObservedRunningTime="2026-03-13 09:36:40.746442589 +0000 UTC m=+1483.476342790" Mar 13 09:36:41 crc kubenswrapper[4841]: E0313 09:36:41.311715 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1e82bc6_797b_4b71_9413_014bfe7f7e3c.slice/crio-conmon-7288aa6b38550015e2704010267e6e6fc6c2d978015f5c2aeee1c59b1d2d5b62.scope\": RecentStats: unable to find data in memory cache]" Mar 13 09:36:41 crc kubenswrapper[4841]: I0313 09:36:41.728348 4841 generic.go:334] "Generic (PLEG): container finished" podID="f1e82bc6-797b-4b71-9413-014bfe7f7e3c" containerID="7288aa6b38550015e2704010267e6e6fc6c2d978015f5c2aeee1c59b1d2d5b62" exitCode=0 Mar 13 09:36:41 crc kubenswrapper[4841]: I0313 09:36:41.728626 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gmhqg" event={"ID":"f1e82bc6-797b-4b71-9413-014bfe7f7e3c","Type":"ContainerDied","Data":"7288aa6b38550015e2704010267e6e6fc6c2d978015f5c2aeee1c59b1d2d5b62"} Mar 13 09:36:42 crc kubenswrapper[4841]: I0313 09:36:42.743256 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gmhqg" event={"ID":"f1e82bc6-797b-4b71-9413-014bfe7f7e3c","Type":"ContainerStarted","Data":"e603acc4326f42db7af940d886859e57ea81ffac2b1cb53d1c3e68ee000b0a33"} Mar 13 09:36:42 crc kubenswrapper[4841]: I0313 09:36:42.769193 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gmhqg" podStartSLOduration=3.327735435 podStartE2EDuration="5.769174909s" podCreationTimestamp="2026-03-13 09:36:37 +0000 UTC" firstStartedPulling="2026-03-13 09:36:39.706675408 +0000 UTC m=+1482.436575589" lastFinishedPulling="2026-03-13 09:36:42.148114862 +0000 UTC m=+1484.878015063" observedRunningTime="2026-03-13 09:36:42.762348046 +0000 UTC m=+1485.492248247" watchObservedRunningTime="2026-03-13 09:36:42.769174909 +0000 UTC m=+1485.499075100" Mar 13 09:36:43 crc kubenswrapper[4841]: I0313 09:36:43.756703 4841 generic.go:334] "Generic (PLEG): container finished" podID="b538d385-dcf3-477e-b014-4b304c0be557" 
containerID="a627f51f7f1973ae313cc61b2d315f415da218be942b825d42f8f6ce498cfa48" exitCode=0 Mar 13 09:36:43 crc kubenswrapper[4841]: I0313 09:36:43.756828 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-t97xr" event={"ID":"b538d385-dcf3-477e-b014-4b304c0be557","Type":"ContainerDied","Data":"a627f51f7f1973ae313cc61b2d315f415da218be942b825d42f8f6ce498cfa48"} Mar 13 09:36:44 crc kubenswrapper[4841]: I0313 09:36:44.646652 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q7b5k" Mar 13 09:36:44 crc kubenswrapper[4841]: I0313 09:36:44.697105 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q7b5k" Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.171627 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-t97xr" Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.249297 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b538d385-dcf3-477e-b014-4b304c0be557-inventory\") pod \"b538d385-dcf3-477e-b014-4b304c0be557\" (UID: \"b538d385-dcf3-477e-b014-4b304c0be557\") " Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.249422 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b538d385-dcf3-477e-b014-4b304c0be557-ssh-key-openstack-edpm-ipam\") pod \"b538d385-dcf3-477e-b014-4b304c0be557\" (UID: \"b538d385-dcf3-477e-b014-4b304c0be557\") " Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.249475 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkcsg\" (UniqueName: \"kubernetes.io/projected/b538d385-dcf3-477e-b014-4b304c0be557-kube-api-access-dkcsg\") 
pod \"b538d385-dcf3-477e-b014-4b304c0be557\" (UID: \"b538d385-dcf3-477e-b014-4b304c0be557\") " Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.260612 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b538d385-dcf3-477e-b014-4b304c0be557-kube-api-access-dkcsg" (OuterVolumeSpecName: "kube-api-access-dkcsg") pod "b538d385-dcf3-477e-b014-4b304c0be557" (UID: "b538d385-dcf3-477e-b014-4b304c0be557"). InnerVolumeSpecName "kube-api-access-dkcsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.279364 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b538d385-dcf3-477e-b014-4b304c0be557-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b538d385-dcf3-477e-b014-4b304c0be557" (UID: "b538d385-dcf3-477e-b014-4b304c0be557"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.291746 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b538d385-dcf3-477e-b014-4b304c0be557-inventory" (OuterVolumeSpecName: "inventory") pod "b538d385-dcf3-477e-b014-4b304c0be557" (UID: "b538d385-dcf3-477e-b014-4b304c0be557"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.351927 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b538d385-dcf3-477e-b014-4b304c0be557-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.352285 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b538d385-dcf3-477e-b014-4b304c0be557-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.352303 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkcsg\" (UniqueName: \"kubernetes.io/projected/b538d385-dcf3-477e-b014-4b304c0be557-kube-api-access-dkcsg\") on node \"crc\" DevicePath \"\"" Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.529559 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q7b5k"] Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.777746 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-t97xr" event={"ID":"b538d385-dcf3-477e-b014-4b304c0be557","Type":"ContainerDied","Data":"b665653457a60b0bb3961dd4b753af7104563401957c5ffdaa9eb43f51f8cf7b"} Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.777796 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-t97xr" Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.777797 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b665653457a60b0bb3961dd4b753af7104563401957c5ffdaa9eb43f51f8cf7b" Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.778093 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q7b5k" podUID="482d1d76-467c-4222-b6da-f8608636656e" containerName="registry-server" containerID="cri-o://ac4eca77ae148d58e284ebc235a6d9b3b86844e0ea18fbfa1fc2b2b4474583b0" gracePeriod=2 Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.860690 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6"] Mar 13 09:36:45 crc kubenswrapper[4841]: E0313 09:36:45.861163 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b538d385-dcf3-477e-b014-4b304c0be557" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.861182 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b538d385-dcf3-477e-b014-4b304c0be557" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.861417 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b538d385-dcf3-477e-b014-4b304c0be557" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.862193 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6" Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.867014 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.867064 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.867331 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.867340 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rv6jt" Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.880835 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6"] Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.963346 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3c0bc1a-b192-44f6-a237-9242d36513ce-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6\" (UID: \"c3c0bc1a-b192-44f6-a237-9242d36513ce\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6" Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.963408 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmdv5\" (UniqueName: \"kubernetes.io/projected/c3c0bc1a-b192-44f6-a237-9242d36513ce-kube-api-access-wmdv5\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6\" (UID: \"c3c0bc1a-b192-44f6-a237-9242d36513ce\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6" Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.963433 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3c0bc1a-b192-44f6-a237-9242d36513ce-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6\" (UID: \"c3c0bc1a-b192-44f6-a237-9242d36513ce\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6" Mar 13 09:36:45 crc kubenswrapper[4841]: I0313 09:36:45.963468 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c3c0bc1a-b192-44f6-a237-9242d36513ce-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6\" (UID: \"c3c0bc1a-b192-44f6-a237-9242d36513ce\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.065253 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3c0bc1a-b192-44f6-a237-9242d36513ce-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6\" (UID: \"c3c0bc1a-b192-44f6-a237-9242d36513ce\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.065391 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmdv5\" (UniqueName: \"kubernetes.io/projected/c3c0bc1a-b192-44f6-a237-9242d36513ce-kube-api-access-wmdv5\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6\" (UID: \"c3c0bc1a-b192-44f6-a237-9242d36513ce\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.065434 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c3c0bc1a-b192-44f6-a237-9242d36513ce-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6\" (UID: \"c3c0bc1a-b192-44f6-a237-9242d36513ce\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.065524 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c3c0bc1a-b192-44f6-a237-9242d36513ce-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6\" (UID: \"c3c0bc1a-b192-44f6-a237-9242d36513ce\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.070330 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3c0bc1a-b192-44f6-a237-9242d36513ce-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6\" (UID: \"c3c0bc1a-b192-44f6-a237-9242d36513ce\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.070350 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c3c0bc1a-b192-44f6-a237-9242d36513ce-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6\" (UID: \"c3c0bc1a-b192-44f6-a237-9242d36513ce\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.072016 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3c0bc1a-b192-44f6-a237-9242d36513ce-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6\" (UID: \"c3c0bc1a-b192-44f6-a237-9242d36513ce\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.085240 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmdv5\" (UniqueName: \"kubernetes.io/projected/c3c0bc1a-b192-44f6-a237-9242d36513ce-kube-api-access-wmdv5\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6\" (UID: \"c3c0bc1a-b192-44f6-a237-9242d36513ce\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.194708 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q7b5k" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.260906 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.269199 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/482d1d76-467c-4222-b6da-f8608636656e-utilities\") pod \"482d1d76-467c-4222-b6da-f8608636656e\" (UID: \"482d1d76-467c-4222-b6da-f8608636656e\") " Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.269289 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvdv4\" (UniqueName: \"kubernetes.io/projected/482d1d76-467c-4222-b6da-f8608636656e-kube-api-access-nvdv4\") pod \"482d1d76-467c-4222-b6da-f8608636656e\" (UID: \"482d1d76-467c-4222-b6da-f8608636656e\") " Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.269567 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/482d1d76-467c-4222-b6da-f8608636656e-catalog-content\") pod \"482d1d76-467c-4222-b6da-f8608636656e\" (UID: \"482d1d76-467c-4222-b6da-f8608636656e\") " Mar 13 09:36:46 crc 
kubenswrapper[4841]: I0313 09:36:46.271929 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/482d1d76-467c-4222-b6da-f8608636656e-utilities" (OuterVolumeSpecName: "utilities") pod "482d1d76-467c-4222-b6da-f8608636656e" (UID: "482d1d76-467c-4222-b6da-f8608636656e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.274215 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/482d1d76-467c-4222-b6da-f8608636656e-kube-api-access-nvdv4" (OuterVolumeSpecName: "kube-api-access-nvdv4") pod "482d1d76-467c-4222-b6da-f8608636656e" (UID: "482d1d76-467c-4222-b6da-f8608636656e"). InnerVolumeSpecName "kube-api-access-nvdv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.374491 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/482d1d76-467c-4222-b6da-f8608636656e-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.374517 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvdv4\" (UniqueName: \"kubernetes.io/projected/482d1d76-467c-4222-b6da-f8608636656e-kube-api-access-nvdv4\") on node \"crc\" DevicePath \"\"" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.470963 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/482d1d76-467c-4222-b6da-f8608636656e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "482d1d76-467c-4222-b6da-f8608636656e" (UID: "482d1d76-467c-4222-b6da-f8608636656e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.476962 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/482d1d76-467c-4222-b6da-f8608636656e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.797048 4841 generic.go:334] "Generic (PLEG): container finished" podID="482d1d76-467c-4222-b6da-f8608636656e" containerID="ac4eca77ae148d58e284ebc235a6d9b3b86844e0ea18fbfa1fc2b2b4474583b0" exitCode=0 Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.797411 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7b5k" event={"ID":"482d1d76-467c-4222-b6da-f8608636656e","Type":"ContainerDied","Data":"ac4eca77ae148d58e284ebc235a6d9b3b86844e0ea18fbfa1fc2b2b4474583b0"} Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.797447 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7b5k" event={"ID":"482d1d76-467c-4222-b6da-f8608636656e","Type":"ContainerDied","Data":"338b10113a7dc6e90f7350ece75ba34b32dc4fdea711bb8e33f27c335dbb9153"} Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.797472 4841 scope.go:117] "RemoveContainer" containerID="ac4eca77ae148d58e284ebc235a6d9b3b86844e0ea18fbfa1fc2b2b4474583b0" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.797670 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6"] Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.797680 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q7b5k" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.857608 4841 scope.go:117] "RemoveContainer" containerID="733980a20db22f20f0cb0aacb87aa532babd0a89fe586e19c46774a91b264e60" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.863386 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q7b5k"] Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.873515 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q7b5k"] Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.885453 4841 scope.go:117] "RemoveContainer" containerID="732df105d0cecc5af35a52a05d77005f8e578173e6075cc8e2d23f528da835cc" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.904112 4841 scope.go:117] "RemoveContainer" containerID="ac4eca77ae148d58e284ebc235a6d9b3b86844e0ea18fbfa1fc2b2b4474583b0" Mar 13 09:36:46 crc kubenswrapper[4841]: E0313 09:36:46.904573 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac4eca77ae148d58e284ebc235a6d9b3b86844e0ea18fbfa1fc2b2b4474583b0\": container with ID starting with ac4eca77ae148d58e284ebc235a6d9b3b86844e0ea18fbfa1fc2b2b4474583b0 not found: ID does not exist" containerID="ac4eca77ae148d58e284ebc235a6d9b3b86844e0ea18fbfa1fc2b2b4474583b0" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.904647 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac4eca77ae148d58e284ebc235a6d9b3b86844e0ea18fbfa1fc2b2b4474583b0"} err="failed to get container status \"ac4eca77ae148d58e284ebc235a6d9b3b86844e0ea18fbfa1fc2b2b4474583b0\": rpc error: code = NotFound desc = could not find container \"ac4eca77ae148d58e284ebc235a6d9b3b86844e0ea18fbfa1fc2b2b4474583b0\": container with ID starting with ac4eca77ae148d58e284ebc235a6d9b3b86844e0ea18fbfa1fc2b2b4474583b0 not found: ID does 
not exist" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.904670 4841 scope.go:117] "RemoveContainer" containerID="733980a20db22f20f0cb0aacb87aa532babd0a89fe586e19c46774a91b264e60" Mar 13 09:36:46 crc kubenswrapper[4841]: E0313 09:36:46.905017 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"733980a20db22f20f0cb0aacb87aa532babd0a89fe586e19c46774a91b264e60\": container with ID starting with 733980a20db22f20f0cb0aacb87aa532babd0a89fe586e19c46774a91b264e60 not found: ID does not exist" containerID="733980a20db22f20f0cb0aacb87aa532babd0a89fe586e19c46774a91b264e60" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.905070 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"733980a20db22f20f0cb0aacb87aa532babd0a89fe586e19c46774a91b264e60"} err="failed to get container status \"733980a20db22f20f0cb0aacb87aa532babd0a89fe586e19c46774a91b264e60\": rpc error: code = NotFound desc = could not find container \"733980a20db22f20f0cb0aacb87aa532babd0a89fe586e19c46774a91b264e60\": container with ID starting with 733980a20db22f20f0cb0aacb87aa532babd0a89fe586e19c46774a91b264e60 not found: ID does not exist" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.905105 4841 scope.go:117] "RemoveContainer" containerID="732df105d0cecc5af35a52a05d77005f8e578173e6075cc8e2d23f528da835cc" Mar 13 09:36:46 crc kubenswrapper[4841]: E0313 09:36:46.905553 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"732df105d0cecc5af35a52a05d77005f8e578173e6075cc8e2d23f528da835cc\": container with ID starting with 732df105d0cecc5af35a52a05d77005f8e578173e6075cc8e2d23f528da835cc not found: ID does not exist" containerID="732df105d0cecc5af35a52a05d77005f8e578173e6075cc8e2d23f528da835cc" Mar 13 09:36:46 crc kubenswrapper[4841]: I0313 09:36:46.905601 4841 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"732df105d0cecc5af35a52a05d77005f8e578173e6075cc8e2d23f528da835cc"} err="failed to get container status \"732df105d0cecc5af35a52a05d77005f8e578173e6075cc8e2d23f528da835cc\": rpc error: code = NotFound desc = could not find container \"732df105d0cecc5af35a52a05d77005f8e578173e6075cc8e2d23f528da835cc\": container with ID starting with 732df105d0cecc5af35a52a05d77005f8e578173e6075cc8e2d23f528da835cc not found: ID does not exist" Mar 13 09:36:47 crc kubenswrapper[4841]: I0313 09:36:47.808070 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6" event={"ID":"c3c0bc1a-b192-44f6-a237-9242d36513ce","Type":"ContainerStarted","Data":"3ac9f3a2499944dc2618f216721ede7a97497cc62367b485843d62ed6b6fe018"} Mar 13 09:36:47 crc kubenswrapper[4841]: I0313 09:36:47.808330 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6" event={"ID":"c3c0bc1a-b192-44f6-a237-9242d36513ce","Type":"ContainerStarted","Data":"885fce84d177ff1dae3e36161f61e43d3972c20d9eb3a9c19c88e03492df62a1"} Mar 13 09:36:47 crc kubenswrapper[4841]: I0313 09:36:47.826609 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6" podStartSLOduration=2.406222167 podStartE2EDuration="2.826584183s" podCreationTimestamp="2026-03-13 09:36:45 +0000 UTC" firstStartedPulling="2026-03-13 09:36:46.801713056 +0000 UTC m=+1489.531613247" lastFinishedPulling="2026-03-13 09:36:47.222075072 +0000 UTC m=+1489.951975263" observedRunningTime="2026-03-13 09:36:47.821814194 +0000 UTC m=+1490.551714415" watchObservedRunningTime="2026-03-13 09:36:47.826584183 +0000 UTC m=+1490.556484374" Mar 13 09:36:48 crc kubenswrapper[4841]: I0313 09:36:48.007943 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="482d1d76-467c-4222-b6da-f8608636656e" 
path="/var/lib/kubelet/pods/482d1d76-467c-4222-b6da-f8608636656e/volumes" Mar 13 09:36:48 crc kubenswrapper[4841]: I0313 09:36:48.075671 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gmhqg" Mar 13 09:36:48 crc kubenswrapper[4841]: I0313 09:36:48.075735 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gmhqg" Mar 13 09:36:48 crc kubenswrapper[4841]: I0313 09:36:48.144167 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gmhqg" Mar 13 09:36:48 crc kubenswrapper[4841]: I0313 09:36:48.880482 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gmhqg" Mar 13 09:36:49 crc kubenswrapper[4841]: I0313 09:36:49.727829 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gmhqg"] Mar 13 09:36:50 crc kubenswrapper[4841]: I0313 09:36:50.845888 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gmhqg" podUID="f1e82bc6-797b-4b71-9413-014bfe7f7e3c" containerName="registry-server" containerID="cri-o://e603acc4326f42db7af940d886859e57ea81ffac2b1cb53d1c3e68ee000b0a33" gracePeriod=2 Mar 13 09:36:51 crc kubenswrapper[4841]: I0313 09:36:51.319796 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gmhqg" Mar 13 09:36:51 crc kubenswrapper[4841]: I0313 09:36:51.493151 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1e82bc6-797b-4b71-9413-014bfe7f7e3c-utilities\") pod \"f1e82bc6-797b-4b71-9413-014bfe7f7e3c\" (UID: \"f1e82bc6-797b-4b71-9413-014bfe7f7e3c\") " Mar 13 09:36:51 crc kubenswrapper[4841]: I0313 09:36:51.493504 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qkxl\" (UniqueName: \"kubernetes.io/projected/f1e82bc6-797b-4b71-9413-014bfe7f7e3c-kube-api-access-4qkxl\") pod \"f1e82bc6-797b-4b71-9413-014bfe7f7e3c\" (UID: \"f1e82bc6-797b-4b71-9413-014bfe7f7e3c\") " Mar 13 09:36:51 crc kubenswrapper[4841]: I0313 09:36:51.493567 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1e82bc6-797b-4b71-9413-014bfe7f7e3c-catalog-content\") pod \"f1e82bc6-797b-4b71-9413-014bfe7f7e3c\" (UID: \"f1e82bc6-797b-4b71-9413-014bfe7f7e3c\") " Mar 13 09:36:51 crc kubenswrapper[4841]: I0313 09:36:51.494490 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1e82bc6-797b-4b71-9413-014bfe7f7e3c-utilities" (OuterVolumeSpecName: "utilities") pod "f1e82bc6-797b-4b71-9413-014bfe7f7e3c" (UID: "f1e82bc6-797b-4b71-9413-014bfe7f7e3c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:36:51 crc kubenswrapper[4841]: I0313 09:36:51.499721 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1e82bc6-797b-4b71-9413-014bfe7f7e3c-kube-api-access-4qkxl" (OuterVolumeSpecName: "kube-api-access-4qkxl") pod "f1e82bc6-797b-4b71-9413-014bfe7f7e3c" (UID: "f1e82bc6-797b-4b71-9413-014bfe7f7e3c"). InnerVolumeSpecName "kube-api-access-4qkxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:36:51 crc kubenswrapper[4841]: I0313 09:36:51.597051 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1e82bc6-797b-4b71-9413-014bfe7f7e3c-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:36:51 crc kubenswrapper[4841]: I0313 09:36:51.597079 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qkxl\" (UniqueName: \"kubernetes.io/projected/f1e82bc6-797b-4b71-9413-014bfe7f7e3c-kube-api-access-4qkxl\") on node \"crc\" DevicePath \"\"" Mar 13 09:36:51 crc kubenswrapper[4841]: I0313 09:36:51.866402 4841 generic.go:334] "Generic (PLEG): container finished" podID="f1e82bc6-797b-4b71-9413-014bfe7f7e3c" containerID="e603acc4326f42db7af940d886859e57ea81ffac2b1cb53d1c3e68ee000b0a33" exitCode=0 Mar 13 09:36:51 crc kubenswrapper[4841]: I0313 09:36:51.866538 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gmhqg" Mar 13 09:36:51 crc kubenswrapper[4841]: I0313 09:36:51.868356 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gmhqg" event={"ID":"f1e82bc6-797b-4b71-9413-014bfe7f7e3c","Type":"ContainerDied","Data":"e603acc4326f42db7af940d886859e57ea81ffac2b1cb53d1c3e68ee000b0a33"} Mar 13 09:36:51 crc kubenswrapper[4841]: I0313 09:36:51.868559 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gmhqg" event={"ID":"f1e82bc6-797b-4b71-9413-014bfe7f7e3c","Type":"ContainerDied","Data":"17575f98ec435da6afba123d4ce754b52e7013727d48e9c91449afcc0f263eec"} Mar 13 09:36:51 crc kubenswrapper[4841]: I0313 09:36:51.868648 4841 scope.go:117] "RemoveContainer" containerID="e603acc4326f42db7af940d886859e57ea81ffac2b1cb53d1c3e68ee000b0a33" Mar 13 09:36:51 crc kubenswrapper[4841]: I0313 09:36:51.939448 4841 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/f1e82bc6-797b-4b71-9413-014bfe7f7e3c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1e82bc6-797b-4b71-9413-014bfe7f7e3c" (UID: "f1e82bc6-797b-4b71-9413-014bfe7f7e3c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:36:51 crc kubenswrapper[4841]: I0313 09:36:51.945601 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1e82bc6-797b-4b71-9413-014bfe7f7e3c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:36:51 crc kubenswrapper[4841]: I0313 09:36:51.952397 4841 scope.go:117] "RemoveContainer" containerID="7288aa6b38550015e2704010267e6e6fc6c2d978015f5c2aeee1c59b1d2d5b62" Mar 13 09:36:52 crc kubenswrapper[4841]: I0313 09:36:52.057427 4841 scope.go:117] "RemoveContainer" containerID="32cdf49a38ca3e0aec210bcbd409088dea83e45e5e7d6bc55ede6e11fbcf0fef" Mar 13 09:36:52 crc kubenswrapper[4841]: I0313 09:36:52.126534 4841 scope.go:117] "RemoveContainer" containerID="e603acc4326f42db7af940d886859e57ea81ffac2b1cb53d1c3e68ee000b0a33" Mar 13 09:36:52 crc kubenswrapper[4841]: E0313 09:36:52.127357 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e603acc4326f42db7af940d886859e57ea81ffac2b1cb53d1c3e68ee000b0a33\": container with ID starting with e603acc4326f42db7af940d886859e57ea81ffac2b1cb53d1c3e68ee000b0a33 not found: ID does not exist" containerID="e603acc4326f42db7af940d886859e57ea81ffac2b1cb53d1c3e68ee000b0a33" Mar 13 09:36:52 crc kubenswrapper[4841]: I0313 09:36:52.127388 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e603acc4326f42db7af940d886859e57ea81ffac2b1cb53d1c3e68ee000b0a33"} err="failed to get container status \"e603acc4326f42db7af940d886859e57ea81ffac2b1cb53d1c3e68ee000b0a33\": rpc error: code = NotFound desc = could not find container 
\"e603acc4326f42db7af940d886859e57ea81ffac2b1cb53d1c3e68ee000b0a33\": container with ID starting with e603acc4326f42db7af940d886859e57ea81ffac2b1cb53d1c3e68ee000b0a33 not found: ID does not exist" Mar 13 09:36:52 crc kubenswrapper[4841]: I0313 09:36:52.127411 4841 scope.go:117] "RemoveContainer" containerID="7288aa6b38550015e2704010267e6e6fc6c2d978015f5c2aeee1c59b1d2d5b62" Mar 13 09:36:52 crc kubenswrapper[4841]: E0313 09:36:52.127805 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7288aa6b38550015e2704010267e6e6fc6c2d978015f5c2aeee1c59b1d2d5b62\": container with ID starting with 7288aa6b38550015e2704010267e6e6fc6c2d978015f5c2aeee1c59b1d2d5b62 not found: ID does not exist" containerID="7288aa6b38550015e2704010267e6e6fc6c2d978015f5c2aeee1c59b1d2d5b62" Mar 13 09:36:52 crc kubenswrapper[4841]: I0313 09:36:52.127858 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7288aa6b38550015e2704010267e6e6fc6c2d978015f5c2aeee1c59b1d2d5b62"} err="failed to get container status \"7288aa6b38550015e2704010267e6e6fc6c2d978015f5c2aeee1c59b1d2d5b62\": rpc error: code = NotFound desc = could not find container \"7288aa6b38550015e2704010267e6e6fc6c2d978015f5c2aeee1c59b1d2d5b62\": container with ID starting with 7288aa6b38550015e2704010267e6e6fc6c2d978015f5c2aeee1c59b1d2d5b62 not found: ID does not exist" Mar 13 09:36:52 crc kubenswrapper[4841]: I0313 09:36:52.127894 4841 scope.go:117] "RemoveContainer" containerID="32cdf49a38ca3e0aec210bcbd409088dea83e45e5e7d6bc55ede6e11fbcf0fef" Mar 13 09:36:52 crc kubenswrapper[4841]: E0313 09:36:52.128217 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32cdf49a38ca3e0aec210bcbd409088dea83e45e5e7d6bc55ede6e11fbcf0fef\": container with ID starting with 32cdf49a38ca3e0aec210bcbd409088dea83e45e5e7d6bc55ede6e11fbcf0fef not found: ID does not exist" 
containerID="32cdf49a38ca3e0aec210bcbd409088dea83e45e5e7d6bc55ede6e11fbcf0fef" Mar 13 09:36:52 crc kubenswrapper[4841]: I0313 09:36:52.128243 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32cdf49a38ca3e0aec210bcbd409088dea83e45e5e7d6bc55ede6e11fbcf0fef"} err="failed to get container status \"32cdf49a38ca3e0aec210bcbd409088dea83e45e5e7d6bc55ede6e11fbcf0fef\": rpc error: code = NotFound desc = could not find container \"32cdf49a38ca3e0aec210bcbd409088dea83e45e5e7d6bc55ede6e11fbcf0fef\": container with ID starting with 32cdf49a38ca3e0aec210bcbd409088dea83e45e5e7d6bc55ede6e11fbcf0fef not found: ID does not exist" Mar 13 09:36:52 crc kubenswrapper[4841]: I0313 09:36:52.190294 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gmhqg"] Mar 13 09:36:52 crc kubenswrapper[4841]: I0313 09:36:52.199290 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gmhqg"] Mar 13 09:36:54 crc kubenswrapper[4841]: I0313 09:36:54.016094 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1e82bc6-797b-4b71-9413-014bfe7f7e3c" path="/var/lib/kubelet/pods/f1e82bc6-797b-4b71-9413-014bfe7f7e3c/volumes" Mar 13 09:37:00 crc kubenswrapper[4841]: I0313 09:37:00.319857 4841 scope.go:117] "RemoveContainer" containerID="1d18fee47ef8a12f0ed4e312e138d69c819036bfe89803414f2e9ebb14fb7861" Mar 13 09:37:00 crc kubenswrapper[4841]: I0313 09:37:00.386847 4841 scope.go:117] "RemoveContainer" containerID="6d2d0eef62732abc02d17c8a69f28ec235599c13005aeaa02128dde3b7345457" Mar 13 09:37:00 crc kubenswrapper[4841]: I0313 09:37:00.480951 4841 scope.go:117] "RemoveContainer" containerID="8eb6ff886c84daefc19ad53048a04087959845a3c06e98ed0685d2cf5ed764ab" Mar 13 09:37:00 crc kubenswrapper[4841]: I0313 09:37:00.518554 4841 scope.go:117] "RemoveContainer" containerID="18b82d7d667bf5f2cb4a38afb34a41c960cfe5ff6e84964c33a38c6b1d742611" Mar 13 09:37:34 
crc kubenswrapper[4841]: I0313 09:37:34.407969 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:37:34 crc kubenswrapper[4841]: I0313 09:37:34.408691 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:37:57 crc kubenswrapper[4841]: I0313 09:37:57.135963 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zsn2m"] Mar 13 09:37:57 crc kubenswrapper[4841]: E0313 09:37:57.136975 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="482d1d76-467c-4222-b6da-f8608636656e" containerName="extract-utilities" Mar 13 09:37:57 crc kubenswrapper[4841]: I0313 09:37:57.136989 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="482d1d76-467c-4222-b6da-f8608636656e" containerName="extract-utilities" Mar 13 09:37:57 crc kubenswrapper[4841]: E0313 09:37:57.137005 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1e82bc6-797b-4b71-9413-014bfe7f7e3c" containerName="registry-server" Mar 13 09:37:57 crc kubenswrapper[4841]: I0313 09:37:57.137011 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1e82bc6-797b-4b71-9413-014bfe7f7e3c" containerName="registry-server" Mar 13 09:37:57 crc kubenswrapper[4841]: E0313 09:37:57.137033 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1e82bc6-797b-4b71-9413-014bfe7f7e3c" containerName="extract-utilities" Mar 13 09:37:57 crc kubenswrapper[4841]: I0313 09:37:57.137040 4841 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f1e82bc6-797b-4b71-9413-014bfe7f7e3c" containerName="extract-utilities" Mar 13 09:37:57 crc kubenswrapper[4841]: E0313 09:37:57.137056 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="482d1d76-467c-4222-b6da-f8608636656e" containerName="registry-server" Mar 13 09:37:57 crc kubenswrapper[4841]: I0313 09:37:57.137062 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="482d1d76-467c-4222-b6da-f8608636656e" containerName="registry-server" Mar 13 09:37:57 crc kubenswrapper[4841]: E0313 09:37:57.137070 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="482d1d76-467c-4222-b6da-f8608636656e" containerName="extract-content" Mar 13 09:37:57 crc kubenswrapper[4841]: I0313 09:37:57.137075 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="482d1d76-467c-4222-b6da-f8608636656e" containerName="extract-content" Mar 13 09:37:57 crc kubenswrapper[4841]: E0313 09:37:57.137097 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1e82bc6-797b-4b71-9413-014bfe7f7e3c" containerName="extract-content" Mar 13 09:37:57 crc kubenswrapper[4841]: I0313 09:37:57.137103 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1e82bc6-797b-4b71-9413-014bfe7f7e3c" containerName="extract-content" Mar 13 09:37:57 crc kubenswrapper[4841]: I0313 09:37:57.137303 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="482d1d76-467c-4222-b6da-f8608636656e" containerName="registry-server" Mar 13 09:37:57 crc kubenswrapper[4841]: I0313 09:37:57.137315 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1e82bc6-797b-4b71-9413-014bfe7f7e3c" containerName="registry-server" Mar 13 09:37:57 crc kubenswrapper[4841]: I0313 09:37:57.140302 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zsn2m" Mar 13 09:37:57 crc kubenswrapper[4841]: I0313 09:37:57.152450 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zsn2m"] Mar 13 09:37:57 crc kubenswrapper[4841]: I0313 09:37:57.275778 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c208d4fa-e4f3-4985-9a3a-571eb08bb9cb-catalog-content\") pod \"redhat-marketplace-zsn2m\" (UID: \"c208d4fa-e4f3-4985-9a3a-571eb08bb9cb\") " pod="openshift-marketplace/redhat-marketplace-zsn2m" Mar 13 09:37:57 crc kubenswrapper[4841]: I0313 09:37:57.275881 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c208d4fa-e4f3-4985-9a3a-571eb08bb9cb-utilities\") pod \"redhat-marketplace-zsn2m\" (UID: \"c208d4fa-e4f3-4985-9a3a-571eb08bb9cb\") " pod="openshift-marketplace/redhat-marketplace-zsn2m" Mar 13 09:37:57 crc kubenswrapper[4841]: I0313 09:37:57.275909 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vk7f\" (UniqueName: \"kubernetes.io/projected/c208d4fa-e4f3-4985-9a3a-571eb08bb9cb-kube-api-access-9vk7f\") pod \"redhat-marketplace-zsn2m\" (UID: \"c208d4fa-e4f3-4985-9a3a-571eb08bb9cb\") " pod="openshift-marketplace/redhat-marketplace-zsn2m" Mar 13 09:37:57 crc kubenswrapper[4841]: I0313 09:37:57.377485 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c208d4fa-e4f3-4985-9a3a-571eb08bb9cb-catalog-content\") pod \"redhat-marketplace-zsn2m\" (UID: \"c208d4fa-e4f3-4985-9a3a-571eb08bb9cb\") " pod="openshift-marketplace/redhat-marketplace-zsn2m" Mar 13 09:37:57 crc kubenswrapper[4841]: I0313 09:37:57.377564 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c208d4fa-e4f3-4985-9a3a-571eb08bb9cb-utilities\") pod \"redhat-marketplace-zsn2m\" (UID: \"c208d4fa-e4f3-4985-9a3a-571eb08bb9cb\") " pod="openshift-marketplace/redhat-marketplace-zsn2m" Mar 13 09:37:57 crc kubenswrapper[4841]: I0313 09:37:57.377588 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vk7f\" (UniqueName: \"kubernetes.io/projected/c208d4fa-e4f3-4985-9a3a-571eb08bb9cb-kube-api-access-9vk7f\") pod \"redhat-marketplace-zsn2m\" (UID: \"c208d4fa-e4f3-4985-9a3a-571eb08bb9cb\") " pod="openshift-marketplace/redhat-marketplace-zsn2m" Mar 13 09:37:57 crc kubenswrapper[4841]: I0313 09:37:57.377954 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c208d4fa-e4f3-4985-9a3a-571eb08bb9cb-catalog-content\") pod \"redhat-marketplace-zsn2m\" (UID: \"c208d4fa-e4f3-4985-9a3a-571eb08bb9cb\") " pod="openshift-marketplace/redhat-marketplace-zsn2m" Mar 13 09:37:57 crc kubenswrapper[4841]: I0313 09:37:57.378091 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c208d4fa-e4f3-4985-9a3a-571eb08bb9cb-utilities\") pod \"redhat-marketplace-zsn2m\" (UID: \"c208d4fa-e4f3-4985-9a3a-571eb08bb9cb\") " pod="openshift-marketplace/redhat-marketplace-zsn2m" Mar 13 09:37:57 crc kubenswrapper[4841]: I0313 09:37:57.399708 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vk7f\" (UniqueName: \"kubernetes.io/projected/c208d4fa-e4f3-4985-9a3a-571eb08bb9cb-kube-api-access-9vk7f\") pod \"redhat-marketplace-zsn2m\" (UID: \"c208d4fa-e4f3-4985-9a3a-571eb08bb9cb\") " pod="openshift-marketplace/redhat-marketplace-zsn2m" Mar 13 09:37:57 crc kubenswrapper[4841]: I0313 09:37:57.479740 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zsn2m" Mar 13 09:37:57 crc kubenswrapper[4841]: I0313 09:37:57.968478 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zsn2m"] Mar 13 09:37:58 crc kubenswrapper[4841]: I0313 09:37:58.682563 4841 generic.go:334] "Generic (PLEG): container finished" podID="c208d4fa-e4f3-4985-9a3a-571eb08bb9cb" containerID="87ba8f99349c58cba5d8c80d3c5390dd47165393b2e4540caf997fc27b377e4b" exitCode=0 Mar 13 09:37:58 crc kubenswrapper[4841]: I0313 09:37:58.682703 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zsn2m" event={"ID":"c208d4fa-e4f3-4985-9a3a-571eb08bb9cb","Type":"ContainerDied","Data":"87ba8f99349c58cba5d8c80d3c5390dd47165393b2e4540caf997fc27b377e4b"} Mar 13 09:37:58 crc kubenswrapper[4841]: I0313 09:37:58.682856 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zsn2m" event={"ID":"c208d4fa-e4f3-4985-9a3a-571eb08bb9cb","Type":"ContainerStarted","Data":"23663bad70b75e4cfd508a428591be76073957f48bd0a2c87d162b5cd337392c"} Mar 13 09:37:58 crc kubenswrapper[4841]: I0313 09:37:58.685191 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 09:37:59 crc kubenswrapper[4841]: I0313 09:37:59.695118 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zsn2m" event={"ID":"c208d4fa-e4f3-4985-9a3a-571eb08bb9cb","Type":"ContainerStarted","Data":"7d5c495e8e3a9dc2a40f102d653ad25a0acb267820ae36bae8d58e0028f6640e"} Mar 13 09:38:00 crc kubenswrapper[4841]: I0313 09:38:00.169176 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556578-bwlnc"] Mar 13 09:38:00 crc kubenswrapper[4841]: I0313 09:38:00.171613 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556578-bwlnc" Mar 13 09:38:00 crc kubenswrapper[4841]: I0313 09:38:00.175768 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 09:38:00 crc kubenswrapper[4841]: I0313 09:38:00.176402 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 09:38:00 crc kubenswrapper[4841]: I0313 09:38:00.183592 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 09:38:00 crc kubenswrapper[4841]: I0313 09:38:00.214716 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556578-bwlnc"] Mar 13 09:38:00 crc kubenswrapper[4841]: I0313 09:38:00.247797 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7x2f\" (UniqueName: \"kubernetes.io/projected/6acf170a-aaee-446b-be41-ababb7b80365-kube-api-access-p7x2f\") pod \"auto-csr-approver-29556578-bwlnc\" (UID: \"6acf170a-aaee-446b-be41-ababb7b80365\") " pod="openshift-infra/auto-csr-approver-29556578-bwlnc" Mar 13 09:38:00 crc kubenswrapper[4841]: I0313 09:38:00.349927 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7x2f\" (UniqueName: \"kubernetes.io/projected/6acf170a-aaee-446b-be41-ababb7b80365-kube-api-access-p7x2f\") pod \"auto-csr-approver-29556578-bwlnc\" (UID: \"6acf170a-aaee-446b-be41-ababb7b80365\") " pod="openshift-infra/auto-csr-approver-29556578-bwlnc" Mar 13 09:38:00 crc kubenswrapper[4841]: I0313 09:38:00.373776 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7x2f\" (UniqueName: \"kubernetes.io/projected/6acf170a-aaee-446b-be41-ababb7b80365-kube-api-access-p7x2f\") pod \"auto-csr-approver-29556578-bwlnc\" (UID: \"6acf170a-aaee-446b-be41-ababb7b80365\") " 
pod="openshift-infra/auto-csr-approver-29556578-bwlnc" Mar 13 09:38:00 crc kubenswrapper[4841]: I0313 09:38:00.504884 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556578-bwlnc" Mar 13 09:38:00 crc kubenswrapper[4841]: I0313 09:38:00.667113 4841 scope.go:117] "RemoveContainer" containerID="fcc0f5be86f2b0b69e3cb23458c7bfde9de25b14095a0935003dd99ab7c1dc29" Mar 13 09:38:00 crc kubenswrapper[4841]: I0313 09:38:00.722462 4841 generic.go:334] "Generic (PLEG): container finished" podID="c208d4fa-e4f3-4985-9a3a-571eb08bb9cb" containerID="7d5c495e8e3a9dc2a40f102d653ad25a0acb267820ae36bae8d58e0028f6640e" exitCode=0 Mar 13 09:38:00 crc kubenswrapper[4841]: I0313 09:38:00.722497 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zsn2m" event={"ID":"c208d4fa-e4f3-4985-9a3a-571eb08bb9cb","Type":"ContainerDied","Data":"7d5c495e8e3a9dc2a40f102d653ad25a0acb267820ae36bae8d58e0028f6640e"} Mar 13 09:38:00 crc kubenswrapper[4841]: I0313 09:38:00.743414 4841 scope.go:117] "RemoveContainer" containerID="284cd8c49caa3ae8b3d51e24c6a7d55ff566d1bfba2a5b7b1bcfd1c27bcac6fa" Mar 13 09:38:00 crc kubenswrapper[4841]: I0313 09:38:00.795373 4841 scope.go:117] "RemoveContainer" containerID="db01b472907116df9f6b9a6ef92c23f89714edd78737bae1dcc5ecaba5c9325e" Mar 13 09:38:00 crc kubenswrapper[4841]: I0313 09:38:00.989893 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556578-bwlnc"] Mar 13 09:38:00 crc kubenswrapper[4841]: W0313 09:38:00.996832 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6acf170a_aaee_446b_be41_ababb7b80365.slice/crio-44c7a95ed42367e5994d0527fb5c71c39b3cde834e76aa5cac096f760eb9a54d WatchSource:0}: Error finding container 44c7a95ed42367e5994d0527fb5c71c39b3cde834e76aa5cac096f760eb9a54d: Status 404 returned error can't find the 
container with id 44c7a95ed42367e5994d0527fb5c71c39b3cde834e76aa5cac096f760eb9a54d Mar 13 09:38:01 crc kubenswrapper[4841]: I0313 09:38:01.734075 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zsn2m" event={"ID":"c208d4fa-e4f3-4985-9a3a-571eb08bb9cb","Type":"ContainerStarted","Data":"7b48b3be7658a0228cd543439ddd97087235f06b05460e9287e97554c53ff285"} Mar 13 09:38:01 crc kubenswrapper[4841]: I0313 09:38:01.738061 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556578-bwlnc" event={"ID":"6acf170a-aaee-446b-be41-ababb7b80365","Type":"ContainerStarted","Data":"44c7a95ed42367e5994d0527fb5c71c39b3cde834e76aa5cac096f760eb9a54d"} Mar 13 09:38:01 crc kubenswrapper[4841]: I0313 09:38:01.759109 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zsn2m" podStartSLOduration=2.20147139 podStartE2EDuration="4.759092392s" podCreationTimestamp="2026-03-13 09:37:57 +0000 UTC" firstStartedPulling="2026-03-13 09:37:58.684906735 +0000 UTC m=+1561.414806936" lastFinishedPulling="2026-03-13 09:38:01.242527727 +0000 UTC m=+1563.972427938" observedRunningTime="2026-03-13 09:38:01.752593724 +0000 UTC m=+1564.482493935" watchObservedRunningTime="2026-03-13 09:38:01.759092392 +0000 UTC m=+1564.488992583" Mar 13 09:38:02 crc kubenswrapper[4841]: I0313 09:38:02.757288 4841 generic.go:334] "Generic (PLEG): container finished" podID="6acf170a-aaee-446b-be41-ababb7b80365" containerID="7147364091d2179cc7eb600f6bcff6ea373aaab950687515a0dce0a7af9d9277" exitCode=0 Mar 13 09:38:02 crc kubenswrapper[4841]: I0313 09:38:02.757404 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556578-bwlnc" event={"ID":"6acf170a-aaee-446b-be41-ababb7b80365","Type":"ContainerDied","Data":"7147364091d2179cc7eb600f6bcff6ea373aaab950687515a0dce0a7af9d9277"} Mar 13 09:38:04 crc kubenswrapper[4841]: I0313 
09:38:04.107422 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556578-bwlnc" Mar 13 09:38:04 crc kubenswrapper[4841]: I0313 09:38:04.227083 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7x2f\" (UniqueName: \"kubernetes.io/projected/6acf170a-aaee-446b-be41-ababb7b80365-kube-api-access-p7x2f\") pod \"6acf170a-aaee-446b-be41-ababb7b80365\" (UID: \"6acf170a-aaee-446b-be41-ababb7b80365\") " Mar 13 09:38:04 crc kubenswrapper[4841]: I0313 09:38:04.238407 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6acf170a-aaee-446b-be41-ababb7b80365-kube-api-access-p7x2f" (OuterVolumeSpecName: "kube-api-access-p7x2f") pod "6acf170a-aaee-446b-be41-ababb7b80365" (UID: "6acf170a-aaee-446b-be41-ababb7b80365"). InnerVolumeSpecName "kube-api-access-p7x2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:38:04 crc kubenswrapper[4841]: I0313 09:38:04.330398 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7x2f\" (UniqueName: \"kubernetes.io/projected/6acf170a-aaee-446b-be41-ababb7b80365-kube-api-access-p7x2f\") on node \"crc\" DevicePath \"\"" Mar 13 09:38:04 crc kubenswrapper[4841]: I0313 09:38:04.407212 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:38:04 crc kubenswrapper[4841]: I0313 09:38:04.407342 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Mar 13 09:38:04 crc kubenswrapper[4841]: I0313 09:38:04.778834 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556578-bwlnc" event={"ID":"6acf170a-aaee-446b-be41-ababb7b80365","Type":"ContainerDied","Data":"44c7a95ed42367e5994d0527fb5c71c39b3cde834e76aa5cac096f760eb9a54d"} Mar 13 09:38:04 crc kubenswrapper[4841]: I0313 09:38:04.778873 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44c7a95ed42367e5994d0527fb5c71c39b3cde834e76aa5cac096f760eb9a54d" Mar 13 09:38:04 crc kubenswrapper[4841]: I0313 09:38:04.778897 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556578-bwlnc" Mar 13 09:38:05 crc kubenswrapper[4841]: I0313 09:38:05.183406 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556572-j79gt"] Mar 13 09:38:05 crc kubenswrapper[4841]: I0313 09:38:05.192982 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556572-j79gt"] Mar 13 09:38:06 crc kubenswrapper[4841]: I0313 09:38:06.022845 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b49b5f8c-aca3-4a6f-9edc-cb8485f40c9b" path="/var/lib/kubelet/pods/b49b5f8c-aca3-4a6f-9edc-cb8485f40c9b/volumes" Mar 13 09:38:07 crc kubenswrapper[4841]: I0313 09:38:07.480151 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zsn2m" Mar 13 09:38:07 crc kubenswrapper[4841]: I0313 09:38:07.480600 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zsn2m" Mar 13 09:38:07 crc kubenswrapper[4841]: I0313 09:38:07.557685 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zsn2m" Mar 13 09:38:07 crc kubenswrapper[4841]: I0313 09:38:07.866872 4841 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zsn2m" Mar 13 09:38:07 crc kubenswrapper[4841]: I0313 09:38:07.949832 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zsn2m"] Mar 13 09:38:09 crc kubenswrapper[4841]: I0313 09:38:09.825971 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zsn2m" podUID="c208d4fa-e4f3-4985-9a3a-571eb08bb9cb" containerName="registry-server" containerID="cri-o://7b48b3be7658a0228cd543439ddd97087235f06b05460e9287e97554c53ff285" gracePeriod=2 Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.282665 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zsn2m" Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.367740 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c208d4fa-e4f3-4985-9a3a-571eb08bb9cb-utilities\") pod \"c208d4fa-e4f3-4985-9a3a-571eb08bb9cb\" (UID: \"c208d4fa-e4f3-4985-9a3a-571eb08bb9cb\") " Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.368177 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vk7f\" (UniqueName: \"kubernetes.io/projected/c208d4fa-e4f3-4985-9a3a-571eb08bb9cb-kube-api-access-9vk7f\") pod \"c208d4fa-e4f3-4985-9a3a-571eb08bb9cb\" (UID: \"c208d4fa-e4f3-4985-9a3a-571eb08bb9cb\") " Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.368215 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c208d4fa-e4f3-4985-9a3a-571eb08bb9cb-catalog-content\") pod \"c208d4fa-e4f3-4985-9a3a-571eb08bb9cb\" (UID: \"c208d4fa-e4f3-4985-9a3a-571eb08bb9cb\") " Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.369075 4841 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c208d4fa-e4f3-4985-9a3a-571eb08bb9cb-utilities" (OuterVolumeSpecName: "utilities") pod "c208d4fa-e4f3-4985-9a3a-571eb08bb9cb" (UID: "c208d4fa-e4f3-4985-9a3a-571eb08bb9cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.374037 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c208d4fa-e4f3-4985-9a3a-571eb08bb9cb-kube-api-access-9vk7f" (OuterVolumeSpecName: "kube-api-access-9vk7f") pod "c208d4fa-e4f3-4985-9a3a-571eb08bb9cb" (UID: "c208d4fa-e4f3-4985-9a3a-571eb08bb9cb"). InnerVolumeSpecName "kube-api-access-9vk7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.470354 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c208d4fa-e4f3-4985-9a3a-571eb08bb9cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c208d4fa-e4f3-4985-9a3a-571eb08bb9cb" (UID: "c208d4fa-e4f3-4985-9a3a-571eb08bb9cb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.470837 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c208d4fa-e4f3-4985-9a3a-571eb08bb9cb-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.470876 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vk7f\" (UniqueName: \"kubernetes.io/projected/c208d4fa-e4f3-4985-9a3a-571eb08bb9cb-kube-api-access-9vk7f\") on node \"crc\" DevicePath \"\"" Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.470890 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c208d4fa-e4f3-4985-9a3a-571eb08bb9cb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.841544 4841 generic.go:334] "Generic (PLEG): container finished" podID="c208d4fa-e4f3-4985-9a3a-571eb08bb9cb" containerID="7b48b3be7658a0228cd543439ddd97087235f06b05460e9287e97554c53ff285" exitCode=0 Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.841586 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zsn2m" event={"ID":"c208d4fa-e4f3-4985-9a3a-571eb08bb9cb","Type":"ContainerDied","Data":"7b48b3be7658a0228cd543439ddd97087235f06b05460e9287e97554c53ff285"} Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.841619 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zsn2m" event={"ID":"c208d4fa-e4f3-4985-9a3a-571eb08bb9cb","Type":"ContainerDied","Data":"23663bad70b75e4cfd508a428591be76073957f48bd0a2c87d162b5cd337392c"} Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.841642 4841 scope.go:117] "RemoveContainer" containerID="7b48b3be7658a0228cd543439ddd97087235f06b05460e9287e97554c53ff285" Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 
09:38:10.841691 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zsn2m" Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.870140 4841 scope.go:117] "RemoveContainer" containerID="7d5c495e8e3a9dc2a40f102d653ad25a0acb267820ae36bae8d58e0028f6640e" Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.894168 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zsn2m"] Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.904443 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zsn2m"] Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.909576 4841 scope.go:117] "RemoveContainer" containerID="87ba8f99349c58cba5d8c80d3c5390dd47165393b2e4540caf997fc27b377e4b" Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.970000 4841 scope.go:117] "RemoveContainer" containerID="7b48b3be7658a0228cd543439ddd97087235f06b05460e9287e97554c53ff285" Mar 13 09:38:10 crc kubenswrapper[4841]: E0313 09:38:10.970445 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b48b3be7658a0228cd543439ddd97087235f06b05460e9287e97554c53ff285\": container with ID starting with 7b48b3be7658a0228cd543439ddd97087235f06b05460e9287e97554c53ff285 not found: ID does not exist" containerID="7b48b3be7658a0228cd543439ddd97087235f06b05460e9287e97554c53ff285" Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.970479 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b48b3be7658a0228cd543439ddd97087235f06b05460e9287e97554c53ff285"} err="failed to get container status \"7b48b3be7658a0228cd543439ddd97087235f06b05460e9287e97554c53ff285\": rpc error: code = NotFound desc = could not find container \"7b48b3be7658a0228cd543439ddd97087235f06b05460e9287e97554c53ff285\": container with ID starting with 
7b48b3be7658a0228cd543439ddd97087235f06b05460e9287e97554c53ff285 not found: ID does not exist" Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.970502 4841 scope.go:117] "RemoveContainer" containerID="7d5c495e8e3a9dc2a40f102d653ad25a0acb267820ae36bae8d58e0028f6640e" Mar 13 09:38:10 crc kubenswrapper[4841]: E0313 09:38:10.970872 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d5c495e8e3a9dc2a40f102d653ad25a0acb267820ae36bae8d58e0028f6640e\": container with ID starting with 7d5c495e8e3a9dc2a40f102d653ad25a0acb267820ae36bae8d58e0028f6640e not found: ID does not exist" containerID="7d5c495e8e3a9dc2a40f102d653ad25a0acb267820ae36bae8d58e0028f6640e" Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.970911 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d5c495e8e3a9dc2a40f102d653ad25a0acb267820ae36bae8d58e0028f6640e"} err="failed to get container status \"7d5c495e8e3a9dc2a40f102d653ad25a0acb267820ae36bae8d58e0028f6640e\": rpc error: code = NotFound desc = could not find container \"7d5c495e8e3a9dc2a40f102d653ad25a0acb267820ae36bae8d58e0028f6640e\": container with ID starting with 7d5c495e8e3a9dc2a40f102d653ad25a0acb267820ae36bae8d58e0028f6640e not found: ID does not exist" Mar 13 09:38:10 crc kubenswrapper[4841]: I0313 09:38:10.970939 4841 scope.go:117] "RemoveContainer" containerID="87ba8f99349c58cba5d8c80d3c5390dd47165393b2e4540caf997fc27b377e4b" Mar 13 09:38:10 crc kubenswrapper[4841]: E0313 09:38:10.971178 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87ba8f99349c58cba5d8c80d3c5390dd47165393b2e4540caf997fc27b377e4b\": container with ID starting with 87ba8f99349c58cba5d8c80d3c5390dd47165393b2e4540caf997fc27b377e4b not found: ID does not exist" containerID="87ba8f99349c58cba5d8c80d3c5390dd47165393b2e4540caf997fc27b377e4b" Mar 13 09:38:10 crc 
kubenswrapper[4841]: I0313 09:38:10.971204 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87ba8f99349c58cba5d8c80d3c5390dd47165393b2e4540caf997fc27b377e4b"} err="failed to get container status \"87ba8f99349c58cba5d8c80d3c5390dd47165393b2e4540caf997fc27b377e4b\": rpc error: code = NotFound desc = could not find container \"87ba8f99349c58cba5d8c80d3c5390dd47165393b2e4540caf997fc27b377e4b\": container with ID starting with 87ba8f99349c58cba5d8c80d3c5390dd47165393b2e4540caf997fc27b377e4b not found: ID does not exist" Mar 13 09:38:12 crc kubenswrapper[4841]: I0313 09:38:12.021225 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c208d4fa-e4f3-4985-9a3a-571eb08bb9cb" path="/var/lib/kubelet/pods/c208d4fa-e4f3-4985-9a3a-571eb08bb9cb/volumes" Mar 13 09:38:34 crc kubenswrapper[4841]: I0313 09:38:34.407772 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:38:34 crc kubenswrapper[4841]: I0313 09:38:34.408285 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:38:34 crc kubenswrapper[4841]: I0313 09:38:34.408330 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:38:34 crc kubenswrapper[4841]: I0313 09:38:34.409048 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab"} pod="openshift-machine-config-operator/machine-config-daemon-h227v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 09:38:34 crc kubenswrapper[4841]: I0313 09:38:34.409104 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" containerID="cri-o://61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" gracePeriod=600 Mar 13 09:38:35 crc kubenswrapper[4841]: E0313 09:38:35.035634 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:38:35 crc kubenswrapper[4841]: I0313 09:38:35.131091 4841 generic.go:334] "Generic (PLEG): container finished" podID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" exitCode=0 Mar 13 09:38:35 crc kubenswrapper[4841]: I0313 09:38:35.131141 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerDied","Data":"61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab"} Mar 13 09:38:35 crc kubenswrapper[4841]: I0313 09:38:35.131186 4841 scope.go:117] "RemoveContainer" containerID="775ebab3cf7b982d36c777cc0cdaea2069ca71dd3ee3f41b99a1b2505417aae0" Mar 13 09:38:35 crc kubenswrapper[4841]: I0313 09:38:35.132185 4841 
scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:38:35 crc kubenswrapper[4841]: E0313 09:38:35.132804 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:38:49 crc kubenswrapper[4841]: I0313 09:38:49.994948 4841 scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:38:49 crc kubenswrapper[4841]: E0313 09:38:49.996231 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:39:00 crc kubenswrapper[4841]: I0313 09:39:00.906309 4841 scope.go:117] "RemoveContainer" containerID="66bcc31aa9c1d0a7b9b151675fa339171f26d1c7fb743676512bbe50a5373c3f" Mar 13 09:39:00 crc kubenswrapper[4841]: I0313 09:39:00.987182 4841 scope.go:117] "RemoveContainer" containerID="4d5adf299cfa5d67f73de4fe6cbdc637423680eeefaa47d2efde069d260da422" Mar 13 09:39:01 crc kubenswrapper[4841]: I0313 09:39:01.015711 4841 scope.go:117] "RemoveContainer" containerID="3063274d726b3acaccdf0b261116fbc7042e9b251eed4b00e63d2cf2ceb19f9f" Mar 13 09:39:01 crc kubenswrapper[4841]: I0313 09:39:01.995098 4841 scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 
09:39:01 crc kubenswrapper[4841]: E0313 09:39:01.995728 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:39:14 crc kubenswrapper[4841]: I0313 09:39:14.996804 4841 scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:39:14 crc kubenswrapper[4841]: E0313 09:39:14.997988 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:39:28 crc kubenswrapper[4841]: I0313 09:39:28.995912 4841 scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:39:28 crc kubenswrapper[4841]: E0313 09:39:28.996850 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:39:39 crc kubenswrapper[4841]: I0313 09:39:39.995130 4841 scope.go:117] "RemoveContainer" 
containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:39:39 crc kubenswrapper[4841]: E0313 09:39:39.996217 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:39:50 crc kubenswrapper[4841]: I0313 09:39:50.996023 4841 scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:39:50 crc kubenswrapper[4841]: E0313 09:39:50.997549 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:40:00 crc kubenswrapper[4841]: I0313 09:40:00.156914 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556580-t2827"] Mar 13 09:40:00 crc kubenswrapper[4841]: E0313 09:40:00.158141 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c208d4fa-e4f3-4985-9a3a-571eb08bb9cb" containerName="extract-content" Mar 13 09:40:00 crc kubenswrapper[4841]: I0313 09:40:00.158159 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c208d4fa-e4f3-4985-9a3a-571eb08bb9cb" containerName="extract-content" Mar 13 09:40:00 crc kubenswrapper[4841]: E0313 09:40:00.158186 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c208d4fa-e4f3-4985-9a3a-571eb08bb9cb" 
containerName="extract-utilities" Mar 13 09:40:00 crc kubenswrapper[4841]: I0313 09:40:00.158195 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c208d4fa-e4f3-4985-9a3a-571eb08bb9cb" containerName="extract-utilities" Mar 13 09:40:00 crc kubenswrapper[4841]: E0313 09:40:00.158217 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c208d4fa-e4f3-4985-9a3a-571eb08bb9cb" containerName="registry-server" Mar 13 09:40:00 crc kubenswrapper[4841]: I0313 09:40:00.158225 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c208d4fa-e4f3-4985-9a3a-571eb08bb9cb" containerName="registry-server" Mar 13 09:40:00 crc kubenswrapper[4841]: E0313 09:40:00.158253 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6acf170a-aaee-446b-be41-ababb7b80365" containerName="oc" Mar 13 09:40:00 crc kubenswrapper[4841]: I0313 09:40:00.158287 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="6acf170a-aaee-446b-be41-ababb7b80365" containerName="oc" Mar 13 09:40:00 crc kubenswrapper[4841]: I0313 09:40:00.158510 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c208d4fa-e4f3-4985-9a3a-571eb08bb9cb" containerName="registry-server" Mar 13 09:40:00 crc kubenswrapper[4841]: I0313 09:40:00.158525 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="6acf170a-aaee-446b-be41-ababb7b80365" containerName="oc" Mar 13 09:40:00 crc kubenswrapper[4841]: I0313 09:40:00.159309 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556580-t2827" Mar 13 09:40:00 crc kubenswrapper[4841]: I0313 09:40:00.163743 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 09:40:00 crc kubenswrapper[4841]: I0313 09:40:00.166217 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 09:40:00 crc kubenswrapper[4841]: I0313 09:40:00.166263 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 09:40:00 crc kubenswrapper[4841]: I0313 09:40:00.168865 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556580-t2827"] Mar 13 09:40:00 crc kubenswrapper[4841]: I0313 09:40:00.236103 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnspg\" (UniqueName: \"kubernetes.io/projected/30c0eb23-5b61-4053-bd04-3948d2c3eb16-kube-api-access-dnspg\") pod \"auto-csr-approver-29556580-t2827\" (UID: \"30c0eb23-5b61-4053-bd04-3948d2c3eb16\") " pod="openshift-infra/auto-csr-approver-29556580-t2827" Mar 13 09:40:00 crc kubenswrapper[4841]: I0313 09:40:00.338843 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnspg\" (UniqueName: \"kubernetes.io/projected/30c0eb23-5b61-4053-bd04-3948d2c3eb16-kube-api-access-dnspg\") pod \"auto-csr-approver-29556580-t2827\" (UID: \"30c0eb23-5b61-4053-bd04-3948d2c3eb16\") " pod="openshift-infra/auto-csr-approver-29556580-t2827" Mar 13 09:40:00 crc kubenswrapper[4841]: I0313 09:40:00.364468 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnspg\" (UniqueName: \"kubernetes.io/projected/30c0eb23-5b61-4053-bd04-3948d2c3eb16-kube-api-access-dnspg\") pod \"auto-csr-approver-29556580-t2827\" (UID: \"30c0eb23-5b61-4053-bd04-3948d2c3eb16\") " 
pod="openshift-infra/auto-csr-approver-29556580-t2827" Mar 13 09:40:00 crc kubenswrapper[4841]: I0313 09:40:00.482913 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556580-t2827" Mar 13 09:40:00 crc kubenswrapper[4841]: I0313 09:40:00.964318 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556580-t2827"] Mar 13 09:40:01 crc kubenswrapper[4841]: I0313 09:40:01.043549 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556580-t2827" event={"ID":"30c0eb23-5b61-4053-bd04-3948d2c3eb16","Type":"ContainerStarted","Data":"be6055248819575828fb495cf523c8f209f6a276670ff40b9fdf2fab03f2c8a2"} Mar 13 09:40:01 crc kubenswrapper[4841]: I0313 09:40:01.117735 4841 scope.go:117] "RemoveContainer" containerID="0bbf134e0b93cea1efe344431835ec6357d0c9c7cfe326ef5858d3182d6c239f" Mar 13 09:40:01 crc kubenswrapper[4841]: I0313 09:40:01.151031 4841 scope.go:117] "RemoveContainer" containerID="2ade612b8fb47bd7d9268a94de2da473e20ea71789bfe440788b6e77ec7c883c" Mar 13 09:40:03 crc kubenswrapper[4841]: I0313 09:40:03.078582 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556580-t2827" event={"ID":"30c0eb23-5b61-4053-bd04-3948d2c3eb16","Type":"ContainerStarted","Data":"716229f6ba66e4a5554ec47fc240cc1b304fa03725dc768f4f939f1150ab0220"} Mar 13 09:40:03 crc kubenswrapper[4841]: I0313 09:40:03.101148 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556580-t2827" podStartSLOduration=1.69189433 podStartE2EDuration="3.101120299s" podCreationTimestamp="2026-03-13 09:40:00 +0000 UTC" firstStartedPulling="2026-03-13 09:40:00.965563228 +0000 UTC m=+1683.695463419" lastFinishedPulling="2026-03-13 09:40:02.374789197 +0000 UTC m=+1685.104689388" observedRunningTime="2026-03-13 09:40:03.096235398 +0000 UTC m=+1685.826135629" 
watchObservedRunningTime="2026-03-13 09:40:03.101120299 +0000 UTC m=+1685.831020520" Mar 13 09:40:04 crc kubenswrapper[4841]: I0313 09:40:04.094523 4841 generic.go:334] "Generic (PLEG): container finished" podID="30c0eb23-5b61-4053-bd04-3948d2c3eb16" containerID="716229f6ba66e4a5554ec47fc240cc1b304fa03725dc768f4f939f1150ab0220" exitCode=0 Mar 13 09:40:04 crc kubenswrapper[4841]: I0313 09:40:04.094573 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556580-t2827" event={"ID":"30c0eb23-5b61-4053-bd04-3948d2c3eb16","Type":"ContainerDied","Data":"716229f6ba66e4a5554ec47fc240cc1b304fa03725dc768f4f939f1150ab0220"} Mar 13 09:40:04 crc kubenswrapper[4841]: I0313 09:40:04.995233 4841 scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:40:04 crc kubenswrapper[4841]: E0313 09:40:04.995762 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:40:05 crc kubenswrapper[4841]: I0313 09:40:05.529063 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556580-t2827" Mar 13 09:40:05 crc kubenswrapper[4841]: I0313 09:40:05.647616 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnspg\" (UniqueName: \"kubernetes.io/projected/30c0eb23-5b61-4053-bd04-3948d2c3eb16-kube-api-access-dnspg\") pod \"30c0eb23-5b61-4053-bd04-3948d2c3eb16\" (UID: \"30c0eb23-5b61-4053-bd04-3948d2c3eb16\") " Mar 13 09:40:05 crc kubenswrapper[4841]: I0313 09:40:05.656868 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30c0eb23-5b61-4053-bd04-3948d2c3eb16-kube-api-access-dnspg" (OuterVolumeSpecName: "kube-api-access-dnspg") pod "30c0eb23-5b61-4053-bd04-3948d2c3eb16" (UID: "30c0eb23-5b61-4053-bd04-3948d2c3eb16"). InnerVolumeSpecName "kube-api-access-dnspg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:40:05 crc kubenswrapper[4841]: I0313 09:40:05.750608 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnspg\" (UniqueName: \"kubernetes.io/projected/30c0eb23-5b61-4053-bd04-3948d2c3eb16-kube-api-access-dnspg\") on node \"crc\" DevicePath \"\"" Mar 13 09:40:06 crc kubenswrapper[4841]: I0313 09:40:06.122230 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556580-t2827" event={"ID":"30c0eb23-5b61-4053-bd04-3948d2c3eb16","Type":"ContainerDied","Data":"be6055248819575828fb495cf523c8f209f6a276670ff40b9fdf2fab03f2c8a2"} Mar 13 09:40:06 crc kubenswrapper[4841]: I0313 09:40:06.122305 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be6055248819575828fb495cf523c8f209f6a276670ff40b9fdf2fab03f2c8a2" Mar 13 09:40:06 crc kubenswrapper[4841]: I0313 09:40:06.123328 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556580-t2827" Mar 13 09:40:06 crc kubenswrapper[4841]: I0313 09:40:06.179838 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556574-cvgl5"] Mar 13 09:40:06 crc kubenswrapper[4841]: I0313 09:40:06.190414 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556574-cvgl5"] Mar 13 09:40:08 crc kubenswrapper[4841]: I0313 09:40:08.007483 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acb9502e-cdc5-4a7a-8faa-fe060876b3f2" path="/var/lib/kubelet/pods/acb9502e-cdc5-4a7a-8faa-fe060876b3f2/volumes" Mar 13 09:40:15 crc kubenswrapper[4841]: I0313 09:40:15.996948 4841 scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:40:16 crc kubenswrapper[4841]: E0313 09:40:16.000109 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:40:30 crc kubenswrapper[4841]: I0313 09:40:30.995317 4841 scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:40:30 crc kubenswrapper[4841]: E0313 09:40:30.997229 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" 
podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:40:34 crc kubenswrapper[4841]: I0313 09:40:34.437040 4841 generic.go:334] "Generic (PLEG): container finished" podID="c3c0bc1a-b192-44f6-a237-9242d36513ce" containerID="3ac9f3a2499944dc2618f216721ede7a97497cc62367b485843d62ed6b6fe018" exitCode=0 Mar 13 09:40:34 crc kubenswrapper[4841]: I0313 09:40:34.437157 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6" event={"ID":"c3c0bc1a-b192-44f6-a237-9242d36513ce","Type":"ContainerDied","Data":"3ac9f3a2499944dc2618f216721ede7a97497cc62367b485843d62ed6b6fe018"} Mar 13 09:40:35 crc kubenswrapper[4841]: I0313 09:40:35.903506 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.009233 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3c0bc1a-b192-44f6-a237-9242d36513ce-inventory\") pod \"c3c0bc1a-b192-44f6-a237-9242d36513ce\" (UID: \"c3c0bc1a-b192-44f6-a237-9242d36513ce\") " Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.009693 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmdv5\" (UniqueName: \"kubernetes.io/projected/c3c0bc1a-b192-44f6-a237-9242d36513ce-kube-api-access-wmdv5\") pod \"c3c0bc1a-b192-44f6-a237-9242d36513ce\" (UID: \"c3c0bc1a-b192-44f6-a237-9242d36513ce\") " Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.009736 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c3c0bc1a-b192-44f6-a237-9242d36513ce-ssh-key-openstack-edpm-ipam\") pod \"c3c0bc1a-b192-44f6-a237-9242d36513ce\" (UID: \"c3c0bc1a-b192-44f6-a237-9242d36513ce\") " Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.009834 
4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3c0bc1a-b192-44f6-a237-9242d36513ce-bootstrap-combined-ca-bundle\") pod \"c3c0bc1a-b192-44f6-a237-9242d36513ce\" (UID: \"c3c0bc1a-b192-44f6-a237-9242d36513ce\") " Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.017790 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3c0bc1a-b192-44f6-a237-9242d36513ce-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "c3c0bc1a-b192-44f6-a237-9242d36513ce" (UID: "c3c0bc1a-b192-44f6-a237-9242d36513ce"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.030445 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3c0bc1a-b192-44f6-a237-9242d36513ce-kube-api-access-wmdv5" (OuterVolumeSpecName: "kube-api-access-wmdv5") pod "c3c0bc1a-b192-44f6-a237-9242d36513ce" (UID: "c3c0bc1a-b192-44f6-a237-9242d36513ce"). InnerVolumeSpecName "kube-api-access-wmdv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.050633 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3c0bc1a-b192-44f6-a237-9242d36513ce-inventory" (OuterVolumeSpecName: "inventory") pod "c3c0bc1a-b192-44f6-a237-9242d36513ce" (UID: "c3c0bc1a-b192-44f6-a237-9242d36513ce"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.056351 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3c0bc1a-b192-44f6-a237-9242d36513ce-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c3c0bc1a-b192-44f6-a237-9242d36513ce" (UID: "c3c0bc1a-b192-44f6-a237-9242d36513ce"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.112967 4841 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3c0bc1a-b192-44f6-a237-9242d36513ce-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.113006 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3c0bc1a-b192-44f6-a237-9242d36513ce-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.113018 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmdv5\" (UniqueName: \"kubernetes.io/projected/c3c0bc1a-b192-44f6-a237-9242d36513ce-kube-api-access-wmdv5\") on node \"crc\" DevicePath \"\"" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.113029 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c3c0bc1a-b192-44f6-a237-9242d36513ce-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.459950 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6" event={"ID":"c3c0bc1a-b192-44f6-a237-9242d36513ce","Type":"ContainerDied","Data":"885fce84d177ff1dae3e36161f61e43d3972c20d9eb3a9c19c88e03492df62a1"} Mar 13 09:40:36 
crc kubenswrapper[4841]: I0313 09:40:36.459992 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="885fce84d177ff1dae3e36161f61e43d3972c20d9eb3a9c19c88e03492df62a1" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.460058 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.552625 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk"] Mar 13 09:40:36 crc kubenswrapper[4841]: E0313 09:40:36.553013 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3c0bc1a-b192-44f6-a237-9242d36513ce" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.553029 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3c0bc1a-b192-44f6-a237-9242d36513ce" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 13 09:40:36 crc kubenswrapper[4841]: E0313 09:40:36.553045 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c0eb23-5b61-4053-bd04-3948d2c3eb16" containerName="oc" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.553052 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c0eb23-5b61-4053-bd04-3948d2c3eb16" containerName="oc" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.553235 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="30c0eb23-5b61-4053-bd04-3948d2c3eb16" containerName="oc" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.553255 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3c0bc1a-b192-44f6-a237-9242d36513ce" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.553884 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.556241 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.556993 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rv6jt" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.557092 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.560820 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.565060 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk"] Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.723593 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/511f249d-a5ba-4a19-a5b6-16b5c75fe538-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk\" (UID: \"511f249d-a5ba-4a19-a5b6-16b5c75fe538\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.723799 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87sfl\" (UniqueName: \"kubernetes.io/projected/511f249d-a5ba-4a19-a5b6-16b5c75fe538-kube-api-access-87sfl\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk\" (UID: \"511f249d-a5ba-4a19-a5b6-16b5c75fe538\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 
09:40:36.723879 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/511f249d-a5ba-4a19-a5b6-16b5c75fe538-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk\" (UID: \"511f249d-a5ba-4a19-a5b6-16b5c75fe538\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.825780 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87sfl\" (UniqueName: \"kubernetes.io/projected/511f249d-a5ba-4a19-a5b6-16b5c75fe538-kube-api-access-87sfl\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk\" (UID: \"511f249d-a5ba-4a19-a5b6-16b5c75fe538\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.825825 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/511f249d-a5ba-4a19-a5b6-16b5c75fe538-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk\" (UID: \"511f249d-a5ba-4a19-a5b6-16b5c75fe538\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.825957 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/511f249d-a5ba-4a19-a5b6-16b5c75fe538-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk\" (UID: \"511f249d-a5ba-4a19-a5b6-16b5c75fe538\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.829230 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/511f249d-a5ba-4a19-a5b6-16b5c75fe538-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk\" (UID: \"511f249d-a5ba-4a19-a5b6-16b5c75fe538\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.831772 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/511f249d-a5ba-4a19-a5b6-16b5c75fe538-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk\" (UID: \"511f249d-a5ba-4a19-a5b6-16b5c75fe538\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.845678 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87sfl\" (UniqueName: \"kubernetes.io/projected/511f249d-a5ba-4a19-a5b6-16b5c75fe538-kube-api-access-87sfl\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk\" (UID: \"511f249d-a5ba-4a19-a5b6-16b5c75fe538\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk" Mar 13 09:40:36 crc kubenswrapper[4841]: I0313 09:40:36.880334 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk" Mar 13 09:40:37 crc kubenswrapper[4841]: I0313 09:40:37.538966 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk"] Mar 13 09:40:38 crc kubenswrapper[4841]: I0313 09:40:38.482253 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk" event={"ID":"511f249d-a5ba-4a19-a5b6-16b5c75fe538","Type":"ContainerStarted","Data":"918013528b9d345bf5392fde29e9f4e37bbd54b2a5e5f22a309b6f69d0c1f7a7"} Mar 13 09:40:38 crc kubenswrapper[4841]: I0313 09:40:38.482515 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk" event={"ID":"511f249d-a5ba-4a19-a5b6-16b5c75fe538","Type":"ContainerStarted","Data":"396dba008ce594afc8e36bf3a8a251bc842284f6b97c808b788c1fea05895394"} Mar 13 09:40:38 crc kubenswrapper[4841]: I0313 09:40:38.499582 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk" podStartSLOduration=1.972755783 podStartE2EDuration="2.499555963s" podCreationTimestamp="2026-03-13 09:40:36 +0000 UTC" firstStartedPulling="2026-03-13 09:40:37.528704182 +0000 UTC m=+1720.258604363" lastFinishedPulling="2026-03-13 09:40:38.055504352 +0000 UTC m=+1720.785404543" observedRunningTime="2026-03-13 09:40:38.496995415 +0000 UTC m=+1721.226895606" watchObservedRunningTime="2026-03-13 09:40:38.499555963 +0000 UTC m=+1721.229456184" Mar 13 09:40:42 crc kubenswrapper[4841]: I0313 09:40:42.995595 4841 scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:40:42 crc kubenswrapper[4841]: E0313 09:40:42.996406 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:40:50 crc kubenswrapper[4841]: I0313 09:40:50.044557 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-x4tz7"] Mar 13 09:40:50 crc kubenswrapper[4841]: I0313 09:40:50.054573 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-9d81-account-create-update-kddpx"] Mar 13 09:40:50 crc kubenswrapper[4841]: I0313 09:40:50.066622 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-ptsql"] Mar 13 09:40:50 crc kubenswrapper[4841]: I0313 09:40:50.077120 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-x4tz7"] Mar 13 09:40:50 crc kubenswrapper[4841]: I0313 09:40:50.085282 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3db9-account-create-update-8psnc"] Mar 13 09:40:50 crc kubenswrapper[4841]: I0313 09:40:50.092879 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-9d81-account-create-update-kddpx"] Mar 13 09:40:50 crc kubenswrapper[4841]: I0313 09:40:50.100525 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-ptsql"] Mar 13 09:40:50 crc kubenswrapper[4841]: I0313 09:40:50.107995 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3db9-account-create-update-8psnc"] Mar 13 09:40:52 crc kubenswrapper[4841]: I0313 09:40:52.012216 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e1eddcd-d966-4a33-9b1e-639b3f5ebca7" path="/var/lib/kubelet/pods/3e1eddcd-d966-4a33-9b1e-639b3f5ebca7/volumes" Mar 13 09:40:52 crc kubenswrapper[4841]: I0313 09:40:52.014834 4841 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="7bc49445-1a38-4ac6-91c8-822d298116a3" path="/var/lib/kubelet/pods/7bc49445-1a38-4ac6-91c8-822d298116a3/volumes" Mar 13 09:40:52 crc kubenswrapper[4841]: I0313 09:40:52.016736 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc60d2f3-4d25-4e39-b7eb-70fb4e750774" path="/var/lib/kubelet/pods/bc60d2f3-4d25-4e39-b7eb-70fb4e750774/volumes" Mar 13 09:40:52 crc kubenswrapper[4841]: I0313 09:40:52.018727 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f59d95da-19a9-42b3-b2f7-ca255806fabb" path="/var/lib/kubelet/pods/f59d95da-19a9-42b3-b2f7-ca255806fabb/volumes" Mar 13 09:40:55 crc kubenswrapper[4841]: I0313 09:40:55.035145 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-dqccr"] Mar 13 09:40:55 crc kubenswrapper[4841]: I0313 09:40:55.045860 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-dqccr"] Mar 13 09:40:55 crc kubenswrapper[4841]: I0313 09:40:55.996076 4841 scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:40:55 crc kubenswrapper[4841]: E0313 09:40:55.996827 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:40:56 crc kubenswrapper[4841]: I0313 09:40:56.012040 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78fb8b1f-a395-4430-a2db-267939774965" path="/var/lib/kubelet/pods/78fb8b1f-a395-4430-a2db-267939774965/volumes" Mar 13 09:40:56 crc kubenswrapper[4841]: I0313 09:40:56.033704 4841 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-5e0a-account-create-update-tk6x7"] Mar 13 09:40:56 crc kubenswrapper[4841]: I0313 09:40:56.055282 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-78p58"] Mar 13 09:40:56 crc kubenswrapper[4841]: I0313 09:40:56.065982 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-5e0a-account-create-update-tk6x7"] Mar 13 09:40:56 crc kubenswrapper[4841]: I0313 09:40:56.075430 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-78p58"] Mar 13 09:40:58 crc kubenswrapper[4841]: I0313 09:40:58.023419 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ae264c3-b8e8-432e-9ee8-8ff04e30aac5" path="/var/lib/kubelet/pods/9ae264c3-b8e8-432e-9ee8-8ff04e30aac5/volumes" Mar 13 09:40:58 crc kubenswrapper[4841]: I0313 09:40:58.024909 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f18b3ca4-294d-46f3-8b5b-c5e297bf58fe" path="/var/lib/kubelet/pods/f18b3ca4-294d-46f3-8b5b-c5e297bf58fe/volumes" Mar 13 09:41:01 crc kubenswrapper[4841]: I0313 09:41:01.224033 4841 scope.go:117] "RemoveContainer" containerID="70740dcd9265a9aa862721c88f0efad402c1d92f5d10f51f4e6782e9c68da467" Mar 13 09:41:01 crc kubenswrapper[4841]: I0313 09:41:01.261411 4841 scope.go:117] "RemoveContainer" containerID="bd33c110c3f9a3d4d4a1f5614a7a700d7b5e2733c8d70d29dd581468d7573150" Mar 13 09:41:01 crc kubenswrapper[4841]: I0313 09:41:01.333121 4841 scope.go:117] "RemoveContainer" containerID="0c7ab363a1a3ee30521c07bd56f5e8a1ed231772685b2c16cf0ba4d2c4e08f46" Mar 13 09:41:01 crc kubenswrapper[4841]: I0313 09:41:01.360827 4841 scope.go:117] "RemoveContainer" containerID="848fba9a8bda88cf73d02a4d99686a491503389ddbfda3664c89191cddea4f79" Mar 13 09:41:01 crc kubenswrapper[4841]: I0313 09:41:01.403584 4841 scope.go:117] "RemoveContainer" containerID="36f959254d7c4fccd4f4c8a169a57252925769952d5640801c2617d5d961f03a" Mar 13 
09:41:01 crc kubenswrapper[4841]: I0313 09:41:01.448355 4841 scope.go:117] "RemoveContainer" containerID="c62cdd2b0afd7ccff63f60a98d18f0cf82cee5e50e25e0d75f70ec3af33af351" Mar 13 09:41:01 crc kubenswrapper[4841]: I0313 09:41:01.508822 4841 scope.go:117] "RemoveContainer" containerID="0d995a6d118ccc6959f81ebfa6bca3ac21b379ee754eab4360c291f96d1c8f9c" Mar 13 09:41:01 crc kubenswrapper[4841]: I0313 09:41:01.538481 4841 scope.go:117] "RemoveContainer" containerID="12a87739800af3c8c6623ffe0a1a97e415d15d40e1938f65fa3b03c2da0012ea" Mar 13 09:41:10 crc kubenswrapper[4841]: I0313 09:41:10.994705 4841 scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:41:10 crc kubenswrapper[4841]: E0313 09:41:10.995514 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:41:21 crc kubenswrapper[4841]: I0313 09:41:21.060776 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-wkjjs"] Mar 13 09:41:21 crc kubenswrapper[4841]: I0313 09:41:21.071489 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3885-account-create-update-5rctz"] Mar 13 09:41:21 crc kubenswrapper[4841]: I0313 09:41:21.084041 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-cd2cd"] Mar 13 09:41:21 crc kubenswrapper[4841]: I0313 09:41:21.094385 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-3885-account-create-update-5rctz"] Mar 13 09:41:21 crc kubenswrapper[4841]: I0313 09:41:21.103502 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/heat-db-create-cd2cd"] Mar 13 09:41:21 crc kubenswrapper[4841]: I0313 09:41:21.111034 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-xx9s5"] Mar 13 09:41:21 crc kubenswrapper[4841]: I0313 09:41:21.118421 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-wkjjs"] Mar 13 09:41:21 crc kubenswrapper[4841]: I0313 09:41:21.125426 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-1811-account-create-update-cx59z"] Mar 13 09:41:21 crc kubenswrapper[4841]: I0313 09:41:21.134231 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-0830-account-create-update-8wtfg"] Mar 13 09:41:21 crc kubenswrapper[4841]: I0313 09:41:21.142391 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-xx9s5"] Mar 13 09:41:21 crc kubenswrapper[4841]: I0313 09:41:21.149505 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-1811-account-create-update-cx59z"] Mar 13 09:41:21 crc kubenswrapper[4841]: I0313 09:41:21.156247 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-a114-account-create-update-cx7pv"] Mar 13 09:41:21 crc kubenswrapper[4841]: I0313 09:41:21.163384 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-a114-account-create-update-cx7pv"] Mar 13 09:41:21 crc kubenswrapper[4841]: I0313 09:41:21.171046 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-0830-account-create-update-8wtfg"] Mar 13 09:41:21 crc kubenswrapper[4841]: I0313 09:41:21.177940 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-j4ks6"] Mar 13 09:41:21 crc kubenswrapper[4841]: I0313 09:41:21.185232 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-j4ks6"] Mar 13 09:41:22 crc kubenswrapper[4841]: I0313 09:41:22.015637 4841 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="1dc06182-654b-4744-ac91-42013c901989" path="/var/lib/kubelet/pods/1dc06182-654b-4744-ac91-42013c901989/volumes" Mar 13 09:41:22 crc kubenswrapper[4841]: I0313 09:41:22.017891 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2448d2f1-6d19-4ef7-8df3-afab14941187" path="/var/lib/kubelet/pods/2448d2f1-6d19-4ef7-8df3-afab14941187/volumes" Mar 13 09:41:22 crc kubenswrapper[4841]: I0313 09:41:22.020091 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8" path="/var/lib/kubelet/pods/3edaaed4-30e7-4ac7-9a22-5e2c9dd078d8/volumes" Mar 13 09:41:22 crc kubenswrapper[4841]: I0313 09:41:22.024696 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54329c28-2ae6-4b02-8b91-b182ef4e0e23" path="/var/lib/kubelet/pods/54329c28-2ae6-4b02-8b91-b182ef4e0e23/volumes" Mar 13 09:41:22 crc kubenswrapper[4841]: I0313 09:41:22.027741 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b564622-1b7c-4e13-9126-e68d7b0ad6fa" path="/var/lib/kubelet/pods/7b564622-1b7c-4e13-9126-e68d7b0ad6fa/volumes" Mar 13 09:41:22 crc kubenswrapper[4841]: I0313 09:41:22.029785 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9834c195-9fa4-4052-b502-85d9992415c5" path="/var/lib/kubelet/pods/9834c195-9fa4-4052-b502-85d9992415c5/volumes" Mar 13 09:41:22 crc kubenswrapper[4841]: I0313 09:41:22.032361 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca7a566b-2ed4-4044-a243-2074a5dcad72" path="/var/lib/kubelet/pods/ca7a566b-2ed4-4044-a243-2074a5dcad72/volumes" Mar 13 09:41:22 crc kubenswrapper[4841]: I0313 09:41:22.037919 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de1be705-392c-4454-81e1-2267d10d1535" path="/var/lib/kubelet/pods/de1be705-392c-4454-81e1-2267d10d1535/volumes" Mar 13 09:41:23 crc kubenswrapper[4841]: I0313 09:41:23.999420 4841 scope.go:117] "RemoveContainer" 
containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:41:24 crc kubenswrapper[4841]: E0313 09:41:24.000547 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:41:24 crc kubenswrapper[4841]: I0313 09:41:24.040673 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-b95pq"] Mar 13 09:41:24 crc kubenswrapper[4841]: I0313 09:41:24.055414 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-b95pq"] Mar 13 09:41:26 crc kubenswrapper[4841]: I0313 09:41:26.010540 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ef41142-9432-4e66-9008-a3c1ff35e9a8" path="/var/lib/kubelet/pods/0ef41142-9432-4e66-9008-a3c1ff35e9a8/volumes" Mar 13 09:41:27 crc kubenswrapper[4841]: I0313 09:41:27.032909 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-4kxn8"] Mar 13 09:41:27 crc kubenswrapper[4841]: I0313 09:41:27.043855 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-4kxn8"] Mar 13 09:41:28 crc kubenswrapper[4841]: I0313 09:41:28.016737 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4000c1ec-fd5e-4449-be32-cc39edbf5d10" path="/var/lib/kubelet/pods/4000c1ec-fd5e-4449-be32-cc39edbf5d10/volumes" Mar 13 09:41:38 crc kubenswrapper[4841]: I0313 09:41:38.021474 4841 scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:41:38 crc kubenswrapper[4841]: E0313 09:41:38.023814 4841 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:41:48 crc kubenswrapper[4841]: I0313 09:41:48.995803 4841 scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:41:48 crc kubenswrapper[4841]: E0313 09:41:48.996771 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:41:55 crc kubenswrapper[4841]: I0313 09:41:55.055418 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-ppvgx"] Mar 13 09:41:55 crc kubenswrapper[4841]: I0313 09:41:55.063320 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-ppvgx"] Mar 13 09:41:56 crc kubenswrapper[4841]: I0313 09:41:56.011257 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e627e4b-0a39-42af-be45-147ff230fd13" path="/var/lib/kubelet/pods/2e627e4b-0a39-42af-be45-147ff230fd13/volumes" Mar 13 09:41:58 crc kubenswrapper[4841]: I0313 09:41:58.345977 4841 generic.go:334] "Generic (PLEG): container finished" podID="511f249d-a5ba-4a19-a5b6-16b5c75fe538" containerID="918013528b9d345bf5392fde29e9f4e37bbd54b2a5e5f22a309b6f69d0c1f7a7" exitCode=0 Mar 13 09:41:58 crc kubenswrapper[4841]: I0313 09:41:58.346242 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk" event={"ID":"511f249d-a5ba-4a19-a5b6-16b5c75fe538","Type":"ContainerDied","Data":"918013528b9d345bf5392fde29e9f4e37bbd54b2a5e5f22a309b6f69d0c1f7a7"} Mar 13 09:41:59 crc kubenswrapper[4841]: I0313 09:41:59.733307 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk" Mar 13 09:41:59 crc kubenswrapper[4841]: I0313 09:41:59.853373 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/511f249d-a5ba-4a19-a5b6-16b5c75fe538-inventory\") pod \"511f249d-a5ba-4a19-a5b6-16b5c75fe538\" (UID: \"511f249d-a5ba-4a19-a5b6-16b5c75fe538\") " Mar 13 09:41:59 crc kubenswrapper[4841]: I0313 09:41:59.853583 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87sfl\" (UniqueName: \"kubernetes.io/projected/511f249d-a5ba-4a19-a5b6-16b5c75fe538-kube-api-access-87sfl\") pod \"511f249d-a5ba-4a19-a5b6-16b5c75fe538\" (UID: \"511f249d-a5ba-4a19-a5b6-16b5c75fe538\") " Mar 13 09:41:59 crc kubenswrapper[4841]: I0313 09:41:59.853692 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/511f249d-a5ba-4a19-a5b6-16b5c75fe538-ssh-key-openstack-edpm-ipam\") pod \"511f249d-a5ba-4a19-a5b6-16b5c75fe538\" (UID: \"511f249d-a5ba-4a19-a5b6-16b5c75fe538\") " Mar 13 09:41:59 crc kubenswrapper[4841]: I0313 09:41:59.861463 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/511f249d-a5ba-4a19-a5b6-16b5c75fe538-kube-api-access-87sfl" (OuterVolumeSpecName: "kube-api-access-87sfl") pod "511f249d-a5ba-4a19-a5b6-16b5c75fe538" (UID: "511f249d-a5ba-4a19-a5b6-16b5c75fe538"). InnerVolumeSpecName "kube-api-access-87sfl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:41:59 crc kubenswrapper[4841]: I0313 09:41:59.880554 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/511f249d-a5ba-4a19-a5b6-16b5c75fe538-inventory" (OuterVolumeSpecName: "inventory") pod "511f249d-a5ba-4a19-a5b6-16b5c75fe538" (UID: "511f249d-a5ba-4a19-a5b6-16b5c75fe538"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:41:59 crc kubenswrapper[4841]: I0313 09:41:59.907845 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/511f249d-a5ba-4a19-a5b6-16b5c75fe538-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "511f249d-a5ba-4a19-a5b6-16b5c75fe538" (UID: "511f249d-a5ba-4a19-a5b6-16b5c75fe538"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:41:59 crc kubenswrapper[4841]: I0313 09:41:59.956534 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87sfl\" (UniqueName: \"kubernetes.io/projected/511f249d-a5ba-4a19-a5b6-16b5c75fe538-kube-api-access-87sfl\") on node \"crc\" DevicePath \"\"" Mar 13 09:41:59 crc kubenswrapper[4841]: I0313 09:41:59.956587 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/511f249d-a5ba-4a19-a5b6-16b5c75fe538-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 09:41:59 crc kubenswrapper[4841]: I0313 09:41:59.956610 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/511f249d-a5ba-4a19-a5b6-16b5c75fe538-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.158549 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556582-lpb6v"] Mar 13 09:42:00 crc kubenswrapper[4841]: E0313 
09:42:00.159147 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="511f249d-a5ba-4a19-a5b6-16b5c75fe538" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.159174 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="511f249d-a5ba-4a19-a5b6-16b5c75fe538" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.159546 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="511f249d-a5ba-4a19-a5b6-16b5c75fe538" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.160529 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556582-lpb6v" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.168931 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556582-lpb6v"] Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.186315 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.187037 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.191430 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.264705 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf7n6\" (UniqueName: \"kubernetes.io/projected/effcd983-b6c7-4ea1-8aae-7cb08d290ecd-kube-api-access-lf7n6\") pod \"auto-csr-approver-29556582-lpb6v\" (UID: \"effcd983-b6c7-4ea1-8aae-7cb08d290ecd\") " pod="openshift-infra/auto-csr-approver-29556582-lpb6v" Mar 13 
09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.366430 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf7n6\" (UniqueName: \"kubernetes.io/projected/effcd983-b6c7-4ea1-8aae-7cb08d290ecd-kube-api-access-lf7n6\") pod \"auto-csr-approver-29556582-lpb6v\" (UID: \"effcd983-b6c7-4ea1-8aae-7cb08d290ecd\") " pod="openshift-infra/auto-csr-approver-29556582-lpb6v" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.372668 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk" event={"ID":"511f249d-a5ba-4a19-a5b6-16b5c75fe538","Type":"ContainerDied","Data":"396dba008ce594afc8e36bf3a8a251bc842284f6b97c808b788c1fea05895394"} Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.372713 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.372758 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="396dba008ce594afc8e36bf3a8a251bc842284f6b97c808b788c1fea05895394" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.400792 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf7n6\" (UniqueName: \"kubernetes.io/projected/effcd983-b6c7-4ea1-8aae-7cb08d290ecd-kube-api-access-lf7n6\") pod \"auto-csr-approver-29556582-lpb6v\" (UID: \"effcd983-b6c7-4ea1-8aae-7cb08d290ecd\") " pod="openshift-infra/auto-csr-approver-29556582-lpb6v" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.471187 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8smcd"] Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.472708 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8smcd" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.474676 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.474734 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rv6jt" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.476045 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.476668 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.483074 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8smcd"] Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.505393 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556582-lpb6v" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.570032 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59271a3d-6406-4e1f-a783-ba324ef8dece-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8smcd\" (UID: \"59271a3d-6406-4e1f-a783-ba324ef8dece\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8smcd" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.570178 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ccpm\" (UniqueName: \"kubernetes.io/projected/59271a3d-6406-4e1f-a783-ba324ef8dece-kube-api-access-5ccpm\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8smcd\" (UID: \"59271a3d-6406-4e1f-a783-ba324ef8dece\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8smcd" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.570427 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59271a3d-6406-4e1f-a783-ba324ef8dece-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8smcd\" (UID: \"59271a3d-6406-4e1f-a783-ba324ef8dece\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8smcd" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.672476 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59271a3d-6406-4e1f-a783-ba324ef8dece-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8smcd\" (UID: \"59271a3d-6406-4e1f-a783-ba324ef8dece\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8smcd" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 
09:42:00.672808 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ccpm\" (UniqueName: \"kubernetes.io/projected/59271a3d-6406-4e1f-a783-ba324ef8dece-kube-api-access-5ccpm\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8smcd\" (UID: \"59271a3d-6406-4e1f-a783-ba324ef8dece\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8smcd" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.672883 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59271a3d-6406-4e1f-a783-ba324ef8dece-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8smcd\" (UID: \"59271a3d-6406-4e1f-a783-ba324ef8dece\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8smcd" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.678823 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59271a3d-6406-4e1f-a783-ba324ef8dece-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8smcd\" (UID: \"59271a3d-6406-4e1f-a783-ba324ef8dece\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8smcd" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.680117 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59271a3d-6406-4e1f-a783-ba324ef8dece-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8smcd\" (UID: \"59271a3d-6406-4e1f-a783-ba324ef8dece\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8smcd" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.693998 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ccpm\" (UniqueName: 
\"kubernetes.io/projected/59271a3d-6406-4e1f-a783-ba324ef8dece-kube-api-access-5ccpm\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8smcd\" (UID: \"59271a3d-6406-4e1f-a783-ba324ef8dece\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8smcd" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.801216 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8smcd" Mar 13 09:42:00 crc kubenswrapper[4841]: I0313 09:42:00.939761 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556582-lpb6v"] Mar 13 09:42:01 crc kubenswrapper[4841]: I0313 09:42:01.034346 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-phwrx"] Mar 13 09:42:01 crc kubenswrapper[4841]: I0313 09:42:01.043928 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-phwrx"] Mar 13 09:42:01 crc kubenswrapper[4841]: I0313 09:42:01.301719 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8smcd"] Mar 13 09:42:01 crc kubenswrapper[4841]: W0313 09:42:01.303523 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59271a3d_6406_4e1f_a783_ba324ef8dece.slice/crio-0f37ebca2a71db6a5d4b0cd3da19a8f7290c2c0bbe3882887004fbebdc2aaadb WatchSource:0}: Error finding container 0f37ebca2a71db6a5d4b0cd3da19a8f7290c2c0bbe3882887004fbebdc2aaadb: Status 404 returned error can't find the container with id 0f37ebca2a71db6a5d4b0cd3da19a8f7290c2c0bbe3882887004fbebdc2aaadb Mar 13 09:42:01 crc kubenswrapper[4841]: I0313 09:42:01.382914 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8smcd" 
event={"ID":"59271a3d-6406-4e1f-a783-ba324ef8dece","Type":"ContainerStarted","Data":"0f37ebca2a71db6a5d4b0cd3da19a8f7290c2c0bbe3882887004fbebdc2aaadb"} Mar 13 09:42:01 crc kubenswrapper[4841]: I0313 09:42:01.385201 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556582-lpb6v" event={"ID":"effcd983-b6c7-4ea1-8aae-7cb08d290ecd","Type":"ContainerStarted","Data":"f218cfe9132c5e3558297abd653559212da12abc061555f7b5c5fb6ee25cecb6"} Mar 13 09:42:01 crc kubenswrapper[4841]: I0313 09:42:01.704032 4841 scope.go:117] "RemoveContainer" containerID="e74b90c92a4bde3385e75d3e43913d62347d14774fa86a3c04c8a05e35485ee6" Mar 13 09:42:01 crc kubenswrapper[4841]: I0313 09:42:01.847851 4841 scope.go:117] "RemoveContainer" containerID="92344e5d03f12c8c0c21d372021c69ea83ee6352a6ff02242725e31c3a6e08f3" Mar 13 09:42:01 crc kubenswrapper[4841]: I0313 09:42:01.887096 4841 scope.go:117] "RemoveContainer" containerID="d56b04b6f31f1967b9c274e0e5556e0c3845582d3fdda5a4c50fa9157ff947a3" Mar 13 09:42:01 crc kubenswrapper[4841]: I0313 09:42:01.931862 4841 scope.go:117] "RemoveContainer" containerID="bc99b53eeb38394454c318b0a27e8e101d60e7d5e85125e266c1b816cf338bca" Mar 13 09:42:01 crc kubenswrapper[4841]: I0313 09:42:01.981109 4841 scope.go:117] "RemoveContainer" containerID="323d1d9cb0e760236c266a8610943ae89ab93b0f61ae6c3b085aaf26f1207cd3" Mar 13 09:42:01 crc kubenswrapper[4841]: I0313 09:42:01.997023 4841 scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:42:01 crc kubenswrapper[4841]: E0313 09:42:01.997292 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" 
podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:42:02 crc kubenswrapper[4841]: I0313 09:42:02.009307 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb20acbd-2346-46cd-baba-089a6afed51b" path="/var/lib/kubelet/pods/cb20acbd-2346-46cd-baba-089a6afed51b/volumes" Mar 13 09:42:02 crc kubenswrapper[4841]: I0313 09:42:02.011258 4841 scope.go:117] "RemoveContainer" containerID="9e8651cd6d1d6733375ececc81f1085226fc3c0f06520d49bf4c191ea2a8d585" Mar 13 09:42:02 crc kubenswrapper[4841]: I0313 09:42:02.033628 4841 scope.go:117] "RemoveContainer" containerID="b8eac2a4b576d9cd640a3e082ae0e384068b44355945524f80c4718c5282858f" Mar 13 09:42:02 crc kubenswrapper[4841]: I0313 09:42:02.063140 4841 scope.go:117] "RemoveContainer" containerID="bd98d39e0f452e9b5fcd5ef818665846f27ea19723cfb7edb01e822b6f0a86ca" Mar 13 09:42:02 crc kubenswrapper[4841]: I0313 09:42:02.092200 4841 scope.go:117] "RemoveContainer" containerID="6b6a3e351fb7f6d9a0b8d502a1bdcd8091852d33875656e47426bdce532fa49c" Mar 13 09:42:02 crc kubenswrapper[4841]: I0313 09:42:02.144348 4841 scope.go:117] "RemoveContainer" containerID="cf51ced3ebcca47dcbef8caa14a2f73160682bb0649309cf74db00642b784378" Mar 13 09:42:02 crc kubenswrapper[4841]: I0313 09:42:02.168325 4841 scope.go:117] "RemoveContainer" containerID="4f78777355731824e1c5d7db6d09729a9d1bdf04886c05c557bb443dd3cfe84a" Mar 13 09:42:02 crc kubenswrapper[4841]: I0313 09:42:02.191702 4841 scope.go:117] "RemoveContainer" containerID="6cc7bf9472252c5035d94042247477285d03bc0eef5eb67e353e7deaba1ac7af" Mar 13 09:42:02 crc kubenswrapper[4841]: I0313 09:42:02.398814 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8smcd" event={"ID":"59271a3d-6406-4e1f-a783-ba324ef8dece","Type":"ContainerStarted","Data":"45a1b44dba53a9ea20e51d002c3529e6dff6c109984c2db5c3f96b4118f216f1"} Mar 13 09:42:02 crc kubenswrapper[4841]: I0313 09:42:02.401163 4841 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-infra/auto-csr-approver-29556582-lpb6v" event={"ID":"effcd983-b6c7-4ea1-8aae-7cb08d290ecd","Type":"ContainerStarted","Data":"36b951536fffacc6df51e4d6d2cb76edf371f3835583124009f2b487f01abf74"} Mar 13 09:42:02 crc kubenswrapper[4841]: I0313 09:42:02.421396 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8smcd" podStartSLOduration=2.042540104 podStartE2EDuration="2.421373625s" podCreationTimestamp="2026-03-13 09:42:00 +0000 UTC" firstStartedPulling="2026-03-13 09:42:01.306804245 +0000 UTC m=+1804.036704436" lastFinishedPulling="2026-03-13 09:42:01.685637756 +0000 UTC m=+1804.415537957" observedRunningTime="2026-03-13 09:42:02.415905638 +0000 UTC m=+1805.145805829" watchObservedRunningTime="2026-03-13 09:42:02.421373625 +0000 UTC m=+1805.151273826" Mar 13 09:42:02 crc kubenswrapper[4841]: I0313 09:42:02.438132 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556582-lpb6v" podStartSLOduration=1.452389552 podStartE2EDuration="2.438106589s" podCreationTimestamp="2026-03-13 09:42:00 +0000 UTC" firstStartedPulling="2026-03-13 09:42:00.946176253 +0000 UTC m=+1803.676076444" lastFinishedPulling="2026-03-13 09:42:01.93189329 +0000 UTC m=+1804.661793481" observedRunningTime="2026-03-13 09:42:02.430728372 +0000 UTC m=+1805.160628563" watchObservedRunningTime="2026-03-13 09:42:02.438106589 +0000 UTC m=+1805.168006780" Mar 13 09:42:03 crc kubenswrapper[4841]: I0313 09:42:03.419775 4841 generic.go:334] "Generic (PLEG): container finished" podID="effcd983-b6c7-4ea1-8aae-7cb08d290ecd" containerID="36b951536fffacc6df51e4d6d2cb76edf371f3835583124009f2b487f01abf74" exitCode=0 Mar 13 09:42:03 crc kubenswrapper[4841]: I0313 09:42:03.419885 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556582-lpb6v" 
event={"ID":"effcd983-b6c7-4ea1-8aae-7cb08d290ecd","Type":"ContainerDied","Data":"36b951536fffacc6df51e4d6d2cb76edf371f3835583124009f2b487f01abf74"} Mar 13 09:42:04 crc kubenswrapper[4841]: I0313 09:42:04.051600 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ppjwg"] Mar 13 09:42:04 crc kubenswrapper[4841]: I0313 09:42:04.063987 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ppjwg"] Mar 13 09:42:04 crc kubenswrapper[4841]: I0313 09:42:04.805583 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556582-lpb6v" Mar 13 09:42:04 crc kubenswrapper[4841]: I0313 09:42:04.866036 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf7n6\" (UniqueName: \"kubernetes.io/projected/effcd983-b6c7-4ea1-8aae-7cb08d290ecd-kube-api-access-lf7n6\") pod \"effcd983-b6c7-4ea1-8aae-7cb08d290ecd\" (UID: \"effcd983-b6c7-4ea1-8aae-7cb08d290ecd\") " Mar 13 09:42:04 crc kubenswrapper[4841]: I0313 09:42:04.873029 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/effcd983-b6c7-4ea1-8aae-7cb08d290ecd-kube-api-access-lf7n6" (OuterVolumeSpecName: "kube-api-access-lf7n6") pod "effcd983-b6c7-4ea1-8aae-7cb08d290ecd" (UID: "effcd983-b6c7-4ea1-8aae-7cb08d290ecd"). InnerVolumeSpecName "kube-api-access-lf7n6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:42:04 crc kubenswrapper[4841]: I0313 09:42:04.969257 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lf7n6\" (UniqueName: \"kubernetes.io/projected/effcd983-b6c7-4ea1-8aae-7cb08d290ecd-kube-api-access-lf7n6\") on node \"crc\" DevicePath \"\"" Mar 13 09:42:05 crc kubenswrapper[4841]: I0313 09:42:05.440860 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556582-lpb6v" event={"ID":"effcd983-b6c7-4ea1-8aae-7cb08d290ecd","Type":"ContainerDied","Data":"f218cfe9132c5e3558297abd653559212da12abc061555f7b5c5fb6ee25cecb6"} Mar 13 09:42:05 crc kubenswrapper[4841]: I0313 09:42:05.441211 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f218cfe9132c5e3558297abd653559212da12abc061555f7b5c5fb6ee25cecb6" Mar 13 09:42:05 crc kubenswrapper[4841]: I0313 09:42:05.440992 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556582-lpb6v" Mar 13 09:42:05 crc kubenswrapper[4841]: I0313 09:42:05.505007 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556576-jbxhq"] Mar 13 09:42:05 crc kubenswrapper[4841]: I0313 09:42:05.516945 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556576-jbxhq"] Mar 13 09:42:06 crc kubenswrapper[4841]: I0313 09:42:06.015694 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a64aff9-f7c3-48cb-8c68-3d0b2208a53e" path="/var/lib/kubelet/pods/0a64aff9-f7c3-48cb-8c68-3d0b2208a53e/volumes" Mar 13 09:42:06 crc kubenswrapper[4841]: I0313 09:42:06.016846 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7007e603-335c-42b3-af69-03ec5159c667" path="/var/lib/kubelet/pods/7007e603-335c-42b3-af69-03ec5159c667/volumes" Mar 13 09:42:12 crc kubenswrapper[4841]: I0313 09:42:12.995338 4841 
scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:42:12 crc kubenswrapper[4841]: E0313 09:42:12.995936 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:42:13 crc kubenswrapper[4841]: I0313 09:42:13.030435 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-gg7dx"] Mar 13 09:42:13 crc kubenswrapper[4841]: I0313 09:42:13.040797 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-gg7dx"] Mar 13 09:42:14 crc kubenswrapper[4841]: I0313 09:42:14.014776 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9132ca8c-f2de-4025-8462-4899276a8678" path="/var/lib/kubelet/pods/9132ca8c-f2de-4025-8462-4899276a8678/volumes" Mar 13 09:42:20 crc kubenswrapper[4841]: I0313 09:42:20.031076 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-q6zfh"] Mar 13 09:42:20 crc kubenswrapper[4841]: I0313 09:42:20.044005 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-q6zfh"] Mar 13 09:42:21 crc kubenswrapper[4841]: I0313 09:42:21.029149 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-9nnr7"] Mar 13 09:42:21 crc kubenswrapper[4841]: I0313 09:42:21.038818 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-9nnr7"] Mar 13 09:42:22 crc kubenswrapper[4841]: I0313 09:42:22.005163 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2038e7ba-1de4-49b4-95dd-b2f3cde7be45" 
path="/var/lib/kubelet/pods/2038e7ba-1de4-49b4-95dd-b2f3cde7be45/volumes" Mar 13 09:42:22 crc kubenswrapper[4841]: I0313 09:42:22.007997 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d602e6f-77e5-4496-b426-2c003dad63e4" path="/var/lib/kubelet/pods/5d602e6f-77e5-4496-b426-2c003dad63e4/volumes" Mar 13 09:42:26 crc kubenswrapper[4841]: I0313 09:42:26.996050 4841 scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:42:26 crc kubenswrapper[4841]: E0313 09:42:26.996880 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:42:40 crc kubenswrapper[4841]: I0313 09:42:40.995117 4841 scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:42:41 crc kubenswrapper[4841]: E0313 09:42:41.001366 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:42:52 crc kubenswrapper[4841]: I0313 09:42:52.995315 4841 scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:42:52 crc kubenswrapper[4841]: E0313 09:42:52.996338 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:43:00 crc kubenswrapper[4841]: I0313 09:43:00.970651 4841 generic.go:334] "Generic (PLEG): container finished" podID="59271a3d-6406-4e1f-a783-ba324ef8dece" containerID="45a1b44dba53a9ea20e51d002c3529e6dff6c109984c2db5c3f96b4118f216f1" exitCode=0 Mar 13 09:43:00 crc kubenswrapper[4841]: I0313 09:43:00.970731 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8smcd" event={"ID":"59271a3d-6406-4e1f-a783-ba324ef8dece","Type":"ContainerDied","Data":"45a1b44dba53a9ea20e51d002c3529e6dff6c109984c2db5c3f96b4118f216f1"} Mar 13 09:43:01 crc kubenswrapper[4841]: E0313 09:43:01.100045 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59271a3d_6406_4e1f_a783_ba324ef8dece.slice/crio-45a1b44dba53a9ea20e51d002c3529e6dff6c109984c2db5c3f96b4118f216f1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59271a3d_6406_4e1f_a783_ba324ef8dece.slice/crio-conmon-45a1b44dba53a9ea20e51d002c3529e6dff6c109984c2db5c3f96b4118f216f1.scope\": RecentStats: unable to find data in memory cache]" Mar 13 09:43:02 crc kubenswrapper[4841]: I0313 09:43:02.433617 4841 scope.go:117] "RemoveContainer" containerID="a17e09c453292655b494b3d66f0eac640aa78ef8872bc81089c5f45e55af8957" Mar 13 09:43:03 crc kubenswrapper[4841]: I0313 09:43:03.051936 4841 scope.go:117] "RemoveContainer" containerID="58fdcfcf849ffb97d7153cf927d93c32acae4959d41ed0b6e7563f6a919ce267" Mar 13 09:43:03 crc 
kubenswrapper[4841]: I0313 09:43:03.106923 4841 scope.go:117] "RemoveContainer" containerID="4603393b1be8277f434e2375f8309099e7b56ec64f63b0d5b6afa699ed810ff5" Mar 13 09:43:03 crc kubenswrapper[4841]: I0313 09:43:03.144031 4841 scope.go:117] "RemoveContainer" containerID="02e39b1af8c6d548176c72640c9cbce8658bd7f178e42c1632b24d2512c3ae3d" Mar 13 09:43:03 crc kubenswrapper[4841]: I0313 09:43:03.246049 4841 scope.go:117] "RemoveContainer" containerID="533ad17ad7245e2e5e9cf12281353640ebedf1c55eb5e83ded41729cecc683ed" Mar 13 09:43:03 crc kubenswrapper[4841]: I0313 09:43:03.260250 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8smcd" Mar 13 09:43:03 crc kubenswrapper[4841]: I0313 09:43:03.301715 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ccpm\" (UniqueName: \"kubernetes.io/projected/59271a3d-6406-4e1f-a783-ba324ef8dece-kube-api-access-5ccpm\") pod \"59271a3d-6406-4e1f-a783-ba324ef8dece\" (UID: \"59271a3d-6406-4e1f-a783-ba324ef8dece\") " Mar 13 09:43:03 crc kubenswrapper[4841]: I0313 09:43:03.301855 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59271a3d-6406-4e1f-a783-ba324ef8dece-inventory\") pod \"59271a3d-6406-4e1f-a783-ba324ef8dece\" (UID: \"59271a3d-6406-4e1f-a783-ba324ef8dece\") " Mar 13 09:43:03 crc kubenswrapper[4841]: I0313 09:43:03.301895 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59271a3d-6406-4e1f-a783-ba324ef8dece-ssh-key-openstack-edpm-ipam\") pod \"59271a3d-6406-4e1f-a783-ba324ef8dece\" (UID: \"59271a3d-6406-4e1f-a783-ba324ef8dece\") " Mar 13 09:43:03 crc kubenswrapper[4841]: I0313 09:43:03.308809 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/59271a3d-6406-4e1f-a783-ba324ef8dece-kube-api-access-5ccpm" (OuterVolumeSpecName: "kube-api-access-5ccpm") pod "59271a3d-6406-4e1f-a783-ba324ef8dece" (UID: "59271a3d-6406-4e1f-a783-ba324ef8dece"). InnerVolumeSpecName "kube-api-access-5ccpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:43:03 crc kubenswrapper[4841]: I0313 09:43:03.332614 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59271a3d-6406-4e1f-a783-ba324ef8dece-inventory" (OuterVolumeSpecName: "inventory") pod "59271a3d-6406-4e1f-a783-ba324ef8dece" (UID: "59271a3d-6406-4e1f-a783-ba324ef8dece"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:43:03 crc kubenswrapper[4841]: I0313 09:43:03.334635 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59271a3d-6406-4e1f-a783-ba324ef8dece-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "59271a3d-6406-4e1f-a783-ba324ef8dece" (UID: "59271a3d-6406-4e1f-a783-ba324ef8dece"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:43:03 crc kubenswrapper[4841]: I0313 09:43:03.403330 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59271a3d-6406-4e1f-a783-ba324ef8dece-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 09:43:03 crc kubenswrapper[4841]: I0313 09:43:03.403360 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59271a3d-6406-4e1f-a783-ba324ef8dece-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 09:43:03 crc kubenswrapper[4841]: I0313 09:43:03.403371 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ccpm\" (UniqueName: \"kubernetes.io/projected/59271a3d-6406-4e1f-a783-ba324ef8dece-kube-api-access-5ccpm\") on node \"crc\" DevicePath \"\"" Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:03.998595 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8smcd" Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.033046 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8smcd" event={"ID":"59271a3d-6406-4e1f-a783-ba324ef8dece","Type":"ContainerDied","Data":"0f37ebca2a71db6a5d4b0cd3da19a8f7290c2c0bbe3882887004fbebdc2aaadb"} Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.033326 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f37ebca2a71db6a5d4b0cd3da19a8f7290c2c0bbe3882887004fbebdc2aaadb" Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.378184 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx"] Mar 13 09:43:04 crc kubenswrapper[4841]: E0313 09:43:04.378699 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="59271a3d-6406-4e1f-a783-ba324ef8dece" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.378724 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="59271a3d-6406-4e1f-a783-ba324ef8dece" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 13 09:43:04 crc kubenswrapper[4841]: E0313 09:43:04.378742 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="effcd983-b6c7-4ea1-8aae-7cb08d290ecd" containerName="oc" Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.378749 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="effcd983-b6c7-4ea1-8aae-7cb08d290ecd" containerName="oc" Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.378966 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="59271a3d-6406-4e1f-a783-ba324ef8dece" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.378996 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="effcd983-b6c7-4ea1-8aae-7cb08d290ecd" containerName="oc" Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.379682 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx" Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.382823 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.383232 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.383447 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rv6jt" Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.389872 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.395314 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx"] Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.422882 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbz2g\" (UniqueName: \"kubernetes.io/projected/a14a214a-62da-44fc-b3d3-749fff9b3645-kube-api-access-xbz2g\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx\" (UID: \"a14a214a-62da-44fc-b3d3-749fff9b3645\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx" Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.422960 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a14a214a-62da-44fc-b3d3-749fff9b3645-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx\" (UID: \"a14a214a-62da-44fc-b3d3-749fff9b3645\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx" Mar 13 
09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.423004 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a14a214a-62da-44fc-b3d3-749fff9b3645-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx\" (UID: \"a14a214a-62da-44fc-b3d3-749fff9b3645\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx" Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.524475 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbz2g\" (UniqueName: \"kubernetes.io/projected/a14a214a-62da-44fc-b3d3-749fff9b3645-kube-api-access-xbz2g\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx\" (UID: \"a14a214a-62da-44fc-b3d3-749fff9b3645\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx" Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.524802 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a14a214a-62da-44fc-b3d3-749fff9b3645-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx\" (UID: \"a14a214a-62da-44fc-b3d3-749fff9b3645\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx" Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.524964 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a14a214a-62da-44fc-b3d3-749fff9b3645-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx\" (UID: \"a14a214a-62da-44fc-b3d3-749fff9b3645\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx" Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.529452 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/a14a214a-62da-44fc-b3d3-749fff9b3645-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx\" (UID: \"a14a214a-62da-44fc-b3d3-749fff9b3645\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx" Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.529633 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a14a214a-62da-44fc-b3d3-749fff9b3645-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx\" (UID: \"a14a214a-62da-44fc-b3d3-749fff9b3645\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx" Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.541416 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbz2g\" (UniqueName: \"kubernetes.io/projected/a14a214a-62da-44fc-b3d3-749fff9b3645-kube-api-access-xbz2g\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx\" (UID: \"a14a214a-62da-44fc-b3d3-749fff9b3645\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx" Mar 13 09:43:04 crc kubenswrapper[4841]: I0313 09:43:04.699751 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx" Mar 13 09:43:05 crc kubenswrapper[4841]: I0313 09:43:05.297601 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx"] Mar 13 09:43:05 crc kubenswrapper[4841]: I0313 09:43:05.311771 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 09:43:05 crc kubenswrapper[4841]: I0313 09:43:05.995006 4841 scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:43:05 crc kubenswrapper[4841]: E0313 09:43:05.995404 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:43:06 crc kubenswrapper[4841]: I0313 09:43:06.029529 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx" event={"ID":"a14a214a-62da-44fc-b3d3-749fff9b3645","Type":"ContainerStarted","Data":"22613fe34e4c650bbb2e321b6eaf0fe00fa31790f01b8d1c36ad913861914d10"} Mar 13 09:43:06 crc kubenswrapper[4841]: I0313 09:43:06.089123 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-482qc"] Mar 13 09:43:06 crc kubenswrapper[4841]: I0313 09:43:06.104916 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-f075-account-create-update-w4d9x"] Mar 13 09:43:06 crc kubenswrapper[4841]: I0313 09:43:06.117856 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-xh558"] Mar 13 09:43:06 crc 
kubenswrapper[4841]: I0313 09:43:06.125465 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-f075-account-create-update-w4d9x"] Mar 13 09:43:06 crc kubenswrapper[4841]: I0313 09:43:06.136776 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-482qc"] Mar 13 09:43:06 crc kubenswrapper[4841]: I0313 09:43:06.145643 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-xh558"] Mar 13 09:43:07 crc kubenswrapper[4841]: I0313 09:43:07.039333 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-d577-account-create-update-pj2d2"] Mar 13 09:43:07 crc kubenswrapper[4841]: I0313 09:43:07.043369 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx" event={"ID":"a14a214a-62da-44fc-b3d3-749fff9b3645","Type":"ContainerStarted","Data":"0e4dc1b477d024df7a6d2053ba0529c82d3e344e1d4a45a9c0b4aa3dbbeda7ee"} Mar 13 09:43:07 crc kubenswrapper[4841]: I0313 09:43:07.063696 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-vx4xd"] Mar 13 09:43:07 crc kubenswrapper[4841]: I0313 09:43:07.078491 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-e604-account-create-update-cf2sr"] Mar 13 09:43:07 crc kubenswrapper[4841]: I0313 09:43:07.089740 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-e604-account-create-update-cf2sr"] Mar 13 09:43:07 crc kubenswrapper[4841]: I0313 09:43:07.102990 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-d577-account-create-update-pj2d2"] Mar 13 09:43:07 crc kubenswrapper[4841]: I0313 09:43:07.110627 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-vx4xd"] Mar 13 09:43:07 crc kubenswrapper[4841]: I0313 09:43:07.115289 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx" podStartSLOduration=2.392704125 podStartE2EDuration="3.115251549s" podCreationTimestamp="2026-03-13 09:43:04 +0000 UTC" firstStartedPulling="2026-03-13 09:43:05.311488468 +0000 UTC m=+1868.041388659" lastFinishedPulling="2026-03-13 09:43:06.034035892 +0000 UTC m=+1868.763936083" observedRunningTime="2026-03-13 09:43:07.059986224 +0000 UTC m=+1869.789886425" watchObservedRunningTime="2026-03-13 09:43:07.115251549 +0000 UTC m=+1869.845151740" Mar 13 09:43:08 crc kubenswrapper[4841]: I0313 09:43:08.013398 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10d1db8e-6619-436b-b998-cea606b30b5c" path="/var/lib/kubelet/pods/10d1db8e-6619-436b-b998-cea606b30b5c/volumes" Mar 13 09:43:08 crc kubenswrapper[4841]: I0313 09:43:08.015021 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10eed5d3-9fc4-4322-b9ef-c299a32a94bb" path="/var/lib/kubelet/pods/10eed5d3-9fc4-4322-b9ef-c299a32a94bb/volumes" Mar 13 09:43:08 crc kubenswrapper[4841]: I0313 09:43:08.015976 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69e96c8f-e077-499b-9b31-3984ad159364" path="/var/lib/kubelet/pods/69e96c8f-e077-499b-9b31-3984ad159364/volumes" Mar 13 09:43:08 crc kubenswrapper[4841]: I0313 09:43:08.017928 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ea1b844-20ae-4d19-8df2-3d472391080a" path="/var/lib/kubelet/pods/8ea1b844-20ae-4d19-8df2-3d472391080a/volumes" Mar 13 09:43:08 crc kubenswrapper[4841]: I0313 09:43:08.019609 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f9980f7-5596-4002-b087-04cb53c6a78a" path="/var/lib/kubelet/pods/9f9980f7-5596-4002-b087-04cb53c6a78a/volumes" Mar 13 09:43:08 crc kubenswrapper[4841]: I0313 09:43:08.020536 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf0add86-8160-48dd-a9b5-bef3edcf4810" 
path="/var/lib/kubelet/pods/bf0add86-8160-48dd-a9b5-bef3edcf4810/volumes" Mar 13 09:43:11 crc kubenswrapper[4841]: I0313 09:43:11.083469 4841 generic.go:334] "Generic (PLEG): container finished" podID="a14a214a-62da-44fc-b3d3-749fff9b3645" containerID="0e4dc1b477d024df7a6d2053ba0529c82d3e344e1d4a45a9c0b4aa3dbbeda7ee" exitCode=0 Mar 13 09:43:11 crc kubenswrapper[4841]: I0313 09:43:11.083520 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx" event={"ID":"a14a214a-62da-44fc-b3d3-749fff9b3645","Type":"ContainerDied","Data":"0e4dc1b477d024df7a6d2053ba0529c82d3e344e1d4a45a9c0b4aa3dbbeda7ee"} Mar 13 09:43:12 crc kubenswrapper[4841]: I0313 09:43:12.590281 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx" Mar 13 09:43:12 crc kubenswrapper[4841]: I0313 09:43:12.634127 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a14a214a-62da-44fc-b3d3-749fff9b3645-inventory\") pod \"a14a214a-62da-44fc-b3d3-749fff9b3645\" (UID: \"a14a214a-62da-44fc-b3d3-749fff9b3645\") " Mar 13 09:43:12 crc kubenswrapper[4841]: I0313 09:43:12.634179 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a14a214a-62da-44fc-b3d3-749fff9b3645-ssh-key-openstack-edpm-ipam\") pod \"a14a214a-62da-44fc-b3d3-749fff9b3645\" (UID: \"a14a214a-62da-44fc-b3d3-749fff9b3645\") " Mar 13 09:43:12 crc kubenswrapper[4841]: I0313 09:43:12.634428 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbz2g\" (UniqueName: \"kubernetes.io/projected/a14a214a-62da-44fc-b3d3-749fff9b3645-kube-api-access-xbz2g\") pod \"a14a214a-62da-44fc-b3d3-749fff9b3645\" (UID: \"a14a214a-62da-44fc-b3d3-749fff9b3645\") " Mar 13 09:43:12 crc 
kubenswrapper[4841]: I0313 09:43:12.643718 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a14a214a-62da-44fc-b3d3-749fff9b3645-kube-api-access-xbz2g" (OuterVolumeSpecName: "kube-api-access-xbz2g") pod "a14a214a-62da-44fc-b3d3-749fff9b3645" (UID: "a14a214a-62da-44fc-b3d3-749fff9b3645"). InnerVolumeSpecName "kube-api-access-xbz2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:43:12 crc kubenswrapper[4841]: I0313 09:43:12.671709 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a14a214a-62da-44fc-b3d3-749fff9b3645-inventory" (OuterVolumeSpecName: "inventory") pod "a14a214a-62da-44fc-b3d3-749fff9b3645" (UID: "a14a214a-62da-44fc-b3d3-749fff9b3645"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:43:12 crc kubenswrapper[4841]: I0313 09:43:12.672135 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a14a214a-62da-44fc-b3d3-749fff9b3645-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a14a214a-62da-44fc-b3d3-749fff9b3645" (UID: "a14a214a-62da-44fc-b3d3-749fff9b3645"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:43:12 crc kubenswrapper[4841]: I0313 09:43:12.736539 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbz2g\" (UniqueName: \"kubernetes.io/projected/a14a214a-62da-44fc-b3d3-749fff9b3645-kube-api-access-xbz2g\") on node \"crc\" DevicePath \"\"" Mar 13 09:43:12 crc kubenswrapper[4841]: I0313 09:43:12.736914 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a14a214a-62da-44fc-b3d3-749fff9b3645-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 09:43:12 crc kubenswrapper[4841]: I0313 09:43:12.736933 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a14a214a-62da-44fc-b3d3-749fff9b3645-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 09:43:13 crc kubenswrapper[4841]: I0313 09:43:13.105946 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx" event={"ID":"a14a214a-62da-44fc-b3d3-749fff9b3645","Type":"ContainerDied","Data":"22613fe34e4c650bbb2e321b6eaf0fe00fa31790f01b8d1c36ad913861914d10"} Mar 13 09:43:13 crc kubenswrapper[4841]: I0313 09:43:13.105985 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22613fe34e4c650bbb2e321b6eaf0fe00fa31790f01b8d1c36ad913861914d10" Mar 13 09:43:13 crc kubenswrapper[4841]: I0313 09:43:13.106017 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx" Mar 13 09:43:13 crc kubenswrapper[4841]: I0313 09:43:13.187075 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-szndw"] Mar 13 09:43:13 crc kubenswrapper[4841]: E0313 09:43:13.187747 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a14a214a-62da-44fc-b3d3-749fff9b3645" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 13 09:43:13 crc kubenswrapper[4841]: I0313 09:43:13.187791 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a14a214a-62da-44fc-b3d3-749fff9b3645" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 13 09:43:13 crc kubenswrapper[4841]: I0313 09:43:13.188154 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a14a214a-62da-44fc-b3d3-749fff9b3645" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 13 09:43:13 crc kubenswrapper[4841]: I0313 09:43:13.189105 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-szndw" Mar 13 09:43:13 crc kubenswrapper[4841]: I0313 09:43:13.192504 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 09:43:13 crc kubenswrapper[4841]: I0313 09:43:13.192723 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 09:43:13 crc kubenswrapper[4841]: I0313 09:43:13.192913 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rv6jt" Mar 13 09:43:13 crc kubenswrapper[4841]: I0313 09:43:13.193097 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 09:43:13 crc kubenswrapper[4841]: I0313 09:43:13.202974 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-szndw"] Mar 13 09:43:13 crc kubenswrapper[4841]: I0313 09:43:13.246535 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e611c0f-aa46-4280-ae2d-bdff4bf61b60-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-szndw\" (UID: \"6e611c0f-aa46-4280-ae2d-bdff4bf61b60\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-szndw" Mar 13 09:43:13 crc kubenswrapper[4841]: I0313 09:43:13.246633 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e611c0f-aa46-4280-ae2d-bdff4bf61b60-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-szndw\" (UID: \"6e611c0f-aa46-4280-ae2d-bdff4bf61b60\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-szndw" Mar 13 09:43:13 crc kubenswrapper[4841]: I0313 09:43:13.246703 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnh7g\" (UniqueName: \"kubernetes.io/projected/6e611c0f-aa46-4280-ae2d-bdff4bf61b60-kube-api-access-qnh7g\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-szndw\" (UID: \"6e611c0f-aa46-4280-ae2d-bdff4bf61b60\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-szndw" Mar 13 09:43:13 crc kubenswrapper[4841]: I0313 09:43:13.348298 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e611c0f-aa46-4280-ae2d-bdff4bf61b60-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-szndw\" (UID: \"6e611c0f-aa46-4280-ae2d-bdff4bf61b60\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-szndw" Mar 13 09:43:13 crc kubenswrapper[4841]: I0313 09:43:13.348387 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e611c0f-aa46-4280-ae2d-bdff4bf61b60-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-szndw\" (UID: \"6e611c0f-aa46-4280-ae2d-bdff4bf61b60\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-szndw" Mar 13 09:43:13 crc kubenswrapper[4841]: I0313 09:43:13.348454 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnh7g\" (UniqueName: \"kubernetes.io/projected/6e611c0f-aa46-4280-ae2d-bdff4bf61b60-kube-api-access-qnh7g\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-szndw\" (UID: \"6e611c0f-aa46-4280-ae2d-bdff4bf61b60\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-szndw" Mar 13 09:43:13 crc kubenswrapper[4841]: I0313 09:43:13.352050 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e611c0f-aa46-4280-ae2d-bdff4bf61b60-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-szndw\" (UID: \"6e611c0f-aa46-4280-ae2d-bdff4bf61b60\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-szndw" Mar 13 09:43:13 crc kubenswrapper[4841]: I0313 09:43:13.352517 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e611c0f-aa46-4280-ae2d-bdff4bf61b60-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-szndw\" (UID: \"6e611c0f-aa46-4280-ae2d-bdff4bf61b60\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-szndw" Mar 13 09:43:13 crc kubenswrapper[4841]: I0313 09:43:13.364548 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnh7g\" (UniqueName: \"kubernetes.io/projected/6e611c0f-aa46-4280-ae2d-bdff4bf61b60-kube-api-access-qnh7g\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-szndw\" (UID: \"6e611c0f-aa46-4280-ae2d-bdff4bf61b60\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-szndw" Mar 13 09:43:13 crc kubenswrapper[4841]: I0313 09:43:13.520292 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-szndw" Mar 13 09:43:14 crc kubenswrapper[4841]: I0313 09:43:14.067876 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-szndw"] Mar 13 09:43:14 crc kubenswrapper[4841]: I0313 09:43:14.114013 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-szndw" event={"ID":"6e611c0f-aa46-4280-ae2d-bdff4bf61b60","Type":"ContainerStarted","Data":"6bb47d77b9d5abee3be47c47246481782aee02f68f56b5204f01635a1633c91e"} Mar 13 09:43:15 crc kubenswrapper[4841]: I0313 09:43:15.126570 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-szndw" event={"ID":"6e611c0f-aa46-4280-ae2d-bdff4bf61b60","Type":"ContainerStarted","Data":"8b12d0e57e42cf1c4211564562b99dd5c3ad487ce88b87212ebbd5815ebd1d6e"} Mar 13 09:43:15 crc kubenswrapper[4841]: I0313 09:43:15.163150 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-szndw" podStartSLOduration=1.723277194 podStartE2EDuration="2.163120596s" podCreationTimestamp="2026-03-13 09:43:13 +0000 UTC" firstStartedPulling="2026-03-13 09:43:14.074198292 +0000 UTC m=+1876.804098483" lastFinishedPulling="2026-03-13 09:43:14.514041694 +0000 UTC m=+1877.243941885" observedRunningTime="2026-03-13 09:43:15.14337912 +0000 UTC m=+1877.873279341" watchObservedRunningTime="2026-03-13 09:43:15.163120596 +0000 UTC m=+1877.893020797" Mar 13 09:43:20 crc kubenswrapper[4841]: I0313 09:43:20.995716 4841 scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:43:20 crc kubenswrapper[4841]: E0313 09:43:20.996519 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:43:35 crc kubenswrapper[4841]: I0313 09:43:35.995529 4841 scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:43:36 crc kubenswrapper[4841]: I0313 09:43:36.314570 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerStarted","Data":"fcd17e9cebe8ee5c7b1749619cc08d15d1f1b747829783b5902c03af4c6172bc"} Mar 13 09:43:37 crc kubenswrapper[4841]: I0313 09:43:37.051370 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-24ghd"] Mar 13 09:43:37 crc kubenswrapper[4841]: I0313 09:43:37.065754 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-24ghd"] Mar 13 09:43:38 crc kubenswrapper[4841]: I0313 09:43:38.015087 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7def500c-116f-47bb-b58d-23e07d7a0771" path="/var/lib/kubelet/pods/7def500c-116f-47bb-b58d-23e07d7a0771/volumes" Mar 13 09:43:47 crc kubenswrapper[4841]: I0313 09:43:47.409628 4841 generic.go:334] "Generic (PLEG): container finished" podID="6e611c0f-aa46-4280-ae2d-bdff4bf61b60" containerID="8b12d0e57e42cf1c4211564562b99dd5c3ad487ce88b87212ebbd5815ebd1d6e" exitCode=0 Mar 13 09:43:47 crc kubenswrapper[4841]: I0313 09:43:47.409715 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-szndw" event={"ID":"6e611c0f-aa46-4280-ae2d-bdff4bf61b60","Type":"ContainerDied","Data":"8b12d0e57e42cf1c4211564562b99dd5c3ad487ce88b87212ebbd5815ebd1d6e"} Mar 13 
09:43:48 crc kubenswrapper[4841]: I0313 09:43:48.843341 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-szndw" Mar 13 09:43:48 crc kubenswrapper[4841]: I0313 09:43:48.930126 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e611c0f-aa46-4280-ae2d-bdff4bf61b60-ssh-key-openstack-edpm-ipam\") pod \"6e611c0f-aa46-4280-ae2d-bdff4bf61b60\" (UID: \"6e611c0f-aa46-4280-ae2d-bdff4bf61b60\") " Mar 13 09:43:48 crc kubenswrapper[4841]: I0313 09:43:48.930178 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnh7g\" (UniqueName: \"kubernetes.io/projected/6e611c0f-aa46-4280-ae2d-bdff4bf61b60-kube-api-access-qnh7g\") pod \"6e611c0f-aa46-4280-ae2d-bdff4bf61b60\" (UID: \"6e611c0f-aa46-4280-ae2d-bdff4bf61b60\") " Mar 13 09:43:48 crc kubenswrapper[4841]: I0313 09:43:48.930257 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e611c0f-aa46-4280-ae2d-bdff4bf61b60-inventory\") pod \"6e611c0f-aa46-4280-ae2d-bdff4bf61b60\" (UID: \"6e611c0f-aa46-4280-ae2d-bdff4bf61b60\") " Mar 13 09:43:48 crc kubenswrapper[4841]: I0313 09:43:48.935966 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e611c0f-aa46-4280-ae2d-bdff4bf61b60-kube-api-access-qnh7g" (OuterVolumeSpecName: "kube-api-access-qnh7g") pod "6e611c0f-aa46-4280-ae2d-bdff4bf61b60" (UID: "6e611c0f-aa46-4280-ae2d-bdff4bf61b60"). InnerVolumeSpecName "kube-api-access-qnh7g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:43:48 crc kubenswrapper[4841]: I0313 09:43:48.957094 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e611c0f-aa46-4280-ae2d-bdff4bf61b60-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6e611c0f-aa46-4280-ae2d-bdff4bf61b60" (UID: "6e611c0f-aa46-4280-ae2d-bdff4bf61b60"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:43:48 crc kubenswrapper[4841]: I0313 09:43:48.969777 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e611c0f-aa46-4280-ae2d-bdff4bf61b60-inventory" (OuterVolumeSpecName: "inventory") pod "6e611c0f-aa46-4280-ae2d-bdff4bf61b60" (UID: "6e611c0f-aa46-4280-ae2d-bdff4bf61b60"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.034489 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e611c0f-aa46-4280-ae2d-bdff4bf61b60-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.034541 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnh7g\" (UniqueName: \"kubernetes.io/projected/6e611c0f-aa46-4280-ae2d-bdff4bf61b60-kube-api-access-qnh7g\") on node \"crc\" DevicePath \"\"" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.034556 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e611c0f-aa46-4280-ae2d-bdff4bf61b60-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.435457 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-szndw" 
event={"ID":"6e611c0f-aa46-4280-ae2d-bdff4bf61b60","Type":"ContainerDied","Data":"6bb47d77b9d5abee3be47c47246481782aee02f68f56b5204f01635a1633c91e"} Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.435511 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-szndw" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.435528 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bb47d77b9d5abee3be47c47246481782aee02f68f56b5204f01635a1633c91e" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.550644 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p"] Mar 13 09:43:49 crc kubenswrapper[4841]: E0313 09:43:49.551023 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e611c0f-aa46-4280-ae2d-bdff4bf61b60" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.551041 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e611c0f-aa46-4280-ae2d-bdff4bf61b60" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.551221 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e611c0f-aa46-4280-ae2d-bdff4bf61b60" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.551809 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.555434 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.555703 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.556548 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.557252 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rv6jt" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.572871 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p"] Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.647179 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3c90f3c-6382-4a13-b4cd-515cfe68538e-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p\" (UID: \"f3c90f3c-6382-4a13-b4cd-515cfe68538e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.647298 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3c90f3c-6382-4a13-b4cd-515cfe68538e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p\" (UID: \"f3c90f3c-6382-4a13-b4cd-515cfe68538e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.647324 
4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgmhl\" (UniqueName: \"kubernetes.io/projected/f3c90f3c-6382-4a13-b4cd-515cfe68538e-kube-api-access-rgmhl\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p\" (UID: \"f3c90f3c-6382-4a13-b4cd-515cfe68538e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.749440 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3c90f3c-6382-4a13-b4cd-515cfe68538e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p\" (UID: \"f3c90f3c-6382-4a13-b4cd-515cfe68538e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.749501 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgmhl\" (UniqueName: \"kubernetes.io/projected/f3c90f3c-6382-4a13-b4cd-515cfe68538e-kube-api-access-rgmhl\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p\" (UID: \"f3c90f3c-6382-4a13-b4cd-515cfe68538e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.749705 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3c90f3c-6382-4a13-b4cd-515cfe68538e-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p\" (UID: \"f3c90f3c-6382-4a13-b4cd-515cfe68538e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.753828 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3c90f3c-6382-4a13-b4cd-515cfe68538e-inventory\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p\" (UID: \"f3c90f3c-6382-4a13-b4cd-515cfe68538e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.755646 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3c90f3c-6382-4a13-b4cd-515cfe68538e-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p\" (UID: \"f3c90f3c-6382-4a13-b4cd-515cfe68538e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.777030 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgmhl\" (UniqueName: \"kubernetes.io/projected/f3c90f3c-6382-4a13-b4cd-515cfe68538e-kube-api-access-rgmhl\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p\" (UID: \"f3c90f3c-6382-4a13-b4cd-515cfe68538e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p" Mar 13 09:43:49 crc kubenswrapper[4841]: I0313 09:43:49.874777 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p" Mar 13 09:43:50 crc kubenswrapper[4841]: W0313 09:43:50.427345 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3c90f3c_6382_4a13_b4cd_515cfe68538e.slice/crio-a5b9e9081c1bcfdf772ad41869a2f1797da357aa6c54fb1bf24bd7702e866acb WatchSource:0}: Error finding container a5b9e9081c1bcfdf772ad41869a2f1797da357aa6c54fb1bf24bd7702e866acb: Status 404 returned error can't find the container with id a5b9e9081c1bcfdf772ad41869a2f1797da357aa6c54fb1bf24bd7702e866acb Mar 13 09:43:50 crc kubenswrapper[4841]: I0313 09:43:50.438594 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p"] Mar 13 09:43:50 crc kubenswrapper[4841]: I0313 09:43:50.450350 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p" event={"ID":"f3c90f3c-6382-4a13-b4cd-515cfe68538e","Type":"ContainerStarted","Data":"a5b9e9081c1bcfdf772ad41869a2f1797da357aa6c54fb1bf24bd7702e866acb"} Mar 13 09:43:51 crc kubenswrapper[4841]: I0313 09:43:51.467034 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p" event={"ID":"f3c90f3c-6382-4a13-b4cd-515cfe68538e","Type":"ContainerStarted","Data":"9d74581213b82af9df923a07e95c3fc810ef413ca04f52c0e725f4b96b5e6362"} Mar 13 09:43:51 crc kubenswrapper[4841]: I0313 09:43:51.497666 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p" podStartSLOduration=2.009171367 podStartE2EDuration="2.497645771s" podCreationTimestamp="2026-03-13 09:43:49 +0000 UTC" firstStartedPulling="2026-03-13 09:43:50.431058323 +0000 UTC m=+1913.160958514" lastFinishedPulling="2026-03-13 09:43:50.919532717 +0000 UTC m=+1913.649432918" 
observedRunningTime="2026-03-13 09:43:51.48324897 +0000 UTC m=+1914.213149181" watchObservedRunningTime="2026-03-13 09:43:51.497645771 +0000 UTC m=+1914.227545972" Mar 13 09:44:00 crc kubenswrapper[4841]: I0313 09:44:00.148784 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556584-bs22k"] Mar 13 09:44:00 crc kubenswrapper[4841]: I0313 09:44:00.150765 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556584-bs22k" Mar 13 09:44:00 crc kubenswrapper[4841]: I0313 09:44:00.154498 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 09:44:00 crc kubenswrapper[4841]: I0313 09:44:00.154501 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 09:44:00 crc kubenswrapper[4841]: I0313 09:44:00.154636 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 09:44:00 crc kubenswrapper[4841]: I0313 09:44:00.156431 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556584-bs22k"] Mar 13 09:44:00 crc kubenswrapper[4841]: I0313 09:44:00.182433 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dzbp\" (UniqueName: \"kubernetes.io/projected/0f00a74f-0fc0-448b-94db-5715aa58a74f-kube-api-access-6dzbp\") pod \"auto-csr-approver-29556584-bs22k\" (UID: \"0f00a74f-0fc0-448b-94db-5715aa58a74f\") " pod="openshift-infra/auto-csr-approver-29556584-bs22k" Mar 13 09:44:00 crc kubenswrapper[4841]: I0313 09:44:00.284132 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dzbp\" (UniqueName: \"kubernetes.io/projected/0f00a74f-0fc0-448b-94db-5715aa58a74f-kube-api-access-6dzbp\") pod \"auto-csr-approver-29556584-bs22k\" (UID: 
\"0f00a74f-0fc0-448b-94db-5715aa58a74f\") " pod="openshift-infra/auto-csr-approver-29556584-bs22k" Mar 13 09:44:00 crc kubenswrapper[4841]: I0313 09:44:00.302842 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dzbp\" (UniqueName: \"kubernetes.io/projected/0f00a74f-0fc0-448b-94db-5715aa58a74f-kube-api-access-6dzbp\") pod \"auto-csr-approver-29556584-bs22k\" (UID: \"0f00a74f-0fc0-448b-94db-5715aa58a74f\") " pod="openshift-infra/auto-csr-approver-29556584-bs22k" Mar 13 09:44:00 crc kubenswrapper[4841]: I0313 09:44:00.470980 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556584-bs22k" Mar 13 09:44:00 crc kubenswrapper[4841]: I0313 09:44:00.919560 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556584-bs22k"] Mar 13 09:44:01 crc kubenswrapper[4841]: I0313 09:44:01.034512 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-rc4xv"] Mar 13 09:44:01 crc kubenswrapper[4841]: I0313 09:44:01.046822 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-rc4xv"] Mar 13 09:44:01 crc kubenswrapper[4841]: I0313 09:44:01.587004 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556584-bs22k" event={"ID":"0f00a74f-0fc0-448b-94db-5715aa58a74f","Type":"ContainerStarted","Data":"136c97487431b2c0298a56a8685a6174d8e1d8d28ab8b3afe23d6149bcd4d8df"} Mar 13 09:44:02 crc kubenswrapper[4841]: I0313 09:44:02.025200 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b05d61d-3de0-4314-b5f3-07a447bc3465" path="/var/lib/kubelet/pods/1b05d61d-3de0-4314-b5f3-07a447bc3465/volumes" Mar 13 09:44:02 crc kubenswrapper[4841]: I0313 09:44:02.033833 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xrtxl"] Mar 13 09:44:02 crc kubenswrapper[4841]: I0313 
09:44:02.040071 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xrtxl"] Mar 13 09:44:02 crc kubenswrapper[4841]: I0313 09:44:02.768293 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556584-bs22k" event={"ID":"0f00a74f-0fc0-448b-94db-5715aa58a74f","Type":"ContainerDied","Data":"ab118bd4ba3f72842a2c7299e70a4c4adedd239851df247e9ae13d32156e4267"} Mar 13 09:44:02 crc kubenswrapper[4841]: I0313 09:44:02.768257 4841 generic.go:334] "Generic (PLEG): container finished" podID="0f00a74f-0fc0-448b-94db-5715aa58a74f" containerID="ab118bd4ba3f72842a2c7299e70a4c4adedd239851df247e9ae13d32156e4267" exitCode=0 Mar 13 09:44:03 crc kubenswrapper[4841]: I0313 09:44:03.394136 4841 scope.go:117] "RemoveContainer" containerID="1964ccdabfc15bc0c412899a3f0358104247c79e05260c18dba1ff0a5ebf5aed" Mar 13 09:44:03 crc kubenswrapper[4841]: I0313 09:44:03.430381 4841 scope.go:117] "RemoveContainer" containerID="a915ba9cf1f9f73c7697a1203419f37d5428a480b96c6b754065724dc4c5e0d7" Mar 13 09:44:03 crc kubenswrapper[4841]: I0313 09:44:03.476533 4841 scope.go:117] "RemoveContainer" containerID="6f4e65adb70377c7c6401dfcaf04b226125f961b030ced1ca13c1b600a93f210" Mar 13 09:44:03 crc kubenswrapper[4841]: I0313 09:44:03.522823 4841 scope.go:117] "RemoveContainer" containerID="5eda587d3f4512efa865576c43920f269a3499d1015651042bb6fbd1ac0e9406" Mar 13 09:44:03 crc kubenswrapper[4841]: I0313 09:44:03.583423 4841 scope.go:117] "RemoveContainer" containerID="35f357a42ea488cde71740138e6febbe553c221a7357e73b3001c8c27032231e" Mar 13 09:44:03 crc kubenswrapper[4841]: I0313 09:44:03.641039 4841 scope.go:117] "RemoveContainer" containerID="d7633df3a24c174a8b7ec366df6fadbaeddd2340b7c92b61d63d601e06900a9b" Mar 13 09:44:03 crc kubenswrapper[4841]: I0313 09:44:03.675257 4841 scope.go:117] "RemoveContainer" containerID="eed7858a246be159248e8da86ec28e4493239c346e54d6ef68c4c5f664fc5c27" Mar 13 09:44:03 crc kubenswrapper[4841]: 
I0313 09:44:03.699151 4841 scope.go:117] "RemoveContainer" containerID="9a31840c42b7a0d49723495cb2ed939b71f3ba083a5e80d407235eae8c3edd48" Mar 13 09:44:04 crc kubenswrapper[4841]: I0313 09:44:04.009348 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556584-bs22k" Mar 13 09:44:04 crc kubenswrapper[4841]: I0313 09:44:04.011226 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9f4bb2d-0844-4e66-8bc3-623168e07b9d" path="/var/lib/kubelet/pods/e9f4bb2d-0844-4e66-8bc3-623168e07b9d/volumes" Mar 13 09:44:04 crc kubenswrapper[4841]: I0313 09:44:04.160749 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dzbp\" (UniqueName: \"kubernetes.io/projected/0f00a74f-0fc0-448b-94db-5715aa58a74f-kube-api-access-6dzbp\") pod \"0f00a74f-0fc0-448b-94db-5715aa58a74f\" (UID: \"0f00a74f-0fc0-448b-94db-5715aa58a74f\") " Mar 13 09:44:04 crc kubenswrapper[4841]: I0313 09:44:04.168434 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f00a74f-0fc0-448b-94db-5715aa58a74f-kube-api-access-6dzbp" (OuterVolumeSpecName: "kube-api-access-6dzbp") pod "0f00a74f-0fc0-448b-94db-5715aa58a74f" (UID: "0f00a74f-0fc0-448b-94db-5715aa58a74f"). InnerVolumeSpecName "kube-api-access-6dzbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:44:04 crc kubenswrapper[4841]: I0313 09:44:04.265639 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dzbp\" (UniqueName: \"kubernetes.io/projected/0f00a74f-0fc0-448b-94db-5715aa58a74f-kube-api-access-6dzbp\") on node \"crc\" DevicePath \"\"" Mar 13 09:44:04 crc kubenswrapper[4841]: I0313 09:44:04.807682 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556584-bs22k" event={"ID":"0f00a74f-0fc0-448b-94db-5715aa58a74f","Type":"ContainerDied","Data":"136c97487431b2c0298a56a8685a6174d8e1d8d28ab8b3afe23d6149bcd4d8df"} Mar 13 09:44:04 crc kubenswrapper[4841]: I0313 09:44:04.807746 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="136c97487431b2c0298a56a8685a6174d8e1d8d28ab8b3afe23d6149bcd4d8df" Mar 13 09:44:04 crc kubenswrapper[4841]: I0313 09:44:04.807770 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556584-bs22k" Mar 13 09:44:05 crc kubenswrapper[4841]: I0313 09:44:05.107961 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556578-bwlnc"] Mar 13 09:44:05 crc kubenswrapper[4841]: I0313 09:44:05.123361 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556578-bwlnc"] Mar 13 09:44:06 crc kubenswrapper[4841]: I0313 09:44:06.015841 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6acf170a-aaee-446b-be41-ababb7b80365" path="/var/lib/kubelet/pods/6acf170a-aaee-446b-be41-ababb7b80365/volumes" Mar 13 09:44:34 crc kubenswrapper[4841]: I0313 09:44:34.124000 4841 generic.go:334] "Generic (PLEG): container finished" podID="f3c90f3c-6382-4a13-b4cd-515cfe68538e" containerID="9d74581213b82af9df923a07e95c3fc810ef413ca04f52c0e725f4b96b5e6362" exitCode=0 Mar 13 09:44:34 crc kubenswrapper[4841]: I0313 09:44:34.124075 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p" event={"ID":"f3c90f3c-6382-4a13-b4cd-515cfe68538e","Type":"ContainerDied","Data":"9d74581213b82af9df923a07e95c3fc810ef413ca04f52c0e725f4b96b5e6362"} Mar 13 09:44:35 crc kubenswrapper[4841]: I0313 09:44:35.635197 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p" Mar 13 09:44:35 crc kubenswrapper[4841]: I0313 09:44:35.735098 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3c90f3c-6382-4a13-b4cd-515cfe68538e-inventory\") pod \"f3c90f3c-6382-4a13-b4cd-515cfe68538e\" (UID: \"f3c90f3c-6382-4a13-b4cd-515cfe68538e\") " Mar 13 09:44:35 crc kubenswrapper[4841]: I0313 09:44:35.735171 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgmhl\" (UniqueName: \"kubernetes.io/projected/f3c90f3c-6382-4a13-b4cd-515cfe68538e-kube-api-access-rgmhl\") pod \"f3c90f3c-6382-4a13-b4cd-515cfe68538e\" (UID: \"f3c90f3c-6382-4a13-b4cd-515cfe68538e\") " Mar 13 09:44:35 crc kubenswrapper[4841]: I0313 09:44:35.735204 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3c90f3c-6382-4a13-b4cd-515cfe68538e-ssh-key-openstack-edpm-ipam\") pod \"f3c90f3c-6382-4a13-b4cd-515cfe68538e\" (UID: \"f3c90f3c-6382-4a13-b4cd-515cfe68538e\") " Mar 13 09:44:35 crc kubenswrapper[4841]: I0313 09:44:35.742537 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3c90f3c-6382-4a13-b4cd-515cfe68538e-kube-api-access-rgmhl" (OuterVolumeSpecName: "kube-api-access-rgmhl") pod "f3c90f3c-6382-4a13-b4cd-515cfe68538e" (UID: "f3c90f3c-6382-4a13-b4cd-515cfe68538e"). InnerVolumeSpecName "kube-api-access-rgmhl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:44:35 crc kubenswrapper[4841]: I0313 09:44:35.771161 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3c90f3c-6382-4a13-b4cd-515cfe68538e-inventory" (OuterVolumeSpecName: "inventory") pod "f3c90f3c-6382-4a13-b4cd-515cfe68538e" (UID: "f3c90f3c-6382-4a13-b4cd-515cfe68538e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:44:35 crc kubenswrapper[4841]: I0313 09:44:35.775832 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3c90f3c-6382-4a13-b4cd-515cfe68538e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f3c90f3c-6382-4a13-b4cd-515cfe68538e" (UID: "f3c90f3c-6382-4a13-b4cd-515cfe68538e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:44:35 crc kubenswrapper[4841]: I0313 09:44:35.839424 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3c90f3c-6382-4a13-b4cd-515cfe68538e-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 09:44:35 crc kubenswrapper[4841]: I0313 09:44:35.839475 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgmhl\" (UniqueName: \"kubernetes.io/projected/f3c90f3c-6382-4a13-b4cd-515cfe68538e-kube-api-access-rgmhl\") on node \"crc\" DevicePath \"\"" Mar 13 09:44:35 crc kubenswrapper[4841]: I0313 09:44:35.839494 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f3c90f3c-6382-4a13-b4cd-515cfe68538e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.147921 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p" 
event={"ID":"f3c90f3c-6382-4a13-b4cd-515cfe68538e","Type":"ContainerDied","Data":"a5b9e9081c1bcfdf772ad41869a2f1797da357aa6c54fb1bf24bd7702e866acb"} Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.147973 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5b9e9081c1bcfdf772ad41869a2f1797da357aa6c54fb1bf24bd7702e866acb" Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.148042 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p" Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.237627 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9wxsh"] Mar 13 09:44:36 crc kubenswrapper[4841]: E0313 09:44:36.238139 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c90f3c-6382-4a13-b4cd-515cfe68538e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.238165 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c90f3c-6382-4a13-b4cd-515cfe68538e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 13 09:44:36 crc kubenswrapper[4841]: E0313 09:44:36.238202 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f00a74f-0fc0-448b-94db-5715aa58a74f" containerName="oc" Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.238213 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f00a74f-0fc0-448b-94db-5715aa58a74f" containerName="oc" Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.238514 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3c90f3c-6382-4a13-b4cd-515cfe68538e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.238537 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f00a74f-0fc0-448b-94db-5715aa58a74f" 
containerName="oc" Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.239437 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9wxsh" Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.242603 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.243491 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.243660 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.248137 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rv6jt" Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.263161 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9wxsh"] Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.348488 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb824d89-fddc-4746-8b53-a0f3d5e42082-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9wxsh\" (UID: \"eb824d89-fddc-4746-8b53-a0f3d5e42082\") " pod="openstack/ssh-known-hosts-edpm-deployment-9wxsh" Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.348534 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/eb824d89-fddc-4746-8b53-a0f3d5e42082-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9wxsh\" (UID: \"eb824d89-fddc-4746-8b53-a0f3d5e42082\") " pod="openstack/ssh-known-hosts-edpm-deployment-9wxsh" Mar 13 09:44:36 crc 
kubenswrapper[4841]: I0313 09:44:36.348621 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrcrc\" (UniqueName: \"kubernetes.io/projected/eb824d89-fddc-4746-8b53-a0f3d5e42082-kube-api-access-xrcrc\") pod \"ssh-known-hosts-edpm-deployment-9wxsh\" (UID: \"eb824d89-fddc-4746-8b53-a0f3d5e42082\") " pod="openstack/ssh-known-hosts-edpm-deployment-9wxsh" Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.450494 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb824d89-fddc-4746-8b53-a0f3d5e42082-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9wxsh\" (UID: \"eb824d89-fddc-4746-8b53-a0f3d5e42082\") " pod="openstack/ssh-known-hosts-edpm-deployment-9wxsh" Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.450565 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/eb824d89-fddc-4746-8b53-a0f3d5e42082-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9wxsh\" (UID: \"eb824d89-fddc-4746-8b53-a0f3d5e42082\") " pod="openstack/ssh-known-hosts-edpm-deployment-9wxsh" Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.450660 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrcrc\" (UniqueName: \"kubernetes.io/projected/eb824d89-fddc-4746-8b53-a0f3d5e42082-kube-api-access-xrcrc\") pod \"ssh-known-hosts-edpm-deployment-9wxsh\" (UID: \"eb824d89-fddc-4746-8b53-a0f3d5e42082\") " pod="openstack/ssh-known-hosts-edpm-deployment-9wxsh" Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.456251 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb824d89-fddc-4746-8b53-a0f3d5e42082-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9wxsh\" (UID: 
\"eb824d89-fddc-4746-8b53-a0f3d5e42082\") " pod="openstack/ssh-known-hosts-edpm-deployment-9wxsh" Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.459027 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/eb824d89-fddc-4746-8b53-a0f3d5e42082-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9wxsh\" (UID: \"eb824d89-fddc-4746-8b53-a0f3d5e42082\") " pod="openstack/ssh-known-hosts-edpm-deployment-9wxsh" Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.479640 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrcrc\" (UniqueName: \"kubernetes.io/projected/eb824d89-fddc-4746-8b53-a0f3d5e42082-kube-api-access-xrcrc\") pod \"ssh-known-hosts-edpm-deployment-9wxsh\" (UID: \"eb824d89-fddc-4746-8b53-a0f3d5e42082\") " pod="openstack/ssh-known-hosts-edpm-deployment-9wxsh" Mar 13 09:44:36 crc kubenswrapper[4841]: I0313 09:44:36.567478 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9wxsh" Mar 13 09:44:37 crc kubenswrapper[4841]: I0313 09:44:37.116496 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9wxsh"] Mar 13 09:44:37 crc kubenswrapper[4841]: I0313 09:44:37.158197 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9wxsh" event={"ID":"eb824d89-fddc-4746-8b53-a0f3d5e42082","Type":"ContainerStarted","Data":"5d638fc2c9e3b746602f53c1f9096f43fa15e2c7cccf4849a29da25ec63ecfd5"} Mar 13 09:44:38 crc kubenswrapper[4841]: I0313 09:44:38.202947 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9wxsh" event={"ID":"eb824d89-fddc-4746-8b53-a0f3d5e42082","Type":"ContainerStarted","Data":"3178604d6dd49c2e4ab48cd5a15d07251b88dca5bbdbf1d9b37bbe70d85fc691"} Mar 13 09:44:38 crc kubenswrapper[4841]: I0313 09:44:38.233824 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-9wxsh" podStartSLOduration=1.780498531 podStartE2EDuration="2.233799397s" podCreationTimestamp="2026-03-13 09:44:36 +0000 UTC" firstStartedPulling="2026-03-13 09:44:37.133101031 +0000 UTC m=+1959.863001212" lastFinishedPulling="2026-03-13 09:44:37.586401887 +0000 UTC m=+1960.316302078" observedRunningTime="2026-03-13 09:44:38.221702415 +0000 UTC m=+1960.951602626" watchObservedRunningTime="2026-03-13 09:44:38.233799397 +0000 UTC m=+1960.963699578" Mar 13 09:44:44 crc kubenswrapper[4841]: I0313 09:44:44.282941 4841 generic.go:334] "Generic (PLEG): container finished" podID="eb824d89-fddc-4746-8b53-a0f3d5e42082" containerID="3178604d6dd49c2e4ab48cd5a15d07251b88dca5bbdbf1d9b37bbe70d85fc691" exitCode=0 Mar 13 09:44:44 crc kubenswrapper[4841]: I0313 09:44:44.283039 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9wxsh" 
event={"ID":"eb824d89-fddc-4746-8b53-a0f3d5e42082","Type":"ContainerDied","Data":"3178604d6dd49c2e4ab48cd5a15d07251b88dca5bbdbf1d9b37bbe70d85fc691"} Mar 13 09:44:45 crc kubenswrapper[4841]: I0313 09:44:45.760113 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9wxsh" Mar 13 09:44:45 crc kubenswrapper[4841]: I0313 09:44:45.849398 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb824d89-fddc-4746-8b53-a0f3d5e42082-ssh-key-openstack-edpm-ipam\") pod \"eb824d89-fddc-4746-8b53-a0f3d5e42082\" (UID: \"eb824d89-fddc-4746-8b53-a0f3d5e42082\") " Mar 13 09:44:45 crc kubenswrapper[4841]: I0313 09:44:45.849521 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrcrc\" (UniqueName: \"kubernetes.io/projected/eb824d89-fddc-4746-8b53-a0f3d5e42082-kube-api-access-xrcrc\") pod \"eb824d89-fddc-4746-8b53-a0f3d5e42082\" (UID: \"eb824d89-fddc-4746-8b53-a0f3d5e42082\") " Mar 13 09:44:45 crc kubenswrapper[4841]: I0313 09:44:45.849637 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/eb824d89-fddc-4746-8b53-a0f3d5e42082-inventory-0\") pod \"eb824d89-fddc-4746-8b53-a0f3d5e42082\" (UID: \"eb824d89-fddc-4746-8b53-a0f3d5e42082\") " Mar 13 09:44:45 crc kubenswrapper[4841]: I0313 09:44:45.871336 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb824d89-fddc-4746-8b53-a0f3d5e42082-kube-api-access-xrcrc" (OuterVolumeSpecName: "kube-api-access-xrcrc") pod "eb824d89-fddc-4746-8b53-a0f3d5e42082" (UID: "eb824d89-fddc-4746-8b53-a0f3d5e42082"). InnerVolumeSpecName "kube-api-access-xrcrc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:44:45 crc kubenswrapper[4841]: I0313 09:44:45.886717 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb824d89-fddc-4746-8b53-a0f3d5e42082-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "eb824d89-fddc-4746-8b53-a0f3d5e42082" (UID: "eb824d89-fddc-4746-8b53-a0f3d5e42082"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:44:45 crc kubenswrapper[4841]: I0313 09:44:45.886804 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb824d89-fddc-4746-8b53-a0f3d5e42082-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "eb824d89-fddc-4746-8b53-a0f3d5e42082" (UID: "eb824d89-fddc-4746-8b53-a0f3d5e42082"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:44:45 crc kubenswrapper[4841]: I0313 09:44:45.952747 4841 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/eb824d89-fddc-4746-8b53-a0f3d5e42082-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:44:45 crc kubenswrapper[4841]: I0313 09:44:45.952795 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb824d89-fddc-4746-8b53-a0f3d5e42082-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 09:44:45 crc kubenswrapper[4841]: I0313 09:44:45.952818 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrcrc\" (UniqueName: \"kubernetes.io/projected/eb824d89-fddc-4746-8b53-a0f3d5e42082-kube-api-access-xrcrc\") on node \"crc\" DevicePath \"\"" Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.042457 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-5rvmg"] Mar 13 09:44:46 crc kubenswrapper[4841]: 
I0313 09:44:46.051859 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-5rvmg"] Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.306415 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9wxsh" event={"ID":"eb824d89-fddc-4746-8b53-a0f3d5e42082","Type":"ContainerDied","Data":"5d638fc2c9e3b746602f53c1f9096f43fa15e2c7cccf4849a29da25ec63ecfd5"} Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.306703 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d638fc2c9e3b746602f53c1f9096f43fa15e2c7cccf4849a29da25ec63ecfd5" Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.306485 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9wxsh" Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.390833 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-c9rjj"] Mar 13 09:44:46 crc kubenswrapper[4841]: E0313 09:44:46.391590 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb824d89-fddc-4746-8b53-a0f3d5e42082" containerName="ssh-known-hosts-edpm-deployment" Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.391678 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb824d89-fddc-4746-8b53-a0f3d5e42082" containerName="ssh-known-hosts-edpm-deployment" Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.391957 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb824d89-fddc-4746-8b53-a0f3d5e42082" containerName="ssh-known-hosts-edpm-deployment" Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.392834 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c9rjj" Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.395159 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.395312 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.395158 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rv6jt" Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.396168 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.400560 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-c9rjj"] Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.462685 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95faf019-c6d4-4016-87ac-66c7762e56c4-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c9rjj\" (UID: \"95faf019-c6d4-4016-87ac-66c7762e56c4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c9rjj" Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.462766 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95faf019-c6d4-4016-87ac-66c7762e56c4-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c9rjj\" (UID: \"95faf019-c6d4-4016-87ac-66c7762e56c4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c9rjj" Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.462935 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59dnn\" (UniqueName: \"kubernetes.io/projected/95faf019-c6d4-4016-87ac-66c7762e56c4-kube-api-access-59dnn\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c9rjj\" (UID: \"95faf019-c6d4-4016-87ac-66c7762e56c4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c9rjj" Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.564973 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59dnn\" (UniqueName: \"kubernetes.io/projected/95faf019-c6d4-4016-87ac-66c7762e56c4-kube-api-access-59dnn\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c9rjj\" (UID: \"95faf019-c6d4-4016-87ac-66c7762e56c4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c9rjj" Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.565089 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95faf019-c6d4-4016-87ac-66c7762e56c4-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c9rjj\" (UID: \"95faf019-c6d4-4016-87ac-66c7762e56c4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c9rjj" Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.565168 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95faf019-c6d4-4016-87ac-66c7762e56c4-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c9rjj\" (UID: \"95faf019-c6d4-4016-87ac-66c7762e56c4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c9rjj" Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.569522 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95faf019-c6d4-4016-87ac-66c7762e56c4-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-c9rjj\" (UID: \"95faf019-c6d4-4016-87ac-66c7762e56c4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c9rjj" Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.569686 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95faf019-c6d4-4016-87ac-66c7762e56c4-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c9rjj\" (UID: \"95faf019-c6d4-4016-87ac-66c7762e56c4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c9rjj" Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.580536 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59dnn\" (UniqueName: \"kubernetes.io/projected/95faf019-c6d4-4016-87ac-66c7762e56c4-kube-api-access-59dnn\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c9rjj\" (UID: \"95faf019-c6d4-4016-87ac-66c7762e56c4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c9rjj" Mar 13 09:44:46 crc kubenswrapper[4841]: I0313 09:44:46.710767 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c9rjj" Mar 13 09:44:47 crc kubenswrapper[4841]: I0313 09:44:47.277651 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-c9rjj"] Mar 13 09:44:47 crc kubenswrapper[4841]: I0313 09:44:47.314815 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c9rjj" event={"ID":"95faf019-c6d4-4016-87ac-66c7762e56c4","Type":"ContainerStarted","Data":"1e1b08f9687815096e16b74b2e7c2f4d3a78907b3566a592524da2ba3797b5a8"} Mar 13 09:44:48 crc kubenswrapper[4841]: I0313 09:44:48.011031 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf39f2a9-0603-43b5-b8e0-8ed87e304c05" path="/var/lib/kubelet/pods/cf39f2a9-0603-43b5-b8e0-8ed87e304c05/volumes" Mar 13 09:44:48 crc kubenswrapper[4841]: I0313 09:44:48.325036 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c9rjj" event={"ID":"95faf019-c6d4-4016-87ac-66c7762e56c4","Type":"ContainerStarted","Data":"b9f19b24ab2dfc080061a30d0d821ec21274c6994d59577266f3c0f67aed39a5"} Mar 13 09:44:48 crc kubenswrapper[4841]: I0313 09:44:48.346293 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c9rjj" podStartSLOduration=1.910315123 podStartE2EDuration="2.346275787s" podCreationTimestamp="2026-03-13 09:44:46 +0000 UTC" firstStartedPulling="2026-03-13 09:44:47.286300041 +0000 UTC m=+1970.016200232" lastFinishedPulling="2026-03-13 09:44:47.722260665 +0000 UTC m=+1970.452160896" observedRunningTime="2026-03-13 09:44:48.341991385 +0000 UTC m=+1971.071891576" watchObservedRunningTime="2026-03-13 09:44:48.346275787 +0000 UTC m=+1971.076175978" Mar 13 09:44:55 crc kubenswrapper[4841]: I0313 09:44:55.400488 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="95faf019-c6d4-4016-87ac-66c7762e56c4" containerID="b9f19b24ab2dfc080061a30d0d821ec21274c6994d59577266f3c0f67aed39a5" exitCode=0 Mar 13 09:44:55 crc kubenswrapper[4841]: I0313 09:44:55.400594 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c9rjj" event={"ID":"95faf019-c6d4-4016-87ac-66c7762e56c4","Type":"ContainerDied","Data":"b9f19b24ab2dfc080061a30d0d821ec21274c6994d59577266f3c0f67aed39a5"} Mar 13 09:44:56 crc kubenswrapper[4841]: I0313 09:44:56.875206 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c9rjj" Mar 13 09:44:56 crc kubenswrapper[4841]: I0313 09:44:56.993873 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59dnn\" (UniqueName: \"kubernetes.io/projected/95faf019-c6d4-4016-87ac-66c7762e56c4-kube-api-access-59dnn\") pod \"95faf019-c6d4-4016-87ac-66c7762e56c4\" (UID: \"95faf019-c6d4-4016-87ac-66c7762e56c4\") " Mar 13 09:44:56 crc kubenswrapper[4841]: I0313 09:44:56.993941 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95faf019-c6d4-4016-87ac-66c7762e56c4-ssh-key-openstack-edpm-ipam\") pod \"95faf019-c6d4-4016-87ac-66c7762e56c4\" (UID: \"95faf019-c6d4-4016-87ac-66c7762e56c4\") " Mar 13 09:44:56 crc kubenswrapper[4841]: I0313 09:44:56.994074 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95faf019-c6d4-4016-87ac-66c7762e56c4-inventory\") pod \"95faf019-c6d4-4016-87ac-66c7762e56c4\" (UID: \"95faf019-c6d4-4016-87ac-66c7762e56c4\") " Mar 13 09:44:56 crc kubenswrapper[4841]: I0313 09:44:56.999649 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95faf019-c6d4-4016-87ac-66c7762e56c4-kube-api-access-59dnn" 
(OuterVolumeSpecName: "kube-api-access-59dnn") pod "95faf019-c6d4-4016-87ac-66c7762e56c4" (UID: "95faf019-c6d4-4016-87ac-66c7762e56c4"). InnerVolumeSpecName "kube-api-access-59dnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.021720 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95faf019-c6d4-4016-87ac-66c7762e56c4-inventory" (OuterVolumeSpecName: "inventory") pod "95faf019-c6d4-4016-87ac-66c7762e56c4" (UID: "95faf019-c6d4-4016-87ac-66c7762e56c4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.030885 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95faf019-c6d4-4016-87ac-66c7762e56c4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "95faf019-c6d4-4016-87ac-66c7762e56c4" (UID: "95faf019-c6d4-4016-87ac-66c7762e56c4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.098029 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95faf019-c6d4-4016-87ac-66c7762e56c4-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.098065 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95faf019-c6d4-4016-87ac-66c7762e56c4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.098077 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59dnn\" (UniqueName: \"kubernetes.io/projected/95faf019-c6d4-4016-87ac-66c7762e56c4-kube-api-access-59dnn\") on node \"crc\" DevicePath \"\"" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.422639 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c9rjj" event={"ID":"95faf019-c6d4-4016-87ac-66c7762e56c4","Type":"ContainerDied","Data":"1e1b08f9687815096e16b74b2e7c2f4d3a78907b3566a592524da2ba3797b5a8"} Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.422681 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e1b08f9687815096e16b74b2e7c2f4d3a78907b3566a592524da2ba3797b5a8" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.422741 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c9rjj" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.499824 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn"] Mar 13 09:44:57 crc kubenswrapper[4841]: E0313 09:44:57.500319 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95faf019-c6d4-4016-87ac-66c7762e56c4" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.500340 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="95faf019-c6d4-4016-87ac-66c7762e56c4" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.500597 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="95faf019-c6d4-4016-87ac-66c7762e56c4" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.501312 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.505398 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rv6jt" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.505583 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.505668 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.510693 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.519895 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn"] Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.608118 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23e2fc94-fce0-4eeb-8a78-15e934c02371-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn\" (UID: \"23e2fc94-fce0-4eeb-8a78-15e934c02371\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.608257 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s6td\" (UniqueName: \"kubernetes.io/projected/23e2fc94-fce0-4eeb-8a78-15e934c02371-kube-api-access-9s6td\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn\" (UID: \"23e2fc94-fce0-4eeb-8a78-15e934c02371\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.608346 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23e2fc94-fce0-4eeb-8a78-15e934c02371-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn\" (UID: \"23e2fc94-fce0-4eeb-8a78-15e934c02371\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.710801 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s6td\" (UniqueName: \"kubernetes.io/projected/23e2fc94-fce0-4eeb-8a78-15e934c02371-kube-api-access-9s6td\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn\" (UID: \"23e2fc94-fce0-4eeb-8a78-15e934c02371\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.710895 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23e2fc94-fce0-4eeb-8a78-15e934c02371-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn\" (UID: \"23e2fc94-fce0-4eeb-8a78-15e934c02371\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.711019 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23e2fc94-fce0-4eeb-8a78-15e934c02371-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn\" (UID: \"23e2fc94-fce0-4eeb-8a78-15e934c02371\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.716046 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23e2fc94-fce0-4eeb-8a78-15e934c02371-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn\" (UID: \"23e2fc94-fce0-4eeb-8a78-15e934c02371\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.716648 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23e2fc94-fce0-4eeb-8a78-15e934c02371-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn\" (UID: \"23e2fc94-fce0-4eeb-8a78-15e934c02371\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.731068 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s6td\" (UniqueName: \"kubernetes.io/projected/23e2fc94-fce0-4eeb-8a78-15e934c02371-kube-api-access-9s6td\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn\" (UID: \"23e2fc94-fce0-4eeb-8a78-15e934c02371\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn" Mar 13 09:44:57 crc kubenswrapper[4841]: I0313 09:44:57.871226 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn" Mar 13 09:44:58 crc kubenswrapper[4841]: I0313 09:44:58.363755 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn"] Mar 13 09:44:58 crc kubenswrapper[4841]: I0313 09:44:58.430947 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn" event={"ID":"23e2fc94-fce0-4eeb-8a78-15e934c02371","Type":"ContainerStarted","Data":"b657d93aedbab242c872495aedb87ddd8b4dd61ea5f88059def4fb00ad108a9f"} Mar 13 09:44:58 crc kubenswrapper[4841]: I0313 09:44:58.825740 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 09:44:59 crc kubenswrapper[4841]: I0313 09:44:59.441721 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn" event={"ID":"23e2fc94-fce0-4eeb-8a78-15e934c02371","Type":"ContainerStarted","Data":"37fe01a534b341da4b404f1cbf98f87911e4a27b043fee8e524e21b4e1f01eec"} Mar 13 09:44:59 crc kubenswrapper[4841]: I0313 09:44:59.466675 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn" podStartSLOduration=2.009468851 podStartE2EDuration="2.466657605s" podCreationTimestamp="2026-03-13 09:44:57 +0000 UTC" firstStartedPulling="2026-03-13 09:44:58.36595495 +0000 UTC m=+1981.095855141" lastFinishedPulling="2026-03-13 09:44:58.823143704 +0000 UTC m=+1981.553043895" observedRunningTime="2026-03-13 09:44:59.462769415 +0000 UTC m=+1982.192669636" watchObservedRunningTime="2026-03-13 09:44:59.466657605 +0000 UTC m=+1982.196557806" Mar 13 09:45:00 crc kubenswrapper[4841]: I0313 09:45:00.141942 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th"] Mar 13 09:45:00 crc 
kubenswrapper[4841]: I0313 09:45:00.143591 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th" Mar 13 09:45:00 crc kubenswrapper[4841]: I0313 09:45:00.146460 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 09:45:00 crc kubenswrapper[4841]: I0313 09:45:00.146485 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 09:45:00 crc kubenswrapper[4841]: I0313 09:45:00.158091 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th"] Mar 13 09:45:00 crc kubenswrapper[4841]: I0313 09:45:00.281902 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br2c9\" (UniqueName: \"kubernetes.io/projected/68684695-4cc8-4cf4-8c9c-ef7502600c1e-kube-api-access-br2c9\") pod \"collect-profiles-29556585-b64th\" (UID: \"68684695-4cc8-4cf4-8c9c-ef7502600c1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th" Mar 13 09:45:00 crc kubenswrapper[4841]: I0313 09:45:00.281981 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68684695-4cc8-4cf4-8c9c-ef7502600c1e-secret-volume\") pod \"collect-profiles-29556585-b64th\" (UID: \"68684695-4cc8-4cf4-8c9c-ef7502600c1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th" Mar 13 09:45:00 crc kubenswrapper[4841]: I0313 09:45:00.282070 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68684695-4cc8-4cf4-8c9c-ef7502600c1e-config-volume\") pod \"collect-profiles-29556585-b64th\" 
(UID: \"68684695-4cc8-4cf4-8c9c-ef7502600c1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th" Mar 13 09:45:00 crc kubenswrapper[4841]: I0313 09:45:00.383367 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br2c9\" (UniqueName: \"kubernetes.io/projected/68684695-4cc8-4cf4-8c9c-ef7502600c1e-kube-api-access-br2c9\") pod \"collect-profiles-29556585-b64th\" (UID: \"68684695-4cc8-4cf4-8c9c-ef7502600c1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th" Mar 13 09:45:00 crc kubenswrapper[4841]: I0313 09:45:00.383795 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68684695-4cc8-4cf4-8c9c-ef7502600c1e-secret-volume\") pod \"collect-profiles-29556585-b64th\" (UID: \"68684695-4cc8-4cf4-8c9c-ef7502600c1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th" Mar 13 09:45:00 crc kubenswrapper[4841]: I0313 09:45:00.383891 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68684695-4cc8-4cf4-8c9c-ef7502600c1e-config-volume\") pod \"collect-profiles-29556585-b64th\" (UID: \"68684695-4cc8-4cf4-8c9c-ef7502600c1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th" Mar 13 09:45:00 crc kubenswrapper[4841]: I0313 09:45:00.384949 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68684695-4cc8-4cf4-8c9c-ef7502600c1e-config-volume\") pod \"collect-profiles-29556585-b64th\" (UID: \"68684695-4cc8-4cf4-8c9c-ef7502600c1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th" Mar 13 09:45:00 crc kubenswrapper[4841]: I0313 09:45:00.389658 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/68684695-4cc8-4cf4-8c9c-ef7502600c1e-secret-volume\") pod \"collect-profiles-29556585-b64th\" (UID: \"68684695-4cc8-4cf4-8c9c-ef7502600c1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th" Mar 13 09:45:00 crc kubenswrapper[4841]: I0313 09:45:00.409618 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br2c9\" (UniqueName: \"kubernetes.io/projected/68684695-4cc8-4cf4-8c9c-ef7502600c1e-kube-api-access-br2c9\") pod \"collect-profiles-29556585-b64th\" (UID: \"68684695-4cc8-4cf4-8c9c-ef7502600c1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th" Mar 13 09:45:00 crc kubenswrapper[4841]: I0313 09:45:00.470605 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th" Mar 13 09:45:00 crc kubenswrapper[4841]: I0313 09:45:00.919729 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th"] Mar 13 09:45:01 crc kubenswrapper[4841]: I0313 09:45:01.461146 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th" event={"ID":"68684695-4cc8-4cf4-8c9c-ef7502600c1e","Type":"ContainerStarted","Data":"3a791bc73d694965e9885725672be0c75b9baaf7a7e3487ca1bb13ae33508f68"} Mar 13 09:45:01 crc kubenswrapper[4841]: I0313 09:45:01.461511 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th" event={"ID":"68684695-4cc8-4cf4-8c9c-ef7502600c1e","Type":"ContainerStarted","Data":"4655e2022b1b668c767c86cf482f05063b90ab2fe700651a888a02676308eb6d"} Mar 13 09:45:01 crc kubenswrapper[4841]: I0313 09:45:01.486448 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th" 
podStartSLOduration=1.486403962 podStartE2EDuration="1.486403962s" podCreationTimestamp="2026-03-13 09:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:45:01.477917581 +0000 UTC m=+1984.207817802" watchObservedRunningTime="2026-03-13 09:45:01.486403962 +0000 UTC m=+1984.216304163" Mar 13 09:45:02 crc kubenswrapper[4841]: I0313 09:45:02.473372 4841 generic.go:334] "Generic (PLEG): container finished" podID="68684695-4cc8-4cf4-8c9c-ef7502600c1e" containerID="3a791bc73d694965e9885725672be0c75b9baaf7a7e3487ca1bb13ae33508f68" exitCode=0 Mar 13 09:45:02 crc kubenswrapper[4841]: I0313 09:45:02.473424 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th" event={"ID":"68684695-4cc8-4cf4-8c9c-ef7502600c1e","Type":"ContainerDied","Data":"3a791bc73d694965e9885725672be0c75b9baaf7a7e3487ca1bb13ae33508f68"} Mar 13 09:45:03 crc kubenswrapper[4841]: I0313 09:45:03.827884 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th" Mar 13 09:45:03 crc kubenswrapper[4841]: I0313 09:45:03.899533 4841 scope.go:117] "RemoveContainer" containerID="9b41b1fcadfa46b35e1f77a3f8d704fafeefa20d6ade44ee2ada70391a791ab7" Mar 13 09:45:03 crc kubenswrapper[4841]: I0313 09:45:03.941150 4841 scope.go:117] "RemoveContainer" containerID="fa082b44c04a30b03f11db32a7fe268cae8b649f8398efae03159a38e74fbbc2" Mar 13 09:45:03 crc kubenswrapper[4841]: I0313 09:45:03.956184 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68684695-4cc8-4cf4-8c9c-ef7502600c1e-config-volume\") pod \"68684695-4cc8-4cf4-8c9c-ef7502600c1e\" (UID: \"68684695-4cc8-4cf4-8c9c-ef7502600c1e\") " Mar 13 09:45:03 crc kubenswrapper[4841]: I0313 09:45:03.956414 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68684695-4cc8-4cf4-8c9c-ef7502600c1e-secret-volume\") pod \"68684695-4cc8-4cf4-8c9c-ef7502600c1e\" (UID: \"68684695-4cc8-4cf4-8c9c-ef7502600c1e\") " Mar 13 09:45:03 crc kubenswrapper[4841]: I0313 09:45:03.956492 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br2c9\" (UniqueName: \"kubernetes.io/projected/68684695-4cc8-4cf4-8c9c-ef7502600c1e-kube-api-access-br2c9\") pod \"68684695-4cc8-4cf4-8c9c-ef7502600c1e\" (UID: \"68684695-4cc8-4cf4-8c9c-ef7502600c1e\") " Mar 13 09:45:03 crc kubenswrapper[4841]: I0313 09:45:03.957079 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68684695-4cc8-4cf4-8c9c-ef7502600c1e-config-volume" (OuterVolumeSpecName: "config-volume") pod "68684695-4cc8-4cf4-8c9c-ef7502600c1e" (UID: "68684695-4cc8-4cf4-8c9c-ef7502600c1e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:45:03 crc kubenswrapper[4841]: I0313 09:45:03.958329 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68684695-4cc8-4cf4-8c9c-ef7502600c1e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 09:45:03 crc kubenswrapper[4841]: I0313 09:45:03.963172 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68684695-4cc8-4cf4-8c9c-ef7502600c1e-kube-api-access-br2c9" (OuterVolumeSpecName: "kube-api-access-br2c9") pod "68684695-4cc8-4cf4-8c9c-ef7502600c1e" (UID: "68684695-4cc8-4cf4-8c9c-ef7502600c1e"). InnerVolumeSpecName "kube-api-access-br2c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:45:03 crc kubenswrapper[4841]: I0313 09:45:03.963229 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68684695-4cc8-4cf4-8c9c-ef7502600c1e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "68684695-4cc8-4cf4-8c9c-ef7502600c1e" (UID: "68684695-4cc8-4cf4-8c9c-ef7502600c1e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:45:03 crc kubenswrapper[4841]: I0313 09:45:03.988686 4841 scope.go:117] "RemoveContainer" containerID="7147364091d2179cc7eb600f6bcff6ea373aaab950687515a0dce0a7af9d9277" Mar 13 09:45:04 crc kubenswrapper[4841]: I0313 09:45:04.060043 4841 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68684695-4cc8-4cf4-8c9c-ef7502600c1e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 09:45:04 crc kubenswrapper[4841]: I0313 09:45:04.060077 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br2c9\" (UniqueName: \"kubernetes.io/projected/68684695-4cc8-4cf4-8c9c-ef7502600c1e-kube-api-access-br2c9\") on node \"crc\" DevicePath \"\"" Mar 13 09:45:04 crc kubenswrapper[4841]: I0313 09:45:04.503775 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th" event={"ID":"68684695-4cc8-4cf4-8c9c-ef7502600c1e","Type":"ContainerDied","Data":"4655e2022b1b668c767c86cf482f05063b90ab2fe700651a888a02676308eb6d"} Mar 13 09:45:04 crc kubenswrapper[4841]: I0313 09:45:04.504141 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4655e2022b1b668c767c86cf482f05063b90ab2fe700651a888a02676308eb6d" Mar 13 09:45:04 crc kubenswrapper[4841]: I0313 09:45:04.503834 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th" Mar 13 09:45:07 crc kubenswrapper[4841]: I0313 09:45:07.551686 4841 generic.go:334] "Generic (PLEG): container finished" podID="23e2fc94-fce0-4eeb-8a78-15e934c02371" containerID="37fe01a534b341da4b404f1cbf98f87911e4a27b043fee8e524e21b4e1f01eec" exitCode=0 Mar 13 09:45:07 crc kubenswrapper[4841]: I0313 09:45:07.551791 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn" event={"ID":"23e2fc94-fce0-4eeb-8a78-15e934c02371","Type":"ContainerDied","Data":"37fe01a534b341da4b404f1cbf98f87911e4a27b043fee8e524e21b4e1f01eec"} Mar 13 09:45:08 crc kubenswrapper[4841]: I0313 09:45:08.989921 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.173921 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s6td\" (UniqueName: \"kubernetes.io/projected/23e2fc94-fce0-4eeb-8a78-15e934c02371-kube-api-access-9s6td\") pod \"23e2fc94-fce0-4eeb-8a78-15e934c02371\" (UID: \"23e2fc94-fce0-4eeb-8a78-15e934c02371\") " Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.174080 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23e2fc94-fce0-4eeb-8a78-15e934c02371-ssh-key-openstack-edpm-ipam\") pod \"23e2fc94-fce0-4eeb-8a78-15e934c02371\" (UID: \"23e2fc94-fce0-4eeb-8a78-15e934c02371\") " Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.174246 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23e2fc94-fce0-4eeb-8a78-15e934c02371-inventory\") pod \"23e2fc94-fce0-4eeb-8a78-15e934c02371\" (UID: \"23e2fc94-fce0-4eeb-8a78-15e934c02371\") " Mar 13 
09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.180497 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23e2fc94-fce0-4eeb-8a78-15e934c02371-kube-api-access-9s6td" (OuterVolumeSpecName: "kube-api-access-9s6td") pod "23e2fc94-fce0-4eeb-8a78-15e934c02371" (UID: "23e2fc94-fce0-4eeb-8a78-15e934c02371"). InnerVolumeSpecName "kube-api-access-9s6td". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.200378 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e2fc94-fce0-4eeb-8a78-15e934c02371-inventory" (OuterVolumeSpecName: "inventory") pod "23e2fc94-fce0-4eeb-8a78-15e934c02371" (UID: "23e2fc94-fce0-4eeb-8a78-15e934c02371"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.206131 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e2fc94-fce0-4eeb-8a78-15e934c02371-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "23e2fc94-fce0-4eeb-8a78-15e934c02371" (UID: "23e2fc94-fce0-4eeb-8a78-15e934c02371"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.277471 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23e2fc94-fce0-4eeb-8a78-15e934c02371-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.277535 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23e2fc94-fce0-4eeb-8a78-15e934c02371-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.277554 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s6td\" (UniqueName: \"kubernetes.io/projected/23e2fc94-fce0-4eeb-8a78-15e934c02371-kube-api-access-9s6td\") on node \"crc\" DevicePath \"\"" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.569606 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn" event={"ID":"23e2fc94-fce0-4eeb-8a78-15e934c02371","Type":"ContainerDied","Data":"b657d93aedbab242c872495aedb87ddd8b4dd61ea5f88059def4fb00ad108a9f"} Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.569644 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b657d93aedbab242c872495aedb87ddd8b4dd61ea5f88059def4fb00ad108a9f" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.569655 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.672582 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv"] Mar 13 09:45:09 crc kubenswrapper[4841]: E0313 09:45:09.673005 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e2fc94-fce0-4eeb-8a78-15e934c02371" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.673027 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e2fc94-fce0-4eeb-8a78-15e934c02371" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 13 09:45:09 crc kubenswrapper[4841]: E0313 09:45:09.673046 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68684695-4cc8-4cf4-8c9c-ef7502600c1e" containerName="collect-profiles" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.673055 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="68684695-4cc8-4cf4-8c9c-ef7502600c1e" containerName="collect-profiles" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.673314 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="68684695-4cc8-4cf4-8c9c-ef7502600c1e" containerName="collect-profiles" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.673341 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e2fc94-fce0-4eeb-8a78-15e934c02371" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.674099 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.677736 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.677775 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.678117 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.678346 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.678452 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.678645 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.678941 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.685145 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rv6jt" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.687571 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.687642 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.687698 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.687804 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.687850 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.688012 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.688110 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.688169 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wch8n\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-kube-api-access-wch8n\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.688252 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: 
\"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.688409 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.688473 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.688514 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.688602 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.688662 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.698591 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv"] Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.790776 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.790831 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.790874 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-telemetry-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.790972 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.791024 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.791111 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.791153 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: 
\"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.791188 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wch8n\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-kube-api-access-wch8n\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.791257 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.791323 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.791350 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc 
kubenswrapper[4841]: I0313 09:45:09.791415 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.791450 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.791478 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.795336 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.795513 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.795795 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.796096 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.796661 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.796900 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.799239 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.799350 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.800134 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.806051 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.806744 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.807323 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.807466 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:09 crc kubenswrapper[4841]: I0313 09:45:09.816405 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wch8n\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-kube-api-access-wch8n\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv\" (UID: 
\"9904117c-604f-48b6-9f6b-ef60210b0a94\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:10 crc kubenswrapper[4841]: I0313 09:45:10.020510 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:10 crc kubenswrapper[4841]: W0313 09:45:10.343652 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9904117c_604f_48b6_9f6b_ef60210b0a94.slice/crio-e934a58b873df32945ccecd66d19d91fe34b2c78da77b4e900b30314ca57fb8e WatchSource:0}: Error finding container e934a58b873df32945ccecd66d19d91fe34b2c78da77b4e900b30314ca57fb8e: Status 404 returned error can't find the container with id e934a58b873df32945ccecd66d19d91fe34b2c78da77b4e900b30314ca57fb8e Mar 13 09:45:10 crc kubenswrapper[4841]: I0313 09:45:10.346011 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv"] Mar 13 09:45:10 crc kubenswrapper[4841]: I0313 09:45:10.580558 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" event={"ID":"9904117c-604f-48b6-9f6b-ef60210b0a94","Type":"ContainerStarted","Data":"e934a58b873df32945ccecd66d19d91fe34b2c78da77b4e900b30314ca57fb8e"} Mar 13 09:45:11 crc kubenswrapper[4841]: I0313 09:45:11.604197 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" event={"ID":"9904117c-604f-48b6-9f6b-ef60210b0a94","Type":"ContainerStarted","Data":"77e4797fbae4e8f805abb14ab111f09c8ab28eb8962edbf9d76af53f5822f98f"} Mar 13 09:45:11 crc kubenswrapper[4841]: I0313 09:45:11.632163 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" podStartSLOduration=2.042876185 
podStartE2EDuration="2.632142532s" podCreationTimestamp="2026-03-13 09:45:09 +0000 UTC" firstStartedPulling="2026-03-13 09:45:10.346378659 +0000 UTC m=+1993.076278840" lastFinishedPulling="2026-03-13 09:45:10.935644976 +0000 UTC m=+1993.665545187" observedRunningTime="2026-03-13 09:45:11.624731105 +0000 UTC m=+1994.354631306" watchObservedRunningTime="2026-03-13 09:45:11.632142532 +0000 UTC m=+1994.362042733" Mar 13 09:45:44 crc kubenswrapper[4841]: E0313 09:45:44.924865 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9904117c_604f_48b6_9f6b_ef60210b0a94.slice/crio-conmon-77e4797fbae4e8f805abb14ab111f09c8ab28eb8962edbf9d76af53f5822f98f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9904117c_604f_48b6_9f6b_ef60210b0a94.slice/crio-77e4797fbae4e8f805abb14ab111f09c8ab28eb8962edbf9d76af53f5822f98f.scope\": RecentStats: unable to find data in memory cache]" Mar 13 09:45:44 crc kubenswrapper[4841]: I0313 09:45:44.969767 4841 generic.go:334] "Generic (PLEG): container finished" podID="9904117c-604f-48b6-9f6b-ef60210b0a94" containerID="77e4797fbae4e8f805abb14ab111f09c8ab28eb8962edbf9d76af53f5822f98f" exitCode=0 Mar 13 09:45:44 crc kubenswrapper[4841]: I0313 09:45:44.969820 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" event={"ID":"9904117c-604f-48b6-9f6b-ef60210b0a94","Type":"ContainerDied","Data":"77e4797fbae4e8f805abb14ab111f09c8ab28eb8962edbf9d76af53f5822f98f"} Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.422660 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.503935 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-ssh-key-openstack-edpm-ipam\") pod \"9904117c-604f-48b6-9f6b-ef60210b0a94\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.504010 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"9904117c-604f-48b6-9f6b-ef60210b0a94\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.504063 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-repo-setup-combined-ca-bundle\") pod \"9904117c-604f-48b6-9f6b-ef60210b0a94\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.504111 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"9904117c-604f-48b6-9f6b-ef60210b0a94\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.504160 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wch8n\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-kube-api-access-wch8n\") pod 
\"9904117c-604f-48b6-9f6b-ef60210b0a94\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.504218 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-ovn-combined-ca-bundle\") pod \"9904117c-604f-48b6-9f6b-ef60210b0a94\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.504255 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-bootstrap-combined-ca-bundle\") pod \"9904117c-604f-48b6-9f6b-ef60210b0a94\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.504306 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-neutron-metadata-combined-ca-bundle\") pod \"9904117c-604f-48b6-9f6b-ef60210b0a94\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.504335 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"9904117c-604f-48b6-9f6b-ef60210b0a94\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.504423 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-nova-combined-ca-bundle\") pod \"9904117c-604f-48b6-9f6b-ef60210b0a94\" (UID: 
\"9904117c-604f-48b6-9f6b-ef60210b0a94\") " Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.504456 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-telemetry-combined-ca-bundle\") pod \"9904117c-604f-48b6-9f6b-ef60210b0a94\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.504592 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-libvirt-combined-ca-bundle\") pod \"9904117c-604f-48b6-9f6b-ef60210b0a94\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.504651 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-ovn-default-certs-0\") pod \"9904117c-604f-48b6-9f6b-ef60210b0a94\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.504729 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-inventory\") pod \"9904117c-604f-48b6-9f6b-ef60210b0a94\" (UID: \"9904117c-604f-48b6-9f6b-ef60210b0a94\") " Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.512488 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "9904117c-604f-48b6-9f6b-ef60210b0a94" (UID: "9904117c-604f-48b6-9f6b-ef60210b0a94"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.512557 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "9904117c-604f-48b6-9f6b-ef60210b0a94" (UID: "9904117c-604f-48b6-9f6b-ef60210b0a94"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.512555 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "9904117c-604f-48b6-9f6b-ef60210b0a94" (UID: "9904117c-604f-48b6-9f6b-ef60210b0a94"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.514834 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9904117c-604f-48b6-9f6b-ef60210b0a94" (UID: "9904117c-604f-48b6-9f6b-ef60210b0a94"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.514849 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "9904117c-604f-48b6-9f6b-ef60210b0a94" (UID: "9904117c-604f-48b6-9f6b-ef60210b0a94"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.515120 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9904117c-604f-48b6-9f6b-ef60210b0a94" (UID: "9904117c-604f-48b6-9f6b-ef60210b0a94"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.515697 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9904117c-604f-48b6-9f6b-ef60210b0a94" (UID: "9904117c-604f-48b6-9f6b-ef60210b0a94"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.516615 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "9904117c-604f-48b6-9f6b-ef60210b0a94" (UID: "9904117c-604f-48b6-9f6b-ef60210b0a94"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.517580 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "9904117c-604f-48b6-9f6b-ef60210b0a94" (UID: "9904117c-604f-48b6-9f6b-ef60210b0a94"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.517573 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9904117c-604f-48b6-9f6b-ef60210b0a94" (UID: "9904117c-604f-48b6-9f6b-ef60210b0a94"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.517699 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-kube-api-access-wch8n" (OuterVolumeSpecName: "kube-api-access-wch8n") pod "9904117c-604f-48b6-9f6b-ef60210b0a94" (UID: "9904117c-604f-48b6-9f6b-ef60210b0a94"). InnerVolumeSpecName "kube-api-access-wch8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.518027 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9904117c-604f-48b6-9f6b-ef60210b0a94" (UID: "9904117c-604f-48b6-9f6b-ef60210b0a94"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.537453 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9904117c-604f-48b6-9f6b-ef60210b0a94" (UID: "9904117c-604f-48b6-9f6b-ef60210b0a94"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.538900 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-inventory" (OuterVolumeSpecName: "inventory") pod "9904117c-604f-48b6-9f6b-ef60210b0a94" (UID: "9904117c-604f-48b6-9f6b-ef60210b0a94"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.607488 4841 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.607533 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.607553 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.607567 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.607581 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.607594 4841 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.607607 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.607620 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wch8n\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-kube-api-access-wch8n\") on node \"crc\" DevicePath \"\"" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.607633 4841 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.607645 4841 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.607659 4841 
reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.607672 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9904117c-604f-48b6-9f6b-ef60210b0a94-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.607685 4841 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.607699 4841 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904117c-604f-48b6-9f6b-ef60210b0a94-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.986331 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" event={"ID":"9904117c-604f-48b6-9f6b-ef60210b0a94","Type":"ContainerDied","Data":"e934a58b873df32945ccecd66d19d91fe34b2c78da77b4e900b30314ca57fb8e"} Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.986602 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e934a58b873df32945ccecd66d19d91fe34b2c78da77b4e900b30314ca57fb8e" Mar 13 09:45:46 crc kubenswrapper[4841]: I0313 09:45:46.986427 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.100933 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q"] Mar 13 09:45:47 crc kubenswrapper[4841]: E0313 09:45:47.101618 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9904117c-604f-48b6-9f6b-ef60210b0a94" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.101642 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9904117c-604f-48b6-9f6b-ef60210b0a94" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.101878 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="9904117c-604f-48b6-9f6b-ef60210b0a94" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.102640 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.104601 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.105010 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.105489 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.105754 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rv6jt" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.106108 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.117150 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q"] Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.218516 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de039f4c-0550-4464-b901-a624fac40281-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-59f4q\" (UID: \"de039f4c-0550-4464-b901-a624fac40281\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.218683 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de039f4c-0550-4464-b901-a624fac40281-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-59f4q\" (UID: \"de039f4c-0550-4464-b901-a624fac40281\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.218712 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de039f4c-0550-4464-b901-a624fac40281-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-59f4q\" (UID: \"de039f4c-0550-4464-b901-a624fac40281\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.218738 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de039f4c-0550-4464-b901-a624fac40281-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-59f4q\" (UID: \"de039f4c-0550-4464-b901-a624fac40281\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.218855 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtx5v\" (UniqueName: \"kubernetes.io/projected/de039f4c-0550-4464-b901-a624fac40281-kube-api-access-mtx5v\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-59f4q\" (UID: \"de039f4c-0550-4464-b901-a624fac40281\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.320093 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de039f4c-0550-4464-b901-a624fac40281-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-59f4q\" (UID: \"de039f4c-0550-4464-b901-a624fac40281\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.320152 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de039f4c-0550-4464-b901-a624fac40281-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-59f4q\" (UID: \"de039f4c-0550-4464-b901-a624fac40281\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.320183 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de039f4c-0550-4464-b901-a624fac40281-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-59f4q\" (UID: \"de039f4c-0550-4464-b901-a624fac40281\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.320250 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtx5v\" (UniqueName: \"kubernetes.io/projected/de039f4c-0550-4464-b901-a624fac40281-kube-api-access-mtx5v\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-59f4q\" (UID: \"de039f4c-0550-4464-b901-a624fac40281\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.320380 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de039f4c-0550-4464-b901-a624fac40281-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-59f4q\" (UID: \"de039f4c-0550-4464-b901-a624fac40281\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.321185 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de039f4c-0550-4464-b901-a624fac40281-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-59f4q\" (UID: \"de039f4c-0550-4464-b901-a624fac40281\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.325327 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de039f4c-0550-4464-b901-a624fac40281-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-59f4q\" (UID: \"de039f4c-0550-4464-b901-a624fac40281\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.325438 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de039f4c-0550-4464-b901-a624fac40281-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-59f4q\" (UID: \"de039f4c-0550-4464-b901-a624fac40281\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.325455 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de039f4c-0550-4464-b901-a624fac40281-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-59f4q\" (UID: \"de039f4c-0550-4464-b901-a624fac40281\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.337327 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtx5v\" (UniqueName: \"kubernetes.io/projected/de039f4c-0550-4464-b901-a624fac40281-kube-api-access-mtx5v\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-59f4q\" (UID: \"de039f4c-0550-4464-b901-a624fac40281\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.421368 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" Mar 13 09:45:47 crc kubenswrapper[4841]: I0313 09:45:47.925402 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q"] Mar 13 09:45:48 crc kubenswrapper[4841]: I0313 09:45:48.007339 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" event={"ID":"de039f4c-0550-4464-b901-a624fac40281","Type":"ContainerStarted","Data":"a15b3d33df5098bcbe342b92609cffb919a622a1c0fad3745e37fa65ae128ce0"} Mar 13 09:45:49 crc kubenswrapper[4841]: I0313 09:45:49.014724 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" event={"ID":"de039f4c-0550-4464-b901-a624fac40281","Type":"ContainerStarted","Data":"4ecbaa8d04c8264a37db750e83091deb21a25e7027cf1b650cffaf2010d14562"} Mar 13 09:45:49 crc kubenswrapper[4841]: I0313 09:45:49.033851 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" podStartSLOduration=1.619739931 podStartE2EDuration="2.033834103s" podCreationTimestamp="2026-03-13 09:45:47 +0000 UTC" firstStartedPulling="2026-03-13 09:45:47.925490464 +0000 UTC m=+2030.655390655" lastFinishedPulling="2026-03-13 09:45:48.339584626 +0000 UTC m=+2031.069484827" observedRunningTime="2026-03-13 09:45:49.03342804 +0000 UTC m=+2031.763328241" watchObservedRunningTime="2026-03-13 09:45:49.033834103 +0000 UTC m=+2031.763734294" Mar 13 09:46:00 crc kubenswrapper[4841]: I0313 09:46:00.135583 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556586-rdpdd"] Mar 13 09:46:00 crc kubenswrapper[4841]: I0313 09:46:00.137730 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556586-rdpdd" Mar 13 09:46:00 crc kubenswrapper[4841]: I0313 09:46:00.141456 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 09:46:00 crc kubenswrapper[4841]: I0313 09:46:00.141481 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 09:46:00 crc kubenswrapper[4841]: I0313 09:46:00.141521 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 09:46:00 crc kubenswrapper[4841]: I0313 09:46:00.145640 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556586-rdpdd"] Mar 13 09:46:00 crc kubenswrapper[4841]: I0313 09:46:00.178328 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk8r5\" (UniqueName: \"kubernetes.io/projected/b34fcbdd-3170-44e1-a881-867750de01ec-kube-api-access-kk8r5\") pod \"auto-csr-approver-29556586-rdpdd\" (UID: \"b34fcbdd-3170-44e1-a881-867750de01ec\") " pod="openshift-infra/auto-csr-approver-29556586-rdpdd" Mar 13 09:46:00 crc kubenswrapper[4841]: I0313 09:46:00.280992 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk8r5\" (UniqueName: \"kubernetes.io/projected/b34fcbdd-3170-44e1-a881-867750de01ec-kube-api-access-kk8r5\") pod \"auto-csr-approver-29556586-rdpdd\" (UID: \"b34fcbdd-3170-44e1-a881-867750de01ec\") " pod="openshift-infra/auto-csr-approver-29556586-rdpdd" Mar 13 09:46:00 crc kubenswrapper[4841]: I0313 09:46:00.302108 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk8r5\" (UniqueName: \"kubernetes.io/projected/b34fcbdd-3170-44e1-a881-867750de01ec-kube-api-access-kk8r5\") pod \"auto-csr-approver-29556586-rdpdd\" (UID: \"b34fcbdd-3170-44e1-a881-867750de01ec\") " 
pod="openshift-infra/auto-csr-approver-29556586-rdpdd" Mar 13 09:46:00 crc kubenswrapper[4841]: I0313 09:46:00.464077 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556586-rdpdd" Mar 13 09:46:00 crc kubenswrapper[4841]: I0313 09:46:00.969368 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556586-rdpdd"] Mar 13 09:46:01 crc kubenswrapper[4841]: I0313 09:46:01.116047 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556586-rdpdd" event={"ID":"b34fcbdd-3170-44e1-a881-867750de01ec","Type":"ContainerStarted","Data":"04ffd62b92930f0ab836fda16e25edc8bce7dcdcd12b1eada9eb9465e6ab0316"} Mar 13 09:46:03 crc kubenswrapper[4841]: I0313 09:46:03.137223 4841 generic.go:334] "Generic (PLEG): container finished" podID="b34fcbdd-3170-44e1-a881-867750de01ec" containerID="c30102bcf27959a8b287ceaf8a530489b1f92abf956176d87db776b0023fccad" exitCode=0 Mar 13 09:46:03 crc kubenswrapper[4841]: I0313 09:46:03.137482 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556586-rdpdd" event={"ID":"b34fcbdd-3170-44e1-a881-867750de01ec","Type":"ContainerDied","Data":"c30102bcf27959a8b287ceaf8a530489b1f92abf956176d87db776b0023fccad"} Mar 13 09:46:04 crc kubenswrapper[4841]: I0313 09:46:04.407729 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:46:04 crc kubenswrapper[4841]: I0313 09:46:04.407956 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:46:04 crc kubenswrapper[4841]: I0313 09:46:04.495065 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556586-rdpdd" Mar 13 09:46:04 crc kubenswrapper[4841]: I0313 09:46:04.565421 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk8r5\" (UniqueName: \"kubernetes.io/projected/b34fcbdd-3170-44e1-a881-867750de01ec-kube-api-access-kk8r5\") pod \"b34fcbdd-3170-44e1-a881-867750de01ec\" (UID: \"b34fcbdd-3170-44e1-a881-867750de01ec\") " Mar 13 09:46:04 crc kubenswrapper[4841]: I0313 09:46:04.571001 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b34fcbdd-3170-44e1-a881-867750de01ec-kube-api-access-kk8r5" (OuterVolumeSpecName: "kube-api-access-kk8r5") pod "b34fcbdd-3170-44e1-a881-867750de01ec" (UID: "b34fcbdd-3170-44e1-a881-867750de01ec"). InnerVolumeSpecName "kube-api-access-kk8r5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:46:04 crc kubenswrapper[4841]: I0313 09:46:04.668123 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk8r5\" (UniqueName: \"kubernetes.io/projected/b34fcbdd-3170-44e1-a881-867750de01ec-kube-api-access-kk8r5\") on node \"crc\" DevicePath \"\"" Mar 13 09:46:05 crc kubenswrapper[4841]: I0313 09:46:05.173016 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556586-rdpdd" event={"ID":"b34fcbdd-3170-44e1-a881-867750de01ec","Type":"ContainerDied","Data":"04ffd62b92930f0ab836fda16e25edc8bce7dcdcd12b1eada9eb9465e6ab0316"} Mar 13 09:46:05 crc kubenswrapper[4841]: I0313 09:46:05.173080 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04ffd62b92930f0ab836fda16e25edc8bce7dcdcd12b1eada9eb9465e6ab0316" Mar 13 09:46:05 crc kubenswrapper[4841]: I0313 09:46:05.173180 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556586-rdpdd" Mar 13 09:46:05 crc kubenswrapper[4841]: I0313 09:46:05.580357 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556580-t2827"] Mar 13 09:46:05 crc kubenswrapper[4841]: I0313 09:46:05.589049 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556580-t2827"] Mar 13 09:46:06 crc kubenswrapper[4841]: I0313 09:46:06.006901 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30c0eb23-5b61-4053-bd04-3948d2c3eb16" path="/var/lib/kubelet/pods/30c0eb23-5b61-4053-bd04-3948d2c3eb16/volumes" Mar 13 09:46:34 crc kubenswrapper[4841]: I0313 09:46:34.431345 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 13 09:46:34 crc kubenswrapper[4841]: I0313 09:46:34.432299 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:46:36 crc kubenswrapper[4841]: I0313 09:46:36.135091 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nwq78"] Mar 13 09:46:36 crc kubenswrapper[4841]: E0313 09:46:36.135929 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b34fcbdd-3170-44e1-a881-867750de01ec" containerName="oc" Mar 13 09:46:36 crc kubenswrapper[4841]: I0313 09:46:36.135946 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b34fcbdd-3170-44e1-a881-867750de01ec" containerName="oc" Mar 13 09:46:36 crc kubenswrapper[4841]: I0313 09:46:36.136226 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b34fcbdd-3170-44e1-a881-867750de01ec" containerName="oc" Mar 13 09:46:36 crc kubenswrapper[4841]: I0313 09:46:36.138017 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nwq78" Mar 13 09:46:36 crc kubenswrapper[4841]: I0313 09:46:36.150427 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nwq78"] Mar 13 09:46:36 crc kubenswrapper[4841]: I0313 09:46:36.206842 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7206f12e-0ba0-4233-af37-e3563ac032c1-utilities\") pod \"redhat-operators-nwq78\" (UID: \"7206f12e-0ba0-4233-af37-e3563ac032c1\") " pod="openshift-marketplace/redhat-operators-nwq78" Mar 13 09:46:36 crc kubenswrapper[4841]: I0313 09:46:36.206914 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgtz7\" (UniqueName: \"kubernetes.io/projected/7206f12e-0ba0-4233-af37-e3563ac032c1-kube-api-access-xgtz7\") pod \"redhat-operators-nwq78\" (UID: \"7206f12e-0ba0-4233-af37-e3563ac032c1\") " pod="openshift-marketplace/redhat-operators-nwq78" Mar 13 09:46:36 crc kubenswrapper[4841]: I0313 09:46:36.207008 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7206f12e-0ba0-4233-af37-e3563ac032c1-catalog-content\") pod \"redhat-operators-nwq78\" (UID: \"7206f12e-0ba0-4233-af37-e3563ac032c1\") " pod="openshift-marketplace/redhat-operators-nwq78" Mar 13 09:46:36 crc kubenswrapper[4841]: I0313 09:46:36.309141 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7206f12e-0ba0-4233-af37-e3563ac032c1-utilities\") pod \"redhat-operators-nwq78\" (UID: \"7206f12e-0ba0-4233-af37-e3563ac032c1\") " pod="openshift-marketplace/redhat-operators-nwq78" Mar 13 09:46:36 crc kubenswrapper[4841]: I0313 09:46:36.309219 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xgtz7\" (UniqueName: \"kubernetes.io/projected/7206f12e-0ba0-4233-af37-e3563ac032c1-kube-api-access-xgtz7\") pod \"redhat-operators-nwq78\" (UID: \"7206f12e-0ba0-4233-af37-e3563ac032c1\") " pod="openshift-marketplace/redhat-operators-nwq78" Mar 13 09:46:36 crc kubenswrapper[4841]: I0313 09:46:36.309313 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7206f12e-0ba0-4233-af37-e3563ac032c1-catalog-content\") pod \"redhat-operators-nwq78\" (UID: \"7206f12e-0ba0-4233-af37-e3563ac032c1\") " pod="openshift-marketplace/redhat-operators-nwq78" Mar 13 09:46:36 crc kubenswrapper[4841]: I0313 09:46:36.310122 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7206f12e-0ba0-4233-af37-e3563ac032c1-catalog-content\") pod \"redhat-operators-nwq78\" (UID: \"7206f12e-0ba0-4233-af37-e3563ac032c1\") " pod="openshift-marketplace/redhat-operators-nwq78" Mar 13 09:46:36 crc kubenswrapper[4841]: I0313 09:46:36.310227 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7206f12e-0ba0-4233-af37-e3563ac032c1-utilities\") pod \"redhat-operators-nwq78\" (UID: \"7206f12e-0ba0-4233-af37-e3563ac032c1\") " pod="openshift-marketplace/redhat-operators-nwq78" Mar 13 09:46:36 crc kubenswrapper[4841]: I0313 09:46:36.333346 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgtz7\" (UniqueName: \"kubernetes.io/projected/7206f12e-0ba0-4233-af37-e3563ac032c1-kube-api-access-xgtz7\") pod \"redhat-operators-nwq78\" (UID: \"7206f12e-0ba0-4233-af37-e3563ac032c1\") " pod="openshift-marketplace/redhat-operators-nwq78" Mar 13 09:46:36 crc kubenswrapper[4841]: I0313 09:46:36.476907 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nwq78" Mar 13 09:46:36 crc kubenswrapper[4841]: I0313 09:46:36.975590 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nwq78"] Mar 13 09:46:37 crc kubenswrapper[4841]: I0313 09:46:37.469787 4841 generic.go:334] "Generic (PLEG): container finished" podID="7206f12e-0ba0-4233-af37-e3563ac032c1" containerID="84128162f225fea4f7bdac1c87a885739462338ab0bc1e39c20dea641adafd29" exitCode=0 Mar 13 09:46:37 crc kubenswrapper[4841]: I0313 09:46:37.469847 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwq78" event={"ID":"7206f12e-0ba0-4233-af37-e3563ac032c1","Type":"ContainerDied","Data":"84128162f225fea4f7bdac1c87a885739462338ab0bc1e39c20dea641adafd29"} Mar 13 09:46:37 crc kubenswrapper[4841]: I0313 09:46:37.470093 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwq78" event={"ID":"7206f12e-0ba0-4233-af37-e3563ac032c1","Type":"ContainerStarted","Data":"fda60bba6fde943e0cb0bd84fd0ad1c828901145b4b1c587f1a4cd605860f2aa"} Mar 13 09:46:38 crc kubenswrapper[4841]: I0313 09:46:38.483391 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwq78" event={"ID":"7206f12e-0ba0-4233-af37-e3563ac032c1","Type":"ContainerStarted","Data":"8b77b1a88a0f5c1f6bfa0bbdca78f6ed828bd04e96d9cadfff9fc2d4f9781191"} Mar 13 09:46:39 crc kubenswrapper[4841]: I0313 09:46:39.494751 4841 generic.go:334] "Generic (PLEG): container finished" podID="7206f12e-0ba0-4233-af37-e3563ac032c1" containerID="8b77b1a88a0f5c1f6bfa0bbdca78f6ed828bd04e96d9cadfff9fc2d4f9781191" exitCode=0 Mar 13 09:46:39 crc kubenswrapper[4841]: I0313 09:46:39.494857 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwq78" 
event={"ID":"7206f12e-0ba0-4233-af37-e3563ac032c1","Type":"ContainerDied","Data":"8b77b1a88a0f5c1f6bfa0bbdca78f6ed828bd04e96d9cadfff9fc2d4f9781191"} Mar 13 09:46:40 crc kubenswrapper[4841]: I0313 09:46:40.530751 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwq78" event={"ID":"7206f12e-0ba0-4233-af37-e3563ac032c1","Type":"ContainerStarted","Data":"a5d47cfb71f9fc3783443fc09ea5efebb28e7573fed1d14edf52329080cd0ea6"} Mar 13 09:46:40 crc kubenswrapper[4841]: I0313 09:46:40.554844 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nwq78" podStartSLOduration=2.101316113 podStartE2EDuration="4.554821947s" podCreationTimestamp="2026-03-13 09:46:36 +0000 UTC" firstStartedPulling="2026-03-13 09:46:37.47237999 +0000 UTC m=+2080.202280171" lastFinishedPulling="2026-03-13 09:46:39.925885804 +0000 UTC m=+2082.655786005" observedRunningTime="2026-03-13 09:46:40.549993869 +0000 UTC m=+2083.279894060" watchObservedRunningTime="2026-03-13 09:46:40.554821947 +0000 UTC m=+2083.284722148" Mar 13 09:46:42 crc kubenswrapper[4841]: I0313 09:46:42.551215 4841 generic.go:334] "Generic (PLEG): container finished" podID="de039f4c-0550-4464-b901-a624fac40281" containerID="4ecbaa8d04c8264a37db750e83091deb21a25e7027cf1b650cffaf2010d14562" exitCode=0 Mar 13 09:46:42 crc kubenswrapper[4841]: I0313 09:46:42.551287 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" event={"ID":"de039f4c-0550-4464-b901-a624fac40281","Type":"ContainerDied","Data":"4ecbaa8d04c8264a37db750e83091deb21a25e7027cf1b650cffaf2010d14562"} Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.049142 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.179488 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de039f4c-0550-4464-b901-a624fac40281-ovncontroller-config-0\") pod \"de039f4c-0550-4464-b901-a624fac40281\" (UID: \"de039f4c-0550-4464-b901-a624fac40281\") " Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.179663 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtx5v\" (UniqueName: \"kubernetes.io/projected/de039f4c-0550-4464-b901-a624fac40281-kube-api-access-mtx5v\") pod \"de039f4c-0550-4464-b901-a624fac40281\" (UID: \"de039f4c-0550-4464-b901-a624fac40281\") " Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.179785 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de039f4c-0550-4464-b901-a624fac40281-ovn-combined-ca-bundle\") pod \"de039f4c-0550-4464-b901-a624fac40281\" (UID: \"de039f4c-0550-4464-b901-a624fac40281\") " Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.179831 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de039f4c-0550-4464-b901-a624fac40281-inventory\") pod \"de039f4c-0550-4464-b901-a624fac40281\" (UID: \"de039f4c-0550-4464-b901-a624fac40281\") " Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.179875 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de039f4c-0550-4464-b901-a624fac40281-ssh-key-openstack-edpm-ipam\") pod \"de039f4c-0550-4464-b901-a624fac40281\" (UID: \"de039f4c-0550-4464-b901-a624fac40281\") " Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.186308 4841 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de039f4c-0550-4464-b901-a624fac40281-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "de039f4c-0550-4464-b901-a624fac40281" (UID: "de039f4c-0550-4464-b901-a624fac40281"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.186625 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de039f4c-0550-4464-b901-a624fac40281-kube-api-access-mtx5v" (OuterVolumeSpecName: "kube-api-access-mtx5v") pod "de039f4c-0550-4464-b901-a624fac40281" (UID: "de039f4c-0550-4464-b901-a624fac40281"). InnerVolumeSpecName "kube-api-access-mtx5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.208359 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de039f4c-0550-4464-b901-a624fac40281-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "de039f4c-0550-4464-b901-a624fac40281" (UID: "de039f4c-0550-4464-b901-a624fac40281"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.209286 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de039f4c-0550-4464-b901-a624fac40281-inventory" (OuterVolumeSpecName: "inventory") pod "de039f4c-0550-4464-b901-a624fac40281" (UID: "de039f4c-0550-4464-b901-a624fac40281"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.210025 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de039f4c-0550-4464-b901-a624fac40281-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "de039f4c-0550-4464-b901-a624fac40281" (UID: "de039f4c-0550-4464-b901-a624fac40281"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.284494 4841 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de039f4c-0550-4464-b901-a624fac40281-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.284532 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtx5v\" (UniqueName: \"kubernetes.io/projected/de039f4c-0550-4464-b901-a624fac40281-kube-api-access-mtx5v\") on node \"crc\" DevicePath \"\"" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.284544 4841 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de039f4c-0550-4464-b901-a624fac40281-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.284552 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de039f4c-0550-4464-b901-a624fac40281-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.284561 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de039f4c-0550-4464-b901-a624fac40281-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.558092 4841 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-szx4f"] Mar 13 09:46:44 crc kubenswrapper[4841]: E0313 09:46:44.558675 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de039f4c-0550-4464-b901-a624fac40281" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.558703 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="de039f4c-0550-4464-b901-a624fac40281" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.558934 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="de039f4c-0550-4464-b901-a624fac40281" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.560658 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-szx4f" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.575964 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" event={"ID":"de039f4c-0550-4464-b901-a624fac40281","Type":"ContainerDied","Data":"a15b3d33df5098bcbe342b92609cffb919a622a1c0fad3745e37fa65ae128ce0"} Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.576021 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a15b3d33df5098bcbe342b92609cffb919a622a1c0fad3745e37fa65ae128ce0" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.576143 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-59f4q" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.619720 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-szx4f"] Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.697496 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb0bb4de-58c3-4d6e-a22d-735e9346f228-catalog-content\") pod \"community-operators-szx4f\" (UID: \"eb0bb4de-58c3-4d6e-a22d-735e9346f228\") " pod="openshift-marketplace/community-operators-szx4f" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.697566 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d58lh\" (UniqueName: \"kubernetes.io/projected/eb0bb4de-58c3-4d6e-a22d-735e9346f228-kube-api-access-d58lh\") pod \"community-operators-szx4f\" (UID: \"eb0bb4de-58c3-4d6e-a22d-735e9346f228\") " pod="openshift-marketplace/community-operators-szx4f" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.697685 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb0bb4de-58c3-4d6e-a22d-735e9346f228-utilities\") pod \"community-operators-szx4f\" (UID: \"eb0bb4de-58c3-4d6e-a22d-735e9346f228\") " pod="openshift-marketplace/community-operators-szx4f" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.750815 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv"] Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.752705 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.764304 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv"] Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.767395 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.767471 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rv6jt" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.767665 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.767799 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.768364 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.771028 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.801513 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb0bb4de-58c3-4d6e-a22d-735e9346f228-utilities\") pod \"community-operators-szx4f\" (UID: \"eb0bb4de-58c3-4d6e-a22d-735e9346f228\") " pod="openshift-marketplace/community-operators-szx4f" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.801642 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/eb0bb4de-58c3-4d6e-a22d-735e9346f228-catalog-content\") pod \"community-operators-szx4f\" (UID: \"eb0bb4de-58c3-4d6e-a22d-735e9346f228\") " pod="openshift-marketplace/community-operators-szx4f" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.801683 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d58lh\" (UniqueName: \"kubernetes.io/projected/eb0bb4de-58c3-4d6e-a22d-735e9346f228-kube-api-access-d58lh\") pod \"community-operators-szx4f\" (UID: \"eb0bb4de-58c3-4d6e-a22d-735e9346f228\") " pod="openshift-marketplace/community-operators-szx4f" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.803178 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb0bb4de-58c3-4d6e-a22d-735e9346f228-catalog-content\") pod \"community-operators-szx4f\" (UID: \"eb0bb4de-58c3-4d6e-a22d-735e9346f228\") " pod="openshift-marketplace/community-operators-szx4f" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.803617 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb0bb4de-58c3-4d6e-a22d-735e9346f228-utilities\") pod \"community-operators-szx4f\" (UID: \"eb0bb4de-58c3-4d6e-a22d-735e9346f228\") " pod="openshift-marketplace/community-operators-szx4f" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.828533 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d58lh\" (UniqueName: \"kubernetes.io/projected/eb0bb4de-58c3-4d6e-a22d-735e9346f228-kube-api-access-d58lh\") pod \"community-operators-szx4f\" (UID: \"eb0bb4de-58c3-4d6e-a22d-735e9346f228\") " pod="openshift-marketplace/community-operators-szx4f" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.898722 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-szx4f" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.904079 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.904208 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.904282 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.904396 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.904523 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bjv7\" (UniqueName: \"kubernetes.io/projected/dc97298b-a706-488a-9ea1-e90de447c754-kube-api-access-9bjv7\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" Mar 13 09:46:44 crc kubenswrapper[4841]: I0313 09:46:44.904651 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" Mar 13 09:46:45 crc kubenswrapper[4841]: I0313 09:46:45.007837 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" Mar 13 09:46:45 crc kubenswrapper[4841]: I0313 09:46:45.008841 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bjv7\" (UniqueName: \"kubernetes.io/projected/dc97298b-a706-488a-9ea1-e90de447c754-kube-api-access-9bjv7\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" Mar 13 
09:46:45 crc kubenswrapper[4841]: I0313 09:46:45.008988 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" Mar 13 09:46:45 crc kubenswrapper[4841]: I0313 09:46:45.009057 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" Mar 13 09:46:45 crc kubenswrapper[4841]: I0313 09:46:45.009108 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" Mar 13 09:46:45 crc kubenswrapper[4841]: I0313 09:46:45.009153 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" Mar 13 09:46:45 crc kubenswrapper[4841]: I0313 09:46:45.022483 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" Mar 13 09:46:45 crc kubenswrapper[4841]: I0313 09:46:45.023308 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" Mar 13 09:46:45 crc kubenswrapper[4841]: I0313 09:46:45.024131 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" Mar 13 09:46:45 crc kubenswrapper[4841]: I0313 09:46:45.024387 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" Mar 13 09:46:45 crc kubenswrapper[4841]: I0313 09:46:45.024611 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" Mar 13 09:46:45 crc kubenswrapper[4841]: I0313 09:46:45.047493 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bjv7\" (UniqueName: \"kubernetes.io/projected/dc97298b-a706-488a-9ea1-e90de447c754-kube-api-access-9bjv7\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" Mar 13 09:46:45 crc kubenswrapper[4841]: I0313 09:46:45.090054 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" Mar 13 09:46:45 crc kubenswrapper[4841]: I0313 09:46:45.457754 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-szx4f"] Mar 13 09:46:45 crc kubenswrapper[4841]: I0313 09:46:45.587164 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szx4f" event={"ID":"eb0bb4de-58c3-4d6e-a22d-735e9346f228","Type":"ContainerStarted","Data":"ad8a7a3bf11d07a9d1a1980319066539a9c6b69541fb774d01d829563d7558d1"} Mar 13 09:46:45 crc kubenswrapper[4841]: I0313 09:46:45.773611 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv"] Mar 13 09:46:45 crc kubenswrapper[4841]: W0313 09:46:45.774189 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc97298b_a706_488a_9ea1_e90de447c754.slice/crio-b9b82f1861acbcc9a2652556b2af68d756b7bf4185f372ef7b7dd78f615ee3ca WatchSource:0}: Error finding container 
b9b82f1861acbcc9a2652556b2af68d756b7bf4185f372ef7b7dd78f615ee3ca: Status 404 returned error can't find the container with id b9b82f1861acbcc9a2652556b2af68d756b7bf4185f372ef7b7dd78f615ee3ca Mar 13 09:46:46 crc kubenswrapper[4841]: I0313 09:46:46.477582 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nwq78" Mar 13 09:46:46 crc kubenswrapper[4841]: I0313 09:46:46.478791 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nwq78" Mar 13 09:46:46 crc kubenswrapper[4841]: I0313 09:46:46.598759 4841 generic.go:334] "Generic (PLEG): container finished" podID="eb0bb4de-58c3-4d6e-a22d-735e9346f228" containerID="d1eca9c57821c0287578b519933f0c25cfdc913044c7867265ea77f670ea12f9" exitCode=0 Mar 13 09:46:46 crc kubenswrapper[4841]: I0313 09:46:46.598852 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szx4f" event={"ID":"eb0bb4de-58c3-4d6e-a22d-735e9346f228","Type":"ContainerDied","Data":"d1eca9c57821c0287578b519933f0c25cfdc913044c7867265ea77f670ea12f9"} Mar 13 09:46:46 crc kubenswrapper[4841]: I0313 09:46:46.601358 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" event={"ID":"dc97298b-a706-488a-9ea1-e90de447c754","Type":"ContainerStarted","Data":"b9b82f1861acbcc9a2652556b2af68d756b7bf4185f372ef7b7dd78f615ee3ca"} Mar 13 09:46:47 crc kubenswrapper[4841]: I0313 09:46:47.532888 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nwq78" podUID="7206f12e-0ba0-4233-af37-e3563ac032c1" containerName="registry-server" probeResult="failure" output=< Mar 13 09:46:47 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s Mar 13 09:46:47 crc kubenswrapper[4841]: > Mar 13 09:46:47 crc kubenswrapper[4841]: I0313 09:46:47.613967 4841 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" event={"ID":"dc97298b-a706-488a-9ea1-e90de447c754","Type":"ContainerStarted","Data":"adf0e6f0c0f3e2504209cc3cbbf4d410eaf6540d43f9aa29ebbd6d2e889b1426"} Mar 13 09:46:47 crc kubenswrapper[4841]: I0313 09:46:47.658835 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" podStartSLOduration=2.836307147 podStartE2EDuration="3.658805988s" podCreationTimestamp="2026-03-13 09:46:44 +0000 UTC" firstStartedPulling="2026-03-13 09:46:45.777047513 +0000 UTC m=+2088.506947704" lastFinishedPulling="2026-03-13 09:46:46.599546354 +0000 UTC m=+2089.329446545" observedRunningTime="2026-03-13 09:46:47.635142851 +0000 UTC m=+2090.365043072" watchObservedRunningTime="2026-03-13 09:46:47.658805988 +0000 UTC m=+2090.388706179" Mar 13 09:46:52 crc kubenswrapper[4841]: I0313 09:46:52.670031 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szx4f" event={"ID":"eb0bb4de-58c3-4d6e-a22d-735e9346f228","Type":"ContainerStarted","Data":"9d257d9a5d34d469cbe7efda8689038a1fdec0cccdf59561b3d81d79619f26bf"} Mar 13 09:46:53 crc kubenswrapper[4841]: I0313 09:46:53.680052 4841 generic.go:334] "Generic (PLEG): container finished" podID="eb0bb4de-58c3-4d6e-a22d-735e9346f228" containerID="9d257d9a5d34d469cbe7efda8689038a1fdec0cccdf59561b3d81d79619f26bf" exitCode=0 Mar 13 09:46:53 crc kubenswrapper[4841]: I0313 09:46:53.680102 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szx4f" event={"ID":"eb0bb4de-58c3-4d6e-a22d-735e9346f228","Type":"ContainerDied","Data":"9d257d9a5d34d469cbe7efda8689038a1fdec0cccdf59561b3d81d79619f26bf"} Mar 13 09:46:55 crc kubenswrapper[4841]: I0313 09:46:55.700121 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szx4f" 
event={"ID":"eb0bb4de-58c3-4d6e-a22d-735e9346f228","Type":"ContainerStarted","Data":"00fe213aa62b55436a947788531e56f05fa6dd546dc7b5c1fe7d2b1125134c0c"} Mar 13 09:46:56 crc kubenswrapper[4841]: I0313 09:46:56.524446 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nwq78" Mar 13 09:46:56 crc kubenswrapper[4841]: I0313 09:46:56.548582 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-szx4f" podStartSLOduration=4.634655433 podStartE2EDuration="12.548563479s" podCreationTimestamp="2026-03-13 09:46:44 +0000 UTC" firstStartedPulling="2026-03-13 09:46:46.602879926 +0000 UTC m=+2089.332780117" lastFinishedPulling="2026-03-13 09:46:54.516787952 +0000 UTC m=+2097.246688163" observedRunningTime="2026-03-13 09:46:55.723114607 +0000 UTC m=+2098.453014838" watchObservedRunningTime="2026-03-13 09:46:56.548563479 +0000 UTC m=+2099.278463670" Mar 13 09:46:56 crc kubenswrapper[4841]: I0313 09:46:56.569624 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nwq78" Mar 13 09:46:56 crc kubenswrapper[4841]: I0313 09:46:56.944338 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nwq78"] Mar 13 09:46:57 crc kubenswrapper[4841]: I0313 09:46:57.717799 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nwq78" podUID="7206f12e-0ba0-4233-af37-e3563ac032c1" containerName="registry-server" containerID="cri-o://a5d47cfb71f9fc3783443fc09ea5efebb28e7573fed1d14edf52329080cd0ea6" gracePeriod=2 Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.150531 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nwq78" Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.185378 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7206f12e-0ba0-4233-af37-e3563ac032c1-catalog-content\") pod \"7206f12e-0ba0-4233-af37-e3563ac032c1\" (UID: \"7206f12e-0ba0-4233-af37-e3563ac032c1\") " Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.185527 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7206f12e-0ba0-4233-af37-e3563ac032c1-utilities\") pod \"7206f12e-0ba0-4233-af37-e3563ac032c1\" (UID: \"7206f12e-0ba0-4233-af37-e3563ac032c1\") " Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.185610 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgtz7\" (UniqueName: \"kubernetes.io/projected/7206f12e-0ba0-4233-af37-e3563ac032c1-kube-api-access-xgtz7\") pod \"7206f12e-0ba0-4233-af37-e3563ac032c1\" (UID: \"7206f12e-0ba0-4233-af37-e3563ac032c1\") " Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.186918 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7206f12e-0ba0-4233-af37-e3563ac032c1-utilities" (OuterVolumeSpecName: "utilities") pod "7206f12e-0ba0-4233-af37-e3563ac032c1" (UID: "7206f12e-0ba0-4233-af37-e3563ac032c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.191506 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7206f12e-0ba0-4233-af37-e3563ac032c1-kube-api-access-xgtz7" (OuterVolumeSpecName: "kube-api-access-xgtz7") pod "7206f12e-0ba0-4233-af37-e3563ac032c1" (UID: "7206f12e-0ba0-4233-af37-e3563ac032c1"). InnerVolumeSpecName "kube-api-access-xgtz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.287061 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7206f12e-0ba0-4233-af37-e3563ac032c1-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.287090 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgtz7\" (UniqueName: \"kubernetes.io/projected/7206f12e-0ba0-4233-af37-e3563ac032c1-kube-api-access-xgtz7\") on node \"crc\" DevicePath \"\"" Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.302301 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7206f12e-0ba0-4233-af37-e3563ac032c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7206f12e-0ba0-4233-af37-e3563ac032c1" (UID: "7206f12e-0ba0-4233-af37-e3563ac032c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.389537 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7206f12e-0ba0-4233-af37-e3563ac032c1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.755948 4841 generic.go:334] "Generic (PLEG): container finished" podID="7206f12e-0ba0-4233-af37-e3563ac032c1" containerID="a5d47cfb71f9fc3783443fc09ea5efebb28e7573fed1d14edf52329080cd0ea6" exitCode=0 Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.757235 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwq78" event={"ID":"7206f12e-0ba0-4233-af37-e3563ac032c1","Type":"ContainerDied","Data":"a5d47cfb71f9fc3783443fc09ea5efebb28e7573fed1d14edf52329080cd0ea6"} Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.757397 4841 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nwq78" Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.757442 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwq78" event={"ID":"7206f12e-0ba0-4233-af37-e3563ac032c1","Type":"ContainerDied","Data":"fda60bba6fde943e0cb0bd84fd0ad1c828901145b4b1c587f1a4cd605860f2aa"} Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.757470 4841 scope.go:117] "RemoveContainer" containerID="a5d47cfb71f9fc3783443fc09ea5efebb28e7573fed1d14edf52329080cd0ea6" Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.778193 4841 scope.go:117] "RemoveContainer" containerID="8b77b1a88a0f5c1f6bfa0bbdca78f6ed828bd04e96d9cadfff9fc2d4f9781191" Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.794586 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nwq78"] Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.803444 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nwq78"] Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.818706 4841 scope.go:117] "RemoveContainer" containerID="84128162f225fea4f7bdac1c87a885739462338ab0bc1e39c20dea641adafd29" Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.850654 4841 scope.go:117] "RemoveContainer" containerID="a5d47cfb71f9fc3783443fc09ea5efebb28e7573fed1d14edf52329080cd0ea6" Mar 13 09:46:58 crc kubenswrapper[4841]: E0313 09:46:58.851100 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5d47cfb71f9fc3783443fc09ea5efebb28e7573fed1d14edf52329080cd0ea6\": container with ID starting with a5d47cfb71f9fc3783443fc09ea5efebb28e7573fed1d14edf52329080cd0ea6 not found: ID does not exist" containerID="a5d47cfb71f9fc3783443fc09ea5efebb28e7573fed1d14edf52329080cd0ea6" Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.851131 4841 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5d47cfb71f9fc3783443fc09ea5efebb28e7573fed1d14edf52329080cd0ea6"} err="failed to get container status \"a5d47cfb71f9fc3783443fc09ea5efebb28e7573fed1d14edf52329080cd0ea6\": rpc error: code = NotFound desc = could not find container \"a5d47cfb71f9fc3783443fc09ea5efebb28e7573fed1d14edf52329080cd0ea6\": container with ID starting with a5d47cfb71f9fc3783443fc09ea5efebb28e7573fed1d14edf52329080cd0ea6 not found: ID does not exist" Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.851155 4841 scope.go:117] "RemoveContainer" containerID="8b77b1a88a0f5c1f6bfa0bbdca78f6ed828bd04e96d9cadfff9fc2d4f9781191" Mar 13 09:46:58 crc kubenswrapper[4841]: E0313 09:46:58.851525 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b77b1a88a0f5c1f6bfa0bbdca78f6ed828bd04e96d9cadfff9fc2d4f9781191\": container with ID starting with 8b77b1a88a0f5c1f6bfa0bbdca78f6ed828bd04e96d9cadfff9fc2d4f9781191 not found: ID does not exist" containerID="8b77b1a88a0f5c1f6bfa0bbdca78f6ed828bd04e96d9cadfff9fc2d4f9781191" Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.851564 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b77b1a88a0f5c1f6bfa0bbdca78f6ed828bd04e96d9cadfff9fc2d4f9781191"} err="failed to get container status \"8b77b1a88a0f5c1f6bfa0bbdca78f6ed828bd04e96d9cadfff9fc2d4f9781191\": rpc error: code = NotFound desc = could not find container \"8b77b1a88a0f5c1f6bfa0bbdca78f6ed828bd04e96d9cadfff9fc2d4f9781191\": container with ID starting with 8b77b1a88a0f5c1f6bfa0bbdca78f6ed828bd04e96d9cadfff9fc2d4f9781191 not found: ID does not exist" Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.851589 4841 scope.go:117] "RemoveContainer" containerID="84128162f225fea4f7bdac1c87a885739462338ab0bc1e39c20dea641adafd29" Mar 13 09:46:58 crc kubenswrapper[4841]: E0313 
09:46:58.851897 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84128162f225fea4f7bdac1c87a885739462338ab0bc1e39c20dea641adafd29\": container with ID starting with 84128162f225fea4f7bdac1c87a885739462338ab0bc1e39c20dea641adafd29 not found: ID does not exist" containerID="84128162f225fea4f7bdac1c87a885739462338ab0bc1e39c20dea641adafd29" Mar 13 09:46:58 crc kubenswrapper[4841]: I0313 09:46:58.851941 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84128162f225fea4f7bdac1c87a885739462338ab0bc1e39c20dea641adafd29"} err="failed to get container status \"84128162f225fea4f7bdac1c87a885739462338ab0bc1e39c20dea641adafd29\": rpc error: code = NotFound desc = could not find container \"84128162f225fea4f7bdac1c87a885739462338ab0bc1e39c20dea641adafd29\": container with ID starting with 84128162f225fea4f7bdac1c87a885739462338ab0bc1e39c20dea641adafd29 not found: ID does not exist" Mar 13 09:47:00 crc kubenswrapper[4841]: I0313 09:47:00.004637 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7206f12e-0ba0-4233-af37-e3563ac032c1" path="/var/lib/kubelet/pods/7206f12e-0ba0-4233-af37-e3563ac032c1/volumes" Mar 13 09:47:04 crc kubenswrapper[4841]: I0313 09:47:04.204352 4841 scope.go:117] "RemoveContainer" containerID="716229f6ba66e4a5554ec47fc240cc1b304fa03725dc768f4f939f1150ab0220" Mar 13 09:47:04 crc kubenswrapper[4841]: I0313 09:47:04.407688 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:47:04 crc kubenswrapper[4841]: I0313 09:47:04.407986 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" 
podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:47:04 crc kubenswrapper[4841]: I0313 09:47:04.408027 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:47:04 crc kubenswrapper[4841]: I0313 09:47:04.408546 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fcd17e9cebe8ee5c7b1749619cc08d15d1f1b747829783b5902c03af4c6172bc"} pod="openshift-machine-config-operator/machine-config-daemon-h227v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 09:47:04 crc kubenswrapper[4841]: I0313 09:47:04.408603 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" containerID="cri-o://fcd17e9cebe8ee5c7b1749619cc08d15d1f1b747829783b5902c03af4c6172bc" gracePeriod=600 Mar 13 09:47:04 crc kubenswrapper[4841]: I0313 09:47:04.807230 4841 generic.go:334] "Generic (PLEG): container finished" podID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerID="fcd17e9cebe8ee5c7b1749619cc08d15d1f1b747829783b5902c03af4c6172bc" exitCode=0 Mar 13 09:47:04 crc kubenswrapper[4841]: I0313 09:47:04.807304 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerDied","Data":"fcd17e9cebe8ee5c7b1749619cc08d15d1f1b747829783b5902c03af4c6172bc"} Mar 13 09:47:04 crc kubenswrapper[4841]: I0313 09:47:04.807606 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerStarted","Data":"c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5"} Mar 13 09:47:04 crc kubenswrapper[4841]: I0313 09:47:04.807630 4841 scope.go:117] "RemoveContainer" containerID="61732025a614a870a436defea3b35dec639fb713afd854bb35f151a73c5cfbab" Mar 13 09:47:04 crc kubenswrapper[4841]: I0313 09:47:04.899215 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-szx4f" Mar 13 09:47:04 crc kubenswrapper[4841]: I0313 09:47:04.899345 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-szx4f" Mar 13 09:47:04 crc kubenswrapper[4841]: I0313 09:47:04.943755 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-szx4f" Mar 13 09:47:05 crc kubenswrapper[4841]: I0313 09:47:05.858365 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-szx4f" Mar 13 09:47:05 crc kubenswrapper[4841]: I0313 09:47:05.924580 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-szx4f"] Mar 13 09:47:05 crc kubenswrapper[4841]: I0313 09:47:05.973899 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d6wpz"] Mar 13 09:47:05 crc kubenswrapper[4841]: I0313 09:47:05.974136 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d6wpz" podUID="188a8141-ccd6-48e9-a3f0-4546fba14c1c" containerName="registry-server" containerID="cri-o://b983b40cafb18d7eaa6ac163273313ba550e3197abaca5c7cc7f14ced5504c7a" gracePeriod=2 Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.415606 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d6wpz" Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.540737 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/188a8141-ccd6-48e9-a3f0-4546fba14c1c-utilities\") pod \"188a8141-ccd6-48e9-a3f0-4546fba14c1c\" (UID: \"188a8141-ccd6-48e9-a3f0-4546fba14c1c\") " Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.540792 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59g5d\" (UniqueName: \"kubernetes.io/projected/188a8141-ccd6-48e9-a3f0-4546fba14c1c-kube-api-access-59g5d\") pod \"188a8141-ccd6-48e9-a3f0-4546fba14c1c\" (UID: \"188a8141-ccd6-48e9-a3f0-4546fba14c1c\") " Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.540875 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/188a8141-ccd6-48e9-a3f0-4546fba14c1c-catalog-content\") pod \"188a8141-ccd6-48e9-a3f0-4546fba14c1c\" (UID: \"188a8141-ccd6-48e9-a3f0-4546fba14c1c\") " Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.543668 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/188a8141-ccd6-48e9-a3f0-4546fba14c1c-utilities" (OuterVolumeSpecName: "utilities") pod "188a8141-ccd6-48e9-a3f0-4546fba14c1c" (UID: "188a8141-ccd6-48e9-a3f0-4546fba14c1c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.557024 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/188a8141-ccd6-48e9-a3f0-4546fba14c1c-kube-api-access-59g5d" (OuterVolumeSpecName: "kube-api-access-59g5d") pod "188a8141-ccd6-48e9-a3f0-4546fba14c1c" (UID: "188a8141-ccd6-48e9-a3f0-4546fba14c1c"). InnerVolumeSpecName "kube-api-access-59g5d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.629691 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/188a8141-ccd6-48e9-a3f0-4546fba14c1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "188a8141-ccd6-48e9-a3f0-4546fba14c1c" (UID: "188a8141-ccd6-48e9-a3f0-4546fba14c1c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.643335 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/188a8141-ccd6-48e9-a3f0-4546fba14c1c-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.643370 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59g5d\" (UniqueName: \"kubernetes.io/projected/188a8141-ccd6-48e9-a3f0-4546fba14c1c-kube-api-access-59g5d\") on node \"crc\" DevicePath \"\"" Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.643409 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/188a8141-ccd6-48e9-a3f0-4546fba14c1c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.829023 4841 generic.go:334] "Generic (PLEG): container finished" podID="188a8141-ccd6-48e9-a3f0-4546fba14c1c" containerID="b983b40cafb18d7eaa6ac163273313ba550e3197abaca5c7cc7f14ced5504c7a" exitCode=0 Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.829079 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d6wpz" event={"ID":"188a8141-ccd6-48e9-a3f0-4546fba14c1c","Type":"ContainerDied","Data":"b983b40cafb18d7eaa6ac163273313ba550e3197abaca5c7cc7f14ced5504c7a"} Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.829531 4841 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-d6wpz" event={"ID":"188a8141-ccd6-48e9-a3f0-4546fba14c1c","Type":"ContainerDied","Data":"f600578a22c7cde8db21f2700f565432a02aee61b47bb1ee06859d4e7ea390e3"} Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.829559 4841 scope.go:117] "RemoveContainer" containerID="b983b40cafb18d7eaa6ac163273313ba550e3197abaca5c7cc7f14ced5504c7a" Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.829120 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d6wpz" Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.872200 4841 scope.go:117] "RemoveContainer" containerID="c26bf592c2c3bff93d5095c72eae61faad2163c12dcf531067caf839b9b0a253" Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.880888 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d6wpz"] Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.907608 4841 scope.go:117] "RemoveContainer" containerID="11619ecb1a182e5acc3fb7f30a62ae71a448091c192925113e3a3deb2842899f" Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.913187 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d6wpz"] Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.974963 4841 scope.go:117] "RemoveContainer" containerID="b983b40cafb18d7eaa6ac163273313ba550e3197abaca5c7cc7f14ced5504c7a" Mar 13 09:47:06 crc kubenswrapper[4841]: E0313 09:47:06.978393 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b983b40cafb18d7eaa6ac163273313ba550e3197abaca5c7cc7f14ced5504c7a\": container with ID starting with b983b40cafb18d7eaa6ac163273313ba550e3197abaca5c7cc7f14ced5504c7a not found: ID does not exist" containerID="b983b40cafb18d7eaa6ac163273313ba550e3197abaca5c7cc7f14ced5504c7a" Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 
09:47:06.978559 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b983b40cafb18d7eaa6ac163273313ba550e3197abaca5c7cc7f14ced5504c7a"} err="failed to get container status \"b983b40cafb18d7eaa6ac163273313ba550e3197abaca5c7cc7f14ced5504c7a\": rpc error: code = NotFound desc = could not find container \"b983b40cafb18d7eaa6ac163273313ba550e3197abaca5c7cc7f14ced5504c7a\": container with ID starting with b983b40cafb18d7eaa6ac163273313ba550e3197abaca5c7cc7f14ced5504c7a not found: ID does not exist" Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.978696 4841 scope.go:117] "RemoveContainer" containerID="c26bf592c2c3bff93d5095c72eae61faad2163c12dcf531067caf839b9b0a253" Mar 13 09:47:06 crc kubenswrapper[4841]: E0313 09:47:06.979076 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c26bf592c2c3bff93d5095c72eae61faad2163c12dcf531067caf839b9b0a253\": container with ID starting with c26bf592c2c3bff93d5095c72eae61faad2163c12dcf531067caf839b9b0a253 not found: ID does not exist" containerID="c26bf592c2c3bff93d5095c72eae61faad2163c12dcf531067caf839b9b0a253" Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.979153 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c26bf592c2c3bff93d5095c72eae61faad2163c12dcf531067caf839b9b0a253"} err="failed to get container status \"c26bf592c2c3bff93d5095c72eae61faad2163c12dcf531067caf839b9b0a253\": rpc error: code = NotFound desc = could not find container \"c26bf592c2c3bff93d5095c72eae61faad2163c12dcf531067caf839b9b0a253\": container with ID starting with c26bf592c2c3bff93d5095c72eae61faad2163c12dcf531067caf839b9b0a253 not found: ID does not exist" Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.979226 4841 scope.go:117] "RemoveContainer" containerID="11619ecb1a182e5acc3fb7f30a62ae71a448091c192925113e3a3deb2842899f" Mar 13 09:47:06 crc 
kubenswrapper[4841]: E0313 09:47:06.979607 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11619ecb1a182e5acc3fb7f30a62ae71a448091c192925113e3a3deb2842899f\": container with ID starting with 11619ecb1a182e5acc3fb7f30a62ae71a448091c192925113e3a3deb2842899f not found: ID does not exist" containerID="11619ecb1a182e5acc3fb7f30a62ae71a448091c192925113e3a3deb2842899f" Mar 13 09:47:06 crc kubenswrapper[4841]: I0313 09:47:06.979692 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11619ecb1a182e5acc3fb7f30a62ae71a448091c192925113e3a3deb2842899f"} err="failed to get container status \"11619ecb1a182e5acc3fb7f30a62ae71a448091c192925113e3a3deb2842899f\": rpc error: code = NotFound desc = could not find container \"11619ecb1a182e5acc3fb7f30a62ae71a448091c192925113e3a3deb2842899f\": container with ID starting with 11619ecb1a182e5acc3fb7f30a62ae71a448091c192925113e3a3deb2842899f not found: ID does not exist" Mar 13 09:47:07 crc kubenswrapper[4841]: E0313 09:47:07.126590 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod188a8141_ccd6_48e9_a3f0_4546fba14c1c.slice/crio-f600578a22c7cde8db21f2700f565432a02aee61b47bb1ee06859d4e7ea390e3\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod188a8141_ccd6_48e9_a3f0_4546fba14c1c.slice\": RecentStats: unable to find data in memory cache]" Mar 13 09:47:08 crc kubenswrapper[4841]: I0313 09:47:08.007886 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="188a8141-ccd6-48e9-a3f0-4546fba14c1c" path="/var/lib/kubelet/pods/188a8141-ccd6-48e9-a3f0-4546fba14c1c/volumes" Mar 13 09:47:12 crc kubenswrapper[4841]: I0313 09:47:12.400291 4841 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-tc8v9"] Mar 13 09:47:12 crc kubenswrapper[4841]: E0313 09:47:12.402339 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="188a8141-ccd6-48e9-a3f0-4546fba14c1c" containerName="registry-server" Mar 13 09:47:12 crc kubenswrapper[4841]: I0313 09:47:12.402446 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="188a8141-ccd6-48e9-a3f0-4546fba14c1c" containerName="registry-server" Mar 13 09:47:12 crc kubenswrapper[4841]: E0313 09:47:12.402529 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="188a8141-ccd6-48e9-a3f0-4546fba14c1c" containerName="extract-content" Mar 13 09:47:12 crc kubenswrapper[4841]: I0313 09:47:12.402604 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="188a8141-ccd6-48e9-a3f0-4546fba14c1c" containerName="extract-content" Mar 13 09:47:12 crc kubenswrapper[4841]: E0313 09:47:12.402689 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7206f12e-0ba0-4233-af37-e3563ac032c1" containerName="extract-utilities" Mar 13 09:47:12 crc kubenswrapper[4841]: I0313 09:47:12.402772 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7206f12e-0ba0-4233-af37-e3563ac032c1" containerName="extract-utilities" Mar 13 09:47:12 crc kubenswrapper[4841]: E0313 09:47:12.402862 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="188a8141-ccd6-48e9-a3f0-4546fba14c1c" containerName="extract-utilities" Mar 13 09:47:12 crc kubenswrapper[4841]: I0313 09:47:12.402943 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="188a8141-ccd6-48e9-a3f0-4546fba14c1c" containerName="extract-utilities" Mar 13 09:47:12 crc kubenswrapper[4841]: E0313 09:47:12.403058 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7206f12e-0ba0-4233-af37-e3563ac032c1" containerName="registry-server" Mar 13 09:47:12 crc kubenswrapper[4841]: I0313 09:47:12.403139 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7206f12e-0ba0-4233-af37-e3563ac032c1" containerName="registry-server" Mar 13 09:47:12 crc kubenswrapper[4841]: E0313 09:47:12.403221 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7206f12e-0ba0-4233-af37-e3563ac032c1" containerName="extract-content" Mar 13 09:47:12 crc kubenswrapper[4841]: I0313 09:47:12.403327 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7206f12e-0ba0-4233-af37-e3563ac032c1" containerName="extract-content" Mar 13 09:47:12 crc kubenswrapper[4841]: I0313 09:47:12.403640 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="188a8141-ccd6-48e9-a3f0-4546fba14c1c" containerName="registry-server" Mar 13 09:47:12 crc kubenswrapper[4841]: I0313 09:47:12.403767 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7206f12e-0ba0-4233-af37-e3563ac032c1" containerName="registry-server" Mar 13 09:47:12 crc kubenswrapper[4841]: I0313 09:47:12.405758 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tc8v9" Mar 13 09:47:12 crc kubenswrapper[4841]: I0313 09:47:12.430623 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tc8v9"] Mar 13 09:47:12 crc kubenswrapper[4841]: I0313 09:47:12.477122 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece-utilities\") pod \"certified-operators-tc8v9\" (UID: \"7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece\") " pod="openshift-marketplace/certified-operators-tc8v9" Mar 13 09:47:12 crc kubenswrapper[4841]: I0313 09:47:12.477323 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvl2w\" (UniqueName: \"kubernetes.io/projected/7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece-kube-api-access-cvl2w\") pod \"certified-operators-tc8v9\" (UID: 
\"7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece\") " pod="openshift-marketplace/certified-operators-tc8v9" Mar 13 09:47:12 crc kubenswrapper[4841]: I0313 09:47:12.477387 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece-catalog-content\") pod \"certified-operators-tc8v9\" (UID: \"7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece\") " pod="openshift-marketplace/certified-operators-tc8v9" Mar 13 09:47:12 crc kubenswrapper[4841]: I0313 09:47:12.579054 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece-utilities\") pod \"certified-operators-tc8v9\" (UID: \"7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece\") " pod="openshift-marketplace/certified-operators-tc8v9" Mar 13 09:47:12 crc kubenswrapper[4841]: I0313 09:47:12.579134 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvl2w\" (UniqueName: \"kubernetes.io/projected/7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece-kube-api-access-cvl2w\") pod \"certified-operators-tc8v9\" (UID: \"7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece\") " pod="openshift-marketplace/certified-operators-tc8v9" Mar 13 09:47:12 crc kubenswrapper[4841]: I0313 09:47:12.579163 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece-catalog-content\") pod \"certified-operators-tc8v9\" (UID: \"7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece\") " pod="openshift-marketplace/certified-operators-tc8v9" Mar 13 09:47:12 crc kubenswrapper[4841]: I0313 09:47:12.579616 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece-catalog-content\") pod \"certified-operators-tc8v9\" (UID: 
\"7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece\") " pod="openshift-marketplace/certified-operators-tc8v9" Mar 13 09:47:12 crc kubenswrapper[4841]: I0313 09:47:12.579695 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece-utilities\") pod \"certified-operators-tc8v9\" (UID: \"7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece\") " pod="openshift-marketplace/certified-operators-tc8v9" Mar 13 09:47:12 crc kubenswrapper[4841]: I0313 09:47:12.608388 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvl2w\" (UniqueName: \"kubernetes.io/projected/7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece-kube-api-access-cvl2w\") pod \"certified-operators-tc8v9\" (UID: \"7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece\") " pod="openshift-marketplace/certified-operators-tc8v9" Mar 13 09:47:12 crc kubenswrapper[4841]: I0313 09:47:12.728783 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tc8v9" Mar 13 09:47:13 crc kubenswrapper[4841]: I0313 09:47:13.249908 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tc8v9"] Mar 13 09:47:13 crc kubenswrapper[4841]: W0313 09:47:13.264779 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e8f1710_c219_4f3a_8e5e_d50ba2ec1ece.slice/crio-11841efbe0e843fe91620f629d0b36ec8ff812f9e7c2e11dd293c18d00a72509 WatchSource:0}: Error finding container 11841efbe0e843fe91620f629d0b36ec8ff812f9e7c2e11dd293c18d00a72509: Status 404 returned error can't find the container with id 11841efbe0e843fe91620f629d0b36ec8ff812f9e7c2e11dd293c18d00a72509 Mar 13 09:47:13 crc kubenswrapper[4841]: I0313 09:47:13.902684 4841 generic.go:334] "Generic (PLEG): container finished" podID="7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece" 
containerID="bb656b5ad59fa7a84891dd681a1449b3fb68b272f24a77d6377db05b4a459d8e" exitCode=0 Mar 13 09:47:13 crc kubenswrapper[4841]: I0313 09:47:13.902837 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tc8v9" event={"ID":"7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece","Type":"ContainerDied","Data":"bb656b5ad59fa7a84891dd681a1449b3fb68b272f24a77d6377db05b4a459d8e"} Mar 13 09:47:13 crc kubenswrapper[4841]: I0313 09:47:13.903335 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tc8v9" event={"ID":"7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece","Type":"ContainerStarted","Data":"11841efbe0e843fe91620f629d0b36ec8ff812f9e7c2e11dd293c18d00a72509"} Mar 13 09:47:15 crc kubenswrapper[4841]: I0313 09:47:15.939680 4841 generic.go:334] "Generic (PLEG): container finished" podID="7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece" containerID="56e6f9f290a1ffffdc5fbcccb8c15d8d5e63bd0940a7c6c7fe6949fc7c71c26f" exitCode=0 Mar 13 09:47:15 crc kubenswrapper[4841]: I0313 09:47:15.939908 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tc8v9" event={"ID":"7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece","Type":"ContainerDied","Data":"56e6f9f290a1ffffdc5fbcccb8c15d8d5e63bd0940a7c6c7fe6949fc7c71c26f"} Mar 13 09:47:16 crc kubenswrapper[4841]: I0313 09:47:16.950486 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tc8v9" event={"ID":"7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece","Type":"ContainerStarted","Data":"b01e462ee9853141a85abcccae939f2ef50101532d8be4baa0d72596349a7ded"} Mar 13 09:47:16 crc kubenswrapper[4841]: I0313 09:47:16.975392 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tc8v9" podStartSLOduration=2.477354795 podStartE2EDuration="4.975368825s" podCreationTimestamp="2026-03-13 09:47:12 +0000 UTC" firstStartedPulling="2026-03-13 09:47:13.904391942 
+0000 UTC m=+2116.634292143" lastFinishedPulling="2026-03-13 09:47:16.402405992 +0000 UTC m=+2119.132306173" observedRunningTime="2026-03-13 09:47:16.971782523 +0000 UTC m=+2119.701682724" watchObservedRunningTime="2026-03-13 09:47:16.975368825 +0000 UTC m=+2119.705269026" Mar 13 09:47:22 crc kubenswrapper[4841]: I0313 09:47:22.729312 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tc8v9" Mar 13 09:47:22 crc kubenswrapper[4841]: I0313 09:47:22.730286 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tc8v9" Mar 13 09:47:22 crc kubenswrapper[4841]: I0313 09:47:22.781304 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tc8v9" Mar 13 09:47:23 crc kubenswrapper[4841]: I0313 09:47:23.053407 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tc8v9" Mar 13 09:47:23 crc kubenswrapper[4841]: I0313 09:47:23.111727 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tc8v9"] Mar 13 09:47:25 crc kubenswrapper[4841]: I0313 09:47:25.046400 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tc8v9" podUID="7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece" containerName="registry-server" containerID="cri-o://b01e462ee9853141a85abcccae939f2ef50101532d8be4baa0d72596349a7ded" gracePeriod=2 Mar 13 09:47:25 crc kubenswrapper[4841]: I0313 09:47:25.501401 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tc8v9" Mar 13 09:47:25 crc kubenswrapper[4841]: I0313 09:47:25.667066 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece-utilities\") pod \"7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece\" (UID: \"7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece\") " Mar 13 09:47:25 crc kubenswrapper[4841]: I0313 09:47:25.667641 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece-catalog-content\") pod \"7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece\" (UID: \"7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece\") " Mar 13 09:47:25 crc kubenswrapper[4841]: I0313 09:47:25.667757 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvl2w\" (UniqueName: \"kubernetes.io/projected/7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece-kube-api-access-cvl2w\") pod \"7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece\" (UID: \"7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece\") " Mar 13 09:47:25 crc kubenswrapper[4841]: I0313 09:47:25.668235 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece-utilities" (OuterVolumeSpecName: "utilities") pod "7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece" (UID: "7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:47:25 crc kubenswrapper[4841]: I0313 09:47:25.668513 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:47:25 crc kubenswrapper[4841]: I0313 09:47:25.675649 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece-kube-api-access-cvl2w" (OuterVolumeSpecName: "kube-api-access-cvl2w") pod "7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece" (UID: "7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece"). InnerVolumeSpecName "kube-api-access-cvl2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:47:25 crc kubenswrapper[4841]: I0313 09:47:25.747943 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece" (UID: "7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:47:25 crc kubenswrapper[4841]: I0313 09:47:25.770602 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:47:25 crc kubenswrapper[4841]: I0313 09:47:25.770660 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvl2w\" (UniqueName: \"kubernetes.io/projected/7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece-kube-api-access-cvl2w\") on node \"crc\" DevicePath \"\"" Mar 13 09:47:26 crc kubenswrapper[4841]: I0313 09:47:26.059940 4841 generic.go:334] "Generic (PLEG): container finished" podID="7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece" containerID="b01e462ee9853141a85abcccae939f2ef50101532d8be4baa0d72596349a7ded" exitCode=0 Mar 13 09:47:26 crc kubenswrapper[4841]: I0313 09:47:26.060010 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tc8v9" event={"ID":"7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece","Type":"ContainerDied","Data":"b01e462ee9853141a85abcccae939f2ef50101532d8be4baa0d72596349a7ded"} Mar 13 09:47:26 crc kubenswrapper[4841]: I0313 09:47:26.060024 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tc8v9" Mar 13 09:47:26 crc kubenswrapper[4841]: I0313 09:47:26.060049 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tc8v9" event={"ID":"7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece","Type":"ContainerDied","Data":"11841efbe0e843fe91620f629d0b36ec8ff812f9e7c2e11dd293c18d00a72509"} Mar 13 09:47:26 crc kubenswrapper[4841]: I0313 09:47:26.060071 4841 scope.go:117] "RemoveContainer" containerID="b01e462ee9853141a85abcccae939f2ef50101532d8be4baa0d72596349a7ded" Mar 13 09:47:26 crc kubenswrapper[4841]: I0313 09:47:26.093662 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tc8v9"] Mar 13 09:47:26 crc kubenswrapper[4841]: I0313 09:47:26.098014 4841 scope.go:117] "RemoveContainer" containerID="56e6f9f290a1ffffdc5fbcccb8c15d8d5e63bd0940a7c6c7fe6949fc7c71c26f" Mar 13 09:47:26 crc kubenswrapper[4841]: I0313 09:47:26.107152 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tc8v9"] Mar 13 09:47:26 crc kubenswrapper[4841]: I0313 09:47:26.128951 4841 scope.go:117] "RemoveContainer" containerID="bb656b5ad59fa7a84891dd681a1449b3fb68b272f24a77d6377db05b4a459d8e" Mar 13 09:47:26 crc kubenswrapper[4841]: I0313 09:47:26.201582 4841 scope.go:117] "RemoveContainer" containerID="b01e462ee9853141a85abcccae939f2ef50101532d8be4baa0d72596349a7ded" Mar 13 09:47:26 crc kubenswrapper[4841]: E0313 09:47:26.202828 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b01e462ee9853141a85abcccae939f2ef50101532d8be4baa0d72596349a7ded\": container with ID starting with b01e462ee9853141a85abcccae939f2ef50101532d8be4baa0d72596349a7ded not found: ID does not exist" containerID="b01e462ee9853141a85abcccae939f2ef50101532d8be4baa0d72596349a7ded" Mar 13 09:47:26 crc kubenswrapper[4841]: I0313 09:47:26.202872 4841 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b01e462ee9853141a85abcccae939f2ef50101532d8be4baa0d72596349a7ded"} err="failed to get container status \"b01e462ee9853141a85abcccae939f2ef50101532d8be4baa0d72596349a7ded\": rpc error: code = NotFound desc = could not find container \"b01e462ee9853141a85abcccae939f2ef50101532d8be4baa0d72596349a7ded\": container with ID starting with b01e462ee9853141a85abcccae939f2ef50101532d8be4baa0d72596349a7ded not found: ID does not exist" Mar 13 09:47:26 crc kubenswrapper[4841]: I0313 09:47:26.202899 4841 scope.go:117] "RemoveContainer" containerID="56e6f9f290a1ffffdc5fbcccb8c15d8d5e63bd0940a7c6c7fe6949fc7c71c26f" Mar 13 09:47:26 crc kubenswrapper[4841]: E0313 09:47:26.204059 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56e6f9f290a1ffffdc5fbcccb8c15d8d5e63bd0940a7c6c7fe6949fc7c71c26f\": container with ID starting with 56e6f9f290a1ffffdc5fbcccb8c15d8d5e63bd0940a7c6c7fe6949fc7c71c26f not found: ID does not exist" containerID="56e6f9f290a1ffffdc5fbcccb8c15d8d5e63bd0940a7c6c7fe6949fc7c71c26f" Mar 13 09:47:26 crc kubenswrapper[4841]: I0313 09:47:26.204136 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56e6f9f290a1ffffdc5fbcccb8c15d8d5e63bd0940a7c6c7fe6949fc7c71c26f"} err="failed to get container status \"56e6f9f290a1ffffdc5fbcccb8c15d8d5e63bd0940a7c6c7fe6949fc7c71c26f\": rpc error: code = NotFound desc = could not find container \"56e6f9f290a1ffffdc5fbcccb8c15d8d5e63bd0940a7c6c7fe6949fc7c71c26f\": container with ID starting with 56e6f9f290a1ffffdc5fbcccb8c15d8d5e63bd0940a7c6c7fe6949fc7c71c26f not found: ID does not exist" Mar 13 09:47:26 crc kubenswrapper[4841]: I0313 09:47:26.204178 4841 scope.go:117] "RemoveContainer" containerID="bb656b5ad59fa7a84891dd681a1449b3fb68b272f24a77d6377db05b4a459d8e" Mar 13 09:47:26 crc kubenswrapper[4841]: E0313 
09:47:26.204856 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb656b5ad59fa7a84891dd681a1449b3fb68b272f24a77d6377db05b4a459d8e\": container with ID starting with bb656b5ad59fa7a84891dd681a1449b3fb68b272f24a77d6377db05b4a459d8e not found: ID does not exist" containerID="bb656b5ad59fa7a84891dd681a1449b3fb68b272f24a77d6377db05b4a459d8e" Mar 13 09:47:26 crc kubenswrapper[4841]: I0313 09:47:26.204903 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb656b5ad59fa7a84891dd681a1449b3fb68b272f24a77d6377db05b4a459d8e"} err="failed to get container status \"bb656b5ad59fa7a84891dd681a1449b3fb68b272f24a77d6377db05b4a459d8e\": rpc error: code = NotFound desc = could not find container \"bb656b5ad59fa7a84891dd681a1449b3fb68b272f24a77d6377db05b4a459d8e\": container with ID starting with bb656b5ad59fa7a84891dd681a1449b3fb68b272f24a77d6377db05b4a459d8e not found: ID does not exist" Mar 13 09:47:27 crc kubenswrapper[4841]: E0313 09:47:27.635786 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc97298b_a706_488a_9ea1_e90de447c754.slice/crio-conmon-adf0e6f0c0f3e2504209cc3cbbf4d410eaf6540d43f9aa29ebbd6d2e889b1426.scope\": RecentStats: unable to find data in memory cache]" Mar 13 09:47:28 crc kubenswrapper[4841]: I0313 09:47:28.040629 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece" path="/var/lib/kubelet/pods/7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece/volumes" Mar 13 09:47:28 crc kubenswrapper[4841]: I0313 09:47:28.086697 4841 generic.go:334] "Generic (PLEG): container finished" podID="dc97298b-a706-488a-9ea1-e90de447c754" containerID="adf0e6f0c0f3e2504209cc3cbbf4d410eaf6540d43f9aa29ebbd6d2e889b1426" exitCode=0 Mar 13 09:47:28 crc kubenswrapper[4841]: I0313 
09:47:28.086782 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" event={"ID":"dc97298b-a706-488a-9ea1-e90de447c754","Type":"ContainerDied","Data":"adf0e6f0c0f3e2504209cc3cbbf4d410eaf6540d43f9aa29ebbd6d2e889b1426"} Mar 13 09:47:29 crc kubenswrapper[4841]: I0313 09:47:29.578749 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" Mar 13 09:47:29 crc kubenswrapper[4841]: I0313 09:47:29.766410 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-neutron-metadata-combined-ca-bundle\") pod \"dc97298b-a706-488a-9ea1-e90de447c754\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " Mar 13 09:47:29 crc kubenswrapper[4841]: I0313 09:47:29.766663 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-nova-metadata-neutron-config-0\") pod \"dc97298b-a706-488a-9ea1-e90de447c754\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " Mar 13 09:47:29 crc kubenswrapper[4841]: I0313 09:47:29.766766 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-ssh-key-openstack-edpm-ipam\") pod \"dc97298b-a706-488a-9ea1-e90de447c754\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " Mar 13 09:47:29 crc kubenswrapper[4841]: I0313 09:47:29.766821 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"dc97298b-a706-488a-9ea1-e90de447c754\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " Mar 13 09:47:29 crc kubenswrapper[4841]: I0313 09:47:29.766850 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-inventory\") pod \"dc97298b-a706-488a-9ea1-e90de447c754\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " Mar 13 09:47:29 crc kubenswrapper[4841]: I0313 09:47:29.766909 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bjv7\" (UniqueName: \"kubernetes.io/projected/dc97298b-a706-488a-9ea1-e90de447c754-kube-api-access-9bjv7\") pod \"dc97298b-a706-488a-9ea1-e90de447c754\" (UID: \"dc97298b-a706-488a-9ea1-e90de447c754\") " Mar 13 09:47:29 crc kubenswrapper[4841]: I0313 09:47:29.773933 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "dc97298b-a706-488a-9ea1-e90de447c754" (UID: "dc97298b-a706-488a-9ea1-e90de447c754"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:47:29 crc kubenswrapper[4841]: I0313 09:47:29.774021 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc97298b-a706-488a-9ea1-e90de447c754-kube-api-access-9bjv7" (OuterVolumeSpecName: "kube-api-access-9bjv7") pod "dc97298b-a706-488a-9ea1-e90de447c754" (UID: "dc97298b-a706-488a-9ea1-e90de447c754"). InnerVolumeSpecName "kube-api-access-9bjv7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:47:29 crc kubenswrapper[4841]: I0313 09:47:29.798188 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "dc97298b-a706-488a-9ea1-e90de447c754" (UID: "dc97298b-a706-488a-9ea1-e90de447c754"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:47:29 crc kubenswrapper[4841]: I0313 09:47:29.800997 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "dc97298b-a706-488a-9ea1-e90de447c754" (UID: "dc97298b-a706-488a-9ea1-e90de447c754"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:47:29 crc kubenswrapper[4841]: I0313 09:47:29.804494 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "dc97298b-a706-488a-9ea1-e90de447c754" (UID: "dc97298b-a706-488a-9ea1-e90de447c754"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:47:29 crc kubenswrapper[4841]: I0313 09:47:29.811434 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-inventory" (OuterVolumeSpecName: "inventory") pod "dc97298b-a706-488a-9ea1-e90de447c754" (UID: "dc97298b-a706-488a-9ea1-e90de447c754"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:47:29 crc kubenswrapper[4841]: I0313 09:47:29.869468 4841 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:47:29 crc kubenswrapper[4841]: I0313 09:47:29.869544 4841 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:47:29 crc kubenswrapper[4841]: I0313 09:47:29.869560 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 09:47:29 crc kubenswrapper[4841]: I0313 09:47:29.869578 4841 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:47:29 crc kubenswrapper[4841]: I0313 09:47:29.869599 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc97298b-a706-488a-9ea1-e90de447c754-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 09:47:29 crc kubenswrapper[4841]: I0313 09:47:29.869613 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bjv7\" (UniqueName: \"kubernetes.io/projected/dc97298b-a706-488a-9ea1-e90de447c754-kube-api-access-9bjv7\") on node \"crc\" DevicePath \"\"" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.109175 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" event={"ID":"dc97298b-a706-488a-9ea1-e90de447c754","Type":"ContainerDied","Data":"b9b82f1861acbcc9a2652556b2af68d756b7bf4185f372ef7b7dd78f615ee3ca"} Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.109257 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.109938 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9b82f1861acbcc9a2652556b2af68d756b7bf4185f372ef7b7dd78f615ee3ca" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.207388 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq"] Mar 13 09:47:30 crc kubenswrapper[4841]: E0313 09:47:30.208034 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece" containerName="extract-utilities" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.208065 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece" containerName="extract-utilities" Mar 13 09:47:30 crc kubenswrapper[4841]: E0313 09:47:30.208080 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece" containerName="registry-server" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.208088 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece" containerName="registry-server" Mar 13 09:47:30 crc kubenswrapper[4841]: E0313 09:47:30.208104 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece" containerName="extract-content" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.208111 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece" containerName="extract-content" Mar 13 09:47:30 crc kubenswrapper[4841]: E0313 09:47:30.208128 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc97298b-a706-488a-9ea1-e90de447c754" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.208136 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc97298b-a706-488a-9ea1-e90de447c754" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.208403 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e8f1710-c219-4f3a-8e5e-d50ba2ec1ece" containerName="registry-server" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.208428 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc97298b-a706-488a-9ea1-e90de447c754" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.209378 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.212043 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.215028 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.215289 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.215407 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rv6jt" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.215513 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.227416 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq"] Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.280245 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq\" (UID: \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.280518 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq\" (UID: 
\"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.280601 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq\" (UID: \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.281023 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq\" (UID: \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.281088 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssmdc\" (UniqueName: \"kubernetes.io/projected/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-kube-api-access-ssmdc\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq\" (UID: \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.384373 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq\" (UID: \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.384778 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ssmdc\" (UniqueName: \"kubernetes.io/projected/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-kube-api-access-ssmdc\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq\" (UID: \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.384864 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq\" (UID: \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.384950 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq\" (UID: \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.384988 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq\" (UID: \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.389127 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-ssh-key-openstack-edpm-ipam\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq\" (UID: \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.389139 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq\" (UID: \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.389361 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq\" (UID: \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.396605 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq\" (UID: \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.416981 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssmdc\" (UniqueName: \"kubernetes.io/projected/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-kube-api-access-ssmdc\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq\" (UID: \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" Mar 13 09:47:30 crc kubenswrapper[4841]: I0313 09:47:30.529129 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" Mar 13 09:47:31 crc kubenswrapper[4841]: I0313 09:47:31.121075 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq"] Mar 13 09:47:32 crc kubenswrapper[4841]: I0313 09:47:32.137729 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" event={"ID":"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77","Type":"ContainerStarted","Data":"aa62a21cbccec80817b9367765ecaaa3e1be320cb368958f3437c7782855b3d1"} Mar 13 09:47:32 crc kubenswrapper[4841]: I0313 09:47:32.137819 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" event={"ID":"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77","Type":"ContainerStarted","Data":"6a72b391185eef11358abc1115f9de3e946c734f41fe2c2dd82e7197c95e1bd4"} Mar 13 09:47:32 crc kubenswrapper[4841]: I0313 09:47:32.164625 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" podStartSLOduration=1.681027819 podStartE2EDuration="2.164597617s" podCreationTimestamp="2026-03-13 09:47:30 +0000 UTC" firstStartedPulling="2026-03-13 09:47:31.133547922 +0000 UTC m=+2133.863448113" lastFinishedPulling="2026-03-13 09:47:31.61711772 +0000 UTC m=+2134.347017911" observedRunningTime="2026-03-13 09:47:32.155892476 +0000 UTC m=+2134.885792667" watchObservedRunningTime="2026-03-13 09:47:32.164597617 +0000 UTC m=+2134.894497808" Mar 13 09:48:00 crc kubenswrapper[4841]: I0313 09:48:00.147317 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556588-x9hdr"] Mar 13 09:48:00 crc kubenswrapper[4841]: I0313 09:48:00.149219 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556588-x9hdr" Mar 13 09:48:00 crc kubenswrapper[4841]: I0313 09:48:00.165536 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556588-x9hdr"] Mar 13 09:48:00 crc kubenswrapper[4841]: I0313 09:48:00.192507 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 09:48:00 crc kubenswrapper[4841]: I0313 09:48:00.192837 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 09:48:00 crc kubenswrapper[4841]: I0313 09:48:00.193076 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 09:48:00 crc kubenswrapper[4841]: I0313 09:48:00.312092 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrlch\" (UniqueName: \"kubernetes.io/projected/ffe08ae1-505d-46ee-934a-815f0b81fd7b-kube-api-access-lrlch\") pod \"auto-csr-approver-29556588-x9hdr\" (UID: \"ffe08ae1-505d-46ee-934a-815f0b81fd7b\") " pod="openshift-infra/auto-csr-approver-29556588-x9hdr" Mar 13 09:48:00 crc kubenswrapper[4841]: I0313 09:48:00.413589 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrlch\" (UniqueName: \"kubernetes.io/projected/ffe08ae1-505d-46ee-934a-815f0b81fd7b-kube-api-access-lrlch\") pod \"auto-csr-approver-29556588-x9hdr\" (UID: \"ffe08ae1-505d-46ee-934a-815f0b81fd7b\") " pod="openshift-infra/auto-csr-approver-29556588-x9hdr" Mar 13 09:48:00 crc kubenswrapper[4841]: I0313 09:48:00.431175 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrlch\" (UniqueName: \"kubernetes.io/projected/ffe08ae1-505d-46ee-934a-815f0b81fd7b-kube-api-access-lrlch\") pod \"auto-csr-approver-29556588-x9hdr\" (UID: \"ffe08ae1-505d-46ee-934a-815f0b81fd7b\") " 
pod="openshift-infra/auto-csr-approver-29556588-x9hdr" Mar 13 09:48:00 crc kubenswrapper[4841]: I0313 09:48:00.512206 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556588-x9hdr" Mar 13 09:48:00 crc kubenswrapper[4841]: I0313 09:48:00.960876 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556588-x9hdr"] Mar 13 09:48:01 crc kubenswrapper[4841]: I0313 09:48:01.412737 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556588-x9hdr" event={"ID":"ffe08ae1-505d-46ee-934a-815f0b81fd7b","Type":"ContainerStarted","Data":"cff149cca881d63ac926ad3e53fee0598b047ec63568534da3ee9de432a95bd0"} Mar 13 09:48:02 crc kubenswrapper[4841]: I0313 09:48:02.433760 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556588-x9hdr" event={"ID":"ffe08ae1-505d-46ee-934a-815f0b81fd7b","Type":"ContainerStarted","Data":"0333bc38be1b72015dc29b4c2dff0d5fdea6a77eb0a839107294b58b784d8fb0"} Mar 13 09:48:02 crc kubenswrapper[4841]: I0313 09:48:02.562867 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556588-x9hdr" podStartSLOduration=1.785137465 podStartE2EDuration="2.562844166s" podCreationTimestamp="2026-03-13 09:48:00 +0000 UTC" firstStartedPulling="2026-03-13 09:48:00.961854764 +0000 UTC m=+2163.691754965" lastFinishedPulling="2026-03-13 09:48:01.739561475 +0000 UTC m=+2164.469461666" observedRunningTime="2026-03-13 09:48:02.555720504 +0000 UTC m=+2165.285620695" watchObservedRunningTime="2026-03-13 09:48:02.562844166 +0000 UTC m=+2165.292744357" Mar 13 09:48:03 crc kubenswrapper[4841]: I0313 09:48:03.444805 4841 generic.go:334] "Generic (PLEG): container finished" podID="ffe08ae1-505d-46ee-934a-815f0b81fd7b" containerID="0333bc38be1b72015dc29b4c2dff0d5fdea6a77eb0a839107294b58b784d8fb0" exitCode=0 Mar 13 09:48:03 crc 
kubenswrapper[4841]: I0313 09:48:03.444848 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556588-x9hdr" event={"ID":"ffe08ae1-505d-46ee-934a-815f0b81fd7b","Type":"ContainerDied","Data":"0333bc38be1b72015dc29b4c2dff0d5fdea6a77eb0a839107294b58b784d8fb0"} Mar 13 09:48:04 crc kubenswrapper[4841]: I0313 09:48:04.786574 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556588-x9hdr" Mar 13 09:48:04 crc kubenswrapper[4841]: I0313 09:48:04.796397 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrlch\" (UniqueName: \"kubernetes.io/projected/ffe08ae1-505d-46ee-934a-815f0b81fd7b-kube-api-access-lrlch\") pod \"ffe08ae1-505d-46ee-934a-815f0b81fd7b\" (UID: \"ffe08ae1-505d-46ee-934a-815f0b81fd7b\") " Mar 13 09:48:04 crc kubenswrapper[4841]: I0313 09:48:04.806622 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffe08ae1-505d-46ee-934a-815f0b81fd7b-kube-api-access-lrlch" (OuterVolumeSpecName: "kube-api-access-lrlch") pod "ffe08ae1-505d-46ee-934a-815f0b81fd7b" (UID: "ffe08ae1-505d-46ee-934a-815f0b81fd7b"). InnerVolumeSpecName "kube-api-access-lrlch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:48:04 crc kubenswrapper[4841]: I0313 09:48:04.897655 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrlch\" (UniqueName: \"kubernetes.io/projected/ffe08ae1-505d-46ee-934a-815f0b81fd7b-kube-api-access-lrlch\") on node \"crc\" DevicePath \"\"" Mar 13 09:48:05 crc kubenswrapper[4841]: I0313 09:48:05.463762 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556588-x9hdr" event={"ID":"ffe08ae1-505d-46ee-934a-815f0b81fd7b","Type":"ContainerDied","Data":"cff149cca881d63ac926ad3e53fee0598b047ec63568534da3ee9de432a95bd0"} Mar 13 09:48:05 crc kubenswrapper[4841]: I0313 09:48:05.463806 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cff149cca881d63ac926ad3e53fee0598b047ec63568534da3ee9de432a95bd0" Mar 13 09:48:05 crc kubenswrapper[4841]: I0313 09:48:05.463853 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556588-x9hdr" Mar 13 09:48:05 crc kubenswrapper[4841]: I0313 09:48:05.853441 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556582-lpb6v"] Mar 13 09:48:05 crc kubenswrapper[4841]: I0313 09:48:05.862345 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556582-lpb6v"] Mar 13 09:48:06 crc kubenswrapper[4841]: I0313 09:48:06.005356 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="effcd983-b6c7-4ea1-8aae-7cb08d290ecd" path="/var/lib/kubelet/pods/effcd983-b6c7-4ea1-8aae-7cb08d290ecd/volumes" Mar 13 09:49:04 crc kubenswrapper[4841]: I0313 09:49:04.347880 4841 scope.go:117] "RemoveContainer" containerID="36b951536fffacc6df51e4d6d2cb76edf371f3835583124009f2b487f01abf74" Mar 13 09:49:04 crc kubenswrapper[4841]: I0313 09:49:04.407374 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:49:04 crc kubenswrapper[4841]: I0313 09:49:04.407470 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:49:34 crc kubenswrapper[4841]: I0313 09:49:34.407441 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:49:34 crc kubenswrapper[4841]: I0313 09:49:34.408033 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:50:00 crc kubenswrapper[4841]: I0313 09:50:00.158120 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556590-kkm25"] Mar 13 09:50:00 crc kubenswrapper[4841]: E0313 09:50:00.159078 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffe08ae1-505d-46ee-934a-815f0b81fd7b" containerName="oc" Mar 13 09:50:00 crc kubenswrapper[4841]: I0313 09:50:00.159092 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffe08ae1-505d-46ee-934a-815f0b81fd7b" containerName="oc" Mar 13 09:50:00 crc kubenswrapper[4841]: I0313 09:50:00.159319 4841 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="ffe08ae1-505d-46ee-934a-815f0b81fd7b" containerName="oc" Mar 13 09:50:00 crc kubenswrapper[4841]: I0313 09:50:00.159936 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556590-kkm25" Mar 13 09:50:00 crc kubenswrapper[4841]: I0313 09:50:00.166981 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 09:50:00 crc kubenswrapper[4841]: I0313 09:50:00.167837 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 09:50:00 crc kubenswrapper[4841]: I0313 09:50:00.168179 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556590-kkm25"] Mar 13 09:50:00 crc kubenswrapper[4841]: I0313 09:50:00.169831 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 09:50:00 crc kubenswrapper[4841]: I0313 09:50:00.328535 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cng2t\" (UniqueName: \"kubernetes.io/projected/21765abc-b94f-4650-8aaf-3a917c3f655d-kube-api-access-cng2t\") pod \"auto-csr-approver-29556590-kkm25\" (UID: \"21765abc-b94f-4650-8aaf-3a917c3f655d\") " pod="openshift-infra/auto-csr-approver-29556590-kkm25" Mar 13 09:50:00 crc kubenswrapper[4841]: I0313 09:50:00.431628 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cng2t\" (UniqueName: \"kubernetes.io/projected/21765abc-b94f-4650-8aaf-3a917c3f655d-kube-api-access-cng2t\") pod \"auto-csr-approver-29556590-kkm25\" (UID: \"21765abc-b94f-4650-8aaf-3a917c3f655d\") " pod="openshift-infra/auto-csr-approver-29556590-kkm25" Mar 13 09:50:00 crc kubenswrapper[4841]: I0313 09:50:00.453335 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cng2t\" (UniqueName: \"kubernetes.io/projected/21765abc-b94f-4650-8aaf-3a917c3f655d-kube-api-access-cng2t\") pod \"auto-csr-approver-29556590-kkm25\" (UID: \"21765abc-b94f-4650-8aaf-3a917c3f655d\") " pod="openshift-infra/auto-csr-approver-29556590-kkm25" Mar 13 09:50:00 crc kubenswrapper[4841]: I0313 09:50:00.478564 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556590-kkm25" Mar 13 09:50:00 crc kubenswrapper[4841]: W0313 09:50:00.941987 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21765abc_b94f_4650_8aaf_3a917c3f655d.slice/crio-8c347083b1f58ab9dda866b4f138356a90d3813829fd2ee48c34c3b53fdee085 WatchSource:0}: Error finding container 8c347083b1f58ab9dda866b4f138356a90d3813829fd2ee48c34c3b53fdee085: Status 404 returned error can't find the container with id 8c347083b1f58ab9dda866b4f138356a90d3813829fd2ee48c34c3b53fdee085 Mar 13 09:50:00 crc kubenswrapper[4841]: I0313 09:50:00.946984 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 09:50:00 crc kubenswrapper[4841]: I0313 09:50:00.949467 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556590-kkm25"] Mar 13 09:50:01 crc kubenswrapper[4841]: I0313 09:50:01.546406 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556590-kkm25" event={"ID":"21765abc-b94f-4650-8aaf-3a917c3f655d","Type":"ContainerStarted","Data":"8c347083b1f58ab9dda866b4f138356a90d3813829fd2ee48c34c3b53fdee085"} Mar 13 09:50:03 crc kubenswrapper[4841]: I0313 09:50:03.573858 4841 generic.go:334] "Generic (PLEG): container finished" podID="21765abc-b94f-4650-8aaf-3a917c3f655d" containerID="365c91209487e7b880cac1843eff8258eea0bd4895e662608043e84e222444c3" exitCode=0 Mar 13 09:50:03 crc kubenswrapper[4841]: I0313 09:50:03.573924 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556590-kkm25" event={"ID":"21765abc-b94f-4650-8aaf-3a917c3f655d","Type":"ContainerDied","Data":"365c91209487e7b880cac1843eff8258eea0bd4895e662608043e84e222444c3"} Mar 13 09:50:04 crc kubenswrapper[4841]: I0313 09:50:04.407631 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:50:04 crc kubenswrapper[4841]: I0313 09:50:04.408025 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:50:04 crc kubenswrapper[4841]: I0313 09:50:04.408088 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:50:04 crc kubenswrapper[4841]: I0313 09:50:04.409197 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5"} pod="openshift-machine-config-operator/machine-config-daemon-h227v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 09:50:04 crc kubenswrapper[4841]: I0313 09:50:04.409367 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" 
containerID="cri-o://c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" gracePeriod=600 Mar 13 09:50:04 crc kubenswrapper[4841]: E0313 09:50:04.539731 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:50:04 crc kubenswrapper[4841]: I0313 09:50:04.598783 4841 generic.go:334] "Generic (PLEG): container finished" podID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" exitCode=0 Mar 13 09:50:04 crc kubenswrapper[4841]: I0313 09:50:04.598881 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerDied","Data":"c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5"} Mar 13 09:50:04 crc kubenswrapper[4841]: I0313 09:50:04.598942 4841 scope.go:117] "RemoveContainer" containerID="fcd17e9cebe8ee5c7b1749619cc08d15d1f1b747829783b5902c03af4c6172bc" Mar 13 09:50:04 crc kubenswrapper[4841]: I0313 09:50:04.599603 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:50:04 crc kubenswrapper[4841]: E0313 09:50:04.599893 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" 
podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:50:04 crc kubenswrapper[4841]: I0313 09:50:04.962982 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556590-kkm25" Mar 13 09:50:05 crc kubenswrapper[4841]: I0313 09:50:05.144393 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cng2t\" (UniqueName: \"kubernetes.io/projected/21765abc-b94f-4650-8aaf-3a917c3f655d-kube-api-access-cng2t\") pod \"21765abc-b94f-4650-8aaf-3a917c3f655d\" (UID: \"21765abc-b94f-4650-8aaf-3a917c3f655d\") " Mar 13 09:50:05 crc kubenswrapper[4841]: I0313 09:50:05.154837 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21765abc-b94f-4650-8aaf-3a917c3f655d-kube-api-access-cng2t" (OuterVolumeSpecName: "kube-api-access-cng2t") pod "21765abc-b94f-4650-8aaf-3a917c3f655d" (UID: "21765abc-b94f-4650-8aaf-3a917c3f655d"). InnerVolumeSpecName "kube-api-access-cng2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:50:05 crc kubenswrapper[4841]: I0313 09:50:05.246936 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cng2t\" (UniqueName: \"kubernetes.io/projected/21765abc-b94f-4650-8aaf-3a917c3f655d-kube-api-access-cng2t\") on node \"crc\" DevicePath \"\"" Mar 13 09:50:05 crc kubenswrapper[4841]: I0313 09:50:05.610981 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556590-kkm25" event={"ID":"21765abc-b94f-4650-8aaf-3a917c3f655d","Type":"ContainerDied","Data":"8c347083b1f58ab9dda866b4f138356a90d3813829fd2ee48c34c3b53fdee085"} Mar 13 09:50:05 crc kubenswrapper[4841]: I0313 09:50:05.611045 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c347083b1f58ab9dda866b4f138356a90d3813829fd2ee48c34c3b53fdee085" Mar 13 09:50:05 crc kubenswrapper[4841]: I0313 09:50:05.612494 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556590-kkm25" Mar 13 09:50:06 crc kubenswrapper[4841]: I0313 09:50:06.066936 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556584-bs22k"] Mar 13 09:50:06 crc kubenswrapper[4841]: I0313 09:50:06.102395 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556584-bs22k"] Mar 13 09:50:08 crc kubenswrapper[4841]: I0313 09:50:08.011190 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f00a74f-0fc0-448b-94db-5715aa58a74f" path="/var/lib/kubelet/pods/0f00a74f-0fc0-448b-94db-5715aa58a74f/volumes" Mar 13 09:50:15 crc kubenswrapper[4841]: I0313 09:50:15.995506 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:50:15 crc kubenswrapper[4841]: E0313 09:50:15.996441 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:50:20 crc kubenswrapper[4841]: I0313 09:50:20.443920 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w695x"] Mar 13 09:50:20 crc kubenswrapper[4841]: E0313 09:50:20.445255 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21765abc-b94f-4650-8aaf-3a917c3f655d" containerName="oc" Mar 13 09:50:20 crc kubenswrapper[4841]: I0313 09:50:20.445298 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="21765abc-b94f-4650-8aaf-3a917c3f655d" containerName="oc" Mar 13 09:50:20 crc kubenswrapper[4841]: I0313 09:50:20.446025 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="21765abc-b94f-4650-8aaf-3a917c3f655d" containerName="oc" Mar 13 09:50:20 crc kubenswrapper[4841]: I0313 09:50:20.449359 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w695x" Mar 13 09:50:20 crc kubenswrapper[4841]: I0313 09:50:20.465349 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w695x"] Mar 13 09:50:20 crc kubenswrapper[4841]: I0313 09:50:20.530215 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de78e86a-9d62-4bae-9a54-17b28c37d31d-catalog-content\") pod \"redhat-marketplace-w695x\" (UID: \"de78e86a-9d62-4bae-9a54-17b28c37d31d\") " pod="openshift-marketplace/redhat-marketplace-w695x" Mar 13 09:50:20 crc kubenswrapper[4841]: I0313 09:50:20.530296 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de78e86a-9d62-4bae-9a54-17b28c37d31d-utilities\") pod \"redhat-marketplace-w695x\" (UID: \"de78e86a-9d62-4bae-9a54-17b28c37d31d\") " pod="openshift-marketplace/redhat-marketplace-w695x" Mar 13 09:50:20 crc kubenswrapper[4841]: I0313 09:50:20.530331 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jcdn\" (UniqueName: \"kubernetes.io/projected/de78e86a-9d62-4bae-9a54-17b28c37d31d-kube-api-access-7jcdn\") pod \"redhat-marketplace-w695x\" (UID: \"de78e86a-9d62-4bae-9a54-17b28c37d31d\") " pod="openshift-marketplace/redhat-marketplace-w695x" Mar 13 09:50:20 crc kubenswrapper[4841]: I0313 09:50:20.632347 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de78e86a-9d62-4bae-9a54-17b28c37d31d-catalog-content\") pod \"redhat-marketplace-w695x\" (UID: \"de78e86a-9d62-4bae-9a54-17b28c37d31d\") " pod="openshift-marketplace/redhat-marketplace-w695x" Mar 13 09:50:20 crc kubenswrapper[4841]: I0313 09:50:20.632423 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de78e86a-9d62-4bae-9a54-17b28c37d31d-utilities\") pod \"redhat-marketplace-w695x\" (UID: \"de78e86a-9d62-4bae-9a54-17b28c37d31d\") " pod="openshift-marketplace/redhat-marketplace-w695x" Mar 13 09:50:20 crc kubenswrapper[4841]: I0313 09:50:20.632456 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jcdn\" (UniqueName: \"kubernetes.io/projected/de78e86a-9d62-4bae-9a54-17b28c37d31d-kube-api-access-7jcdn\") pod \"redhat-marketplace-w695x\" (UID: \"de78e86a-9d62-4bae-9a54-17b28c37d31d\") " pod="openshift-marketplace/redhat-marketplace-w695x" Mar 13 09:50:20 crc kubenswrapper[4841]: I0313 09:50:20.632842 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de78e86a-9d62-4bae-9a54-17b28c37d31d-utilities\") pod \"redhat-marketplace-w695x\" (UID: \"de78e86a-9d62-4bae-9a54-17b28c37d31d\") " pod="openshift-marketplace/redhat-marketplace-w695x" Mar 13 09:50:20 crc kubenswrapper[4841]: I0313 09:50:20.632928 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de78e86a-9d62-4bae-9a54-17b28c37d31d-catalog-content\") pod \"redhat-marketplace-w695x\" (UID: \"de78e86a-9d62-4bae-9a54-17b28c37d31d\") " pod="openshift-marketplace/redhat-marketplace-w695x" Mar 13 09:50:20 crc kubenswrapper[4841]: I0313 09:50:20.652245 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jcdn\" (UniqueName: \"kubernetes.io/projected/de78e86a-9d62-4bae-9a54-17b28c37d31d-kube-api-access-7jcdn\") pod \"redhat-marketplace-w695x\" (UID: \"de78e86a-9d62-4bae-9a54-17b28c37d31d\") " pod="openshift-marketplace/redhat-marketplace-w695x" Mar 13 09:50:20 crc kubenswrapper[4841]: I0313 09:50:20.791408 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w695x" Mar 13 09:50:21 crc kubenswrapper[4841]: I0313 09:50:21.323364 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w695x"] Mar 13 09:50:21 crc kubenswrapper[4841]: I0313 09:50:21.764147 4841 generic.go:334] "Generic (PLEG): container finished" podID="de78e86a-9d62-4bae-9a54-17b28c37d31d" containerID="10c019ed420f525b82b1dc655eb1018fae41f997a7afdc9a7683134447f3124e" exitCode=0 Mar 13 09:50:21 crc kubenswrapper[4841]: I0313 09:50:21.764218 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w695x" event={"ID":"de78e86a-9d62-4bae-9a54-17b28c37d31d","Type":"ContainerDied","Data":"10c019ed420f525b82b1dc655eb1018fae41f997a7afdc9a7683134447f3124e"} Mar 13 09:50:21 crc kubenswrapper[4841]: I0313 09:50:21.764450 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w695x" event={"ID":"de78e86a-9d62-4bae-9a54-17b28c37d31d","Type":"ContainerStarted","Data":"461369370b773a5dccc6a0fa9e56cd40ab04ab547610c906fc0cbb373d5535e9"} Mar 13 09:50:22 crc kubenswrapper[4841]: I0313 09:50:22.778463 4841 generic.go:334] "Generic (PLEG): container finished" podID="de78e86a-9d62-4bae-9a54-17b28c37d31d" containerID="d36da4ae6b9a497ead44d241aaf3a66f7d81e678c9740db7576e418a804a0645" exitCode=0 Mar 13 09:50:22 crc kubenswrapper[4841]: I0313 09:50:22.778566 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w695x" event={"ID":"de78e86a-9d62-4bae-9a54-17b28c37d31d","Type":"ContainerDied","Data":"d36da4ae6b9a497ead44d241aaf3a66f7d81e678c9740db7576e418a804a0645"} Mar 13 09:50:23 crc kubenswrapper[4841]: I0313 09:50:23.791501 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w695x" 
event={"ID":"de78e86a-9d62-4bae-9a54-17b28c37d31d","Type":"ContainerStarted","Data":"63f418913d2d1fc6db3698fe947192b175b3c9bdae88e4b84d271c5ce055c265"} Mar 13 09:50:23 crc kubenswrapper[4841]: I0313 09:50:23.829408 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w695x" podStartSLOduration=2.185430769 podStartE2EDuration="3.829379033s" podCreationTimestamp="2026-03-13 09:50:20 +0000 UTC" firstStartedPulling="2026-03-13 09:50:21.766207644 +0000 UTC m=+2304.496107835" lastFinishedPulling="2026-03-13 09:50:23.410155878 +0000 UTC m=+2306.140056099" observedRunningTime="2026-03-13 09:50:23.81257813 +0000 UTC m=+2306.542478341" watchObservedRunningTime="2026-03-13 09:50:23.829379033 +0000 UTC m=+2306.559279264" Mar 13 09:50:28 crc kubenswrapper[4841]: I0313 09:50:28.002373 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:50:28 crc kubenswrapper[4841]: E0313 09:50:28.003222 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:50:30 crc kubenswrapper[4841]: I0313 09:50:30.792325 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w695x" Mar 13 09:50:30 crc kubenswrapper[4841]: I0313 09:50:30.793076 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w695x" Mar 13 09:50:30 crc kubenswrapper[4841]: I0313 09:50:30.835132 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-w695x" Mar 13 09:50:30 crc kubenswrapper[4841]: I0313 09:50:30.903170 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w695x" Mar 13 09:50:31 crc kubenswrapper[4841]: I0313 09:50:31.074301 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w695x"] Mar 13 09:50:32 crc kubenswrapper[4841]: I0313 09:50:32.872516 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w695x" podUID="de78e86a-9d62-4bae-9a54-17b28c37d31d" containerName="registry-server" containerID="cri-o://63f418913d2d1fc6db3698fe947192b175b3c9bdae88e4b84d271c5ce055c265" gracePeriod=2 Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.365907 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w695x" Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.431049 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de78e86a-9d62-4bae-9a54-17b28c37d31d-utilities\") pod \"de78e86a-9d62-4bae-9a54-17b28c37d31d\" (UID: \"de78e86a-9d62-4bae-9a54-17b28c37d31d\") " Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.431720 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jcdn\" (UniqueName: \"kubernetes.io/projected/de78e86a-9d62-4bae-9a54-17b28c37d31d-kube-api-access-7jcdn\") pod \"de78e86a-9d62-4bae-9a54-17b28c37d31d\" (UID: \"de78e86a-9d62-4bae-9a54-17b28c37d31d\") " Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.431835 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de78e86a-9d62-4bae-9a54-17b28c37d31d-catalog-content\") pod 
\"de78e86a-9d62-4bae-9a54-17b28c37d31d\" (UID: \"de78e86a-9d62-4bae-9a54-17b28c37d31d\") " Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.432111 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de78e86a-9d62-4bae-9a54-17b28c37d31d-utilities" (OuterVolumeSpecName: "utilities") pod "de78e86a-9d62-4bae-9a54-17b28c37d31d" (UID: "de78e86a-9d62-4bae-9a54-17b28c37d31d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.432369 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de78e86a-9d62-4bae-9a54-17b28c37d31d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.440314 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de78e86a-9d62-4bae-9a54-17b28c37d31d-kube-api-access-7jcdn" (OuterVolumeSpecName: "kube-api-access-7jcdn") pod "de78e86a-9d62-4bae-9a54-17b28c37d31d" (UID: "de78e86a-9d62-4bae-9a54-17b28c37d31d"). InnerVolumeSpecName "kube-api-access-7jcdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.467690 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de78e86a-9d62-4bae-9a54-17b28c37d31d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de78e86a-9d62-4bae-9a54-17b28c37d31d" (UID: "de78e86a-9d62-4bae-9a54-17b28c37d31d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.534323 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jcdn\" (UniqueName: \"kubernetes.io/projected/de78e86a-9d62-4bae-9a54-17b28c37d31d-kube-api-access-7jcdn\") on node \"crc\" DevicePath \"\"" Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.534382 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de78e86a-9d62-4bae-9a54-17b28c37d31d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.887239 4841 generic.go:334] "Generic (PLEG): container finished" podID="de78e86a-9d62-4bae-9a54-17b28c37d31d" containerID="63f418913d2d1fc6db3698fe947192b175b3c9bdae88e4b84d271c5ce055c265" exitCode=0 Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.887327 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w695x" event={"ID":"de78e86a-9d62-4bae-9a54-17b28c37d31d","Type":"ContainerDied","Data":"63f418913d2d1fc6db3698fe947192b175b3c9bdae88e4b84d271c5ce055c265"} Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.887397 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w695x" Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.887438 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w695x" event={"ID":"de78e86a-9d62-4bae-9a54-17b28c37d31d","Type":"ContainerDied","Data":"461369370b773a5dccc6a0fa9e56cd40ab04ab547610c906fc0cbb373d5535e9"} Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.887470 4841 scope.go:117] "RemoveContainer" containerID="63f418913d2d1fc6db3698fe947192b175b3c9bdae88e4b84d271c5ce055c265" Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.912474 4841 scope.go:117] "RemoveContainer" containerID="d36da4ae6b9a497ead44d241aaf3a66f7d81e678c9740db7576e418a804a0645" Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.927595 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w695x"] Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.935652 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w695x"] Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.962652 4841 scope.go:117] "RemoveContainer" containerID="10c019ed420f525b82b1dc655eb1018fae41f997a7afdc9a7683134447f3124e" Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.986533 4841 scope.go:117] "RemoveContainer" containerID="63f418913d2d1fc6db3698fe947192b175b3c9bdae88e4b84d271c5ce055c265" Mar 13 09:50:33 crc kubenswrapper[4841]: E0313 09:50:33.986922 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63f418913d2d1fc6db3698fe947192b175b3c9bdae88e4b84d271c5ce055c265\": container with ID starting with 63f418913d2d1fc6db3698fe947192b175b3c9bdae88e4b84d271c5ce055c265 not found: ID does not exist" containerID="63f418913d2d1fc6db3698fe947192b175b3c9bdae88e4b84d271c5ce055c265" Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.986970 4841 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63f418913d2d1fc6db3698fe947192b175b3c9bdae88e4b84d271c5ce055c265"} err="failed to get container status \"63f418913d2d1fc6db3698fe947192b175b3c9bdae88e4b84d271c5ce055c265\": rpc error: code = NotFound desc = could not find container \"63f418913d2d1fc6db3698fe947192b175b3c9bdae88e4b84d271c5ce055c265\": container with ID starting with 63f418913d2d1fc6db3698fe947192b175b3c9bdae88e4b84d271c5ce055c265 not found: ID does not exist" Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.987002 4841 scope.go:117] "RemoveContainer" containerID="d36da4ae6b9a497ead44d241aaf3a66f7d81e678c9740db7576e418a804a0645" Mar 13 09:50:33 crc kubenswrapper[4841]: E0313 09:50:33.987233 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d36da4ae6b9a497ead44d241aaf3a66f7d81e678c9740db7576e418a804a0645\": container with ID starting with d36da4ae6b9a497ead44d241aaf3a66f7d81e678c9740db7576e418a804a0645 not found: ID does not exist" containerID="d36da4ae6b9a497ead44d241aaf3a66f7d81e678c9740db7576e418a804a0645" Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.987283 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d36da4ae6b9a497ead44d241aaf3a66f7d81e678c9740db7576e418a804a0645"} err="failed to get container status \"d36da4ae6b9a497ead44d241aaf3a66f7d81e678c9740db7576e418a804a0645\": rpc error: code = NotFound desc = could not find container \"d36da4ae6b9a497ead44d241aaf3a66f7d81e678c9740db7576e418a804a0645\": container with ID starting with d36da4ae6b9a497ead44d241aaf3a66f7d81e678c9740db7576e418a804a0645 not found: ID does not exist" Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.987301 4841 scope.go:117] "RemoveContainer" containerID="10c019ed420f525b82b1dc655eb1018fae41f997a7afdc9a7683134447f3124e" Mar 13 09:50:33 crc kubenswrapper[4841]: E0313 
09:50:33.987626 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10c019ed420f525b82b1dc655eb1018fae41f997a7afdc9a7683134447f3124e\": container with ID starting with 10c019ed420f525b82b1dc655eb1018fae41f997a7afdc9a7683134447f3124e not found: ID does not exist" containerID="10c019ed420f525b82b1dc655eb1018fae41f997a7afdc9a7683134447f3124e" Mar 13 09:50:33 crc kubenswrapper[4841]: I0313 09:50:33.987674 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10c019ed420f525b82b1dc655eb1018fae41f997a7afdc9a7683134447f3124e"} err="failed to get container status \"10c019ed420f525b82b1dc655eb1018fae41f997a7afdc9a7683134447f3124e\": rpc error: code = NotFound desc = could not find container \"10c019ed420f525b82b1dc655eb1018fae41f997a7afdc9a7683134447f3124e\": container with ID starting with 10c019ed420f525b82b1dc655eb1018fae41f997a7afdc9a7683134447f3124e not found: ID does not exist" Mar 13 09:50:34 crc kubenswrapper[4841]: I0313 09:50:34.013944 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de78e86a-9d62-4bae-9a54-17b28c37d31d" path="/var/lib/kubelet/pods/de78e86a-9d62-4bae-9a54-17b28c37d31d/volumes" Mar 13 09:50:38 crc kubenswrapper[4841]: I0313 09:50:38.995144 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:50:38 crc kubenswrapper[4841]: E0313 09:50:38.996052 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:50:49 crc kubenswrapper[4841]: I0313 09:50:49.996123 
4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:50:49 crc kubenswrapper[4841]: E0313 09:50:49.997341 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:51:00 crc kubenswrapper[4841]: I0313 09:51:00.995228 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:51:00 crc kubenswrapper[4841]: E0313 09:51:00.996185 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:51:02 crc kubenswrapper[4841]: I0313 09:51:02.183353 4841 generic.go:334] "Generic (PLEG): container finished" podID="4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77" containerID="aa62a21cbccec80817b9367765ecaaa3e1be320cb368958f3437c7782855b3d1" exitCode=0 Mar 13 09:51:02 crc kubenswrapper[4841]: I0313 09:51:02.183444 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" event={"ID":"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77","Type":"ContainerDied","Data":"aa62a21cbccec80817b9367765ecaaa3e1be320cb368958f3437c7782855b3d1"} Mar 13 09:51:03 crc kubenswrapper[4841]: I0313 09:51:03.624480 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" Mar 13 09:51:03 crc kubenswrapper[4841]: I0313 09:51:03.770192 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-libvirt-combined-ca-bundle\") pod \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\" (UID: \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\") " Mar 13 09:51:03 crc kubenswrapper[4841]: I0313 09:51:03.770417 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-libvirt-secret-0\") pod \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\" (UID: \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\") " Mar 13 09:51:03 crc kubenswrapper[4841]: I0313 09:51:03.770474 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssmdc\" (UniqueName: \"kubernetes.io/projected/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-kube-api-access-ssmdc\") pod \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\" (UID: \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\") " Mar 13 09:51:03 crc kubenswrapper[4841]: I0313 09:51:03.770502 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-inventory\") pod \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\" (UID: \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\") " Mar 13 09:51:03 crc kubenswrapper[4841]: I0313 09:51:03.770579 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-ssh-key-openstack-edpm-ipam\") pod \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\" (UID: \"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77\") " Mar 13 09:51:03 crc kubenswrapper[4841]: I0313 09:51:03.775716 4841 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-kube-api-access-ssmdc" (OuterVolumeSpecName: "kube-api-access-ssmdc") pod "4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77" (UID: "4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77"). InnerVolumeSpecName "kube-api-access-ssmdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:51:03 crc kubenswrapper[4841]: I0313 09:51:03.776380 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77" (UID: "4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:51:03 crc kubenswrapper[4841]: I0313 09:51:03.795773 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-inventory" (OuterVolumeSpecName: "inventory") pod "4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77" (UID: "4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:51:03 crc kubenswrapper[4841]: I0313 09:51:03.798533 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77" (UID: "4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:51:03 crc kubenswrapper[4841]: I0313 09:51:03.819532 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77" (UID: "4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:51:03 crc kubenswrapper[4841]: I0313 09:51:03.872918 4841 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:51:03 crc kubenswrapper[4841]: I0313 09:51:03.873119 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssmdc\" (UniqueName: \"kubernetes.io/projected/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-kube-api-access-ssmdc\") on node \"crc\" DevicePath \"\"" Mar 13 09:51:03 crc kubenswrapper[4841]: I0313 09:51:03.873195 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 09:51:03 crc kubenswrapper[4841]: I0313 09:51:03.873313 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 09:51:03 crc kubenswrapper[4841]: I0313 09:51:03.873480 4841 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.203421 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" event={"ID":"4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77","Type":"ContainerDied","Data":"6a72b391185eef11358abc1115f9de3e946c734f41fe2c2dd82e7197c95e1bd4"} Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.203466 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a72b391185eef11358abc1115f9de3e946c734f41fe2c2dd82e7197c95e1bd4" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.203499 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.325214 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp"] Mar 13 09:51:04 crc kubenswrapper[4841]: E0313 09:51:04.325946 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de78e86a-9d62-4bae-9a54-17b28c37d31d" containerName="registry-server" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.326044 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="de78e86a-9d62-4bae-9a54-17b28c37d31d" containerName="registry-server" Mar 13 09:51:04 crc kubenswrapper[4841]: E0313 09:51:04.326168 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.326224 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 13 09:51:04 crc kubenswrapper[4841]: E0313 09:51:04.326299 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de78e86a-9d62-4bae-9a54-17b28c37d31d" containerName="extract-utilities" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.326352 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="de78e86a-9d62-4bae-9a54-17b28c37d31d" containerName="extract-utilities" Mar 13 09:51:04 crc kubenswrapper[4841]: E0313 09:51:04.326418 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de78e86a-9d62-4bae-9a54-17b28c37d31d" containerName="extract-content" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.326466 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="de78e86a-9d62-4bae-9a54-17b28c37d31d" containerName="extract-content" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.326681 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.326740 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="de78e86a-9d62-4bae-9a54-17b28c37d31d" containerName="registry-server" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.327454 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.329876 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.330417 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.331856 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rv6jt" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.332048 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.332229 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.332449 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.332599 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.350058 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp"] Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.471223 4841 scope.go:117] "RemoveContainer" containerID="ab118bd4ba3f72842a2c7299e70a4c4adedd239851df247e9ae13d32156e4267" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.488426 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-migration-ssh-key-1\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.488986 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.489063 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.489179 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.489390 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.489476 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.489663 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.489789 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.489840 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj78x\" (UniqueName: \"kubernetes.io/projected/7f7ae341-a1c6-49f6-825c-4c47b14141f4-kube-api-access-kj78x\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.489906 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.489993 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.592388 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.592485 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj78x\" (UniqueName: \"kubernetes.io/projected/7f7ae341-a1c6-49f6-825c-4c47b14141f4-kube-api-access-kj78x\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.592571 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" 
(UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.592620 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.592673 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.592708 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.592747 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.592799 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.592857 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.592902 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.593008 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.595079 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-extra-config-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.596734 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.596920 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.597425 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.597940 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.598036 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.598077 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.598515 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.599390 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.600644 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: 
\"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.615177 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj78x\" (UniqueName: \"kubernetes.io/projected/7f7ae341-a1c6-49f6-825c-4c47b14141f4-kube-api-access-kj78x\") pod \"nova-edpm-deployment-openstack-edpm-ipam-v22pp\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:04 crc kubenswrapper[4841]: I0313 09:51:04.659993 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:51:05 crc kubenswrapper[4841]: I0313 09:51:05.183309 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp"] Mar 13 09:51:05 crc kubenswrapper[4841]: I0313 09:51:05.215987 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" event={"ID":"7f7ae341-a1c6-49f6-825c-4c47b14141f4","Type":"ContainerStarted","Data":"983defb13459b5b2d9959ee6491199e6cb127d6624c11a8730d9afc001a20c31"} Mar 13 09:51:06 crc kubenswrapper[4841]: I0313 09:51:06.236045 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" event={"ID":"7f7ae341-a1c6-49f6-825c-4c47b14141f4","Type":"ContainerStarted","Data":"48006d2667902eaaa6755ccfae24a7e3241e606f5c93f9b7a7abd848abfa6aa2"} Mar 13 09:51:06 crc kubenswrapper[4841]: I0313 09:51:06.264657 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" podStartSLOduration=1.850740052 podStartE2EDuration="2.264635608s" podCreationTimestamp="2026-03-13 09:51:04 +0000 UTC" firstStartedPulling="2026-03-13 09:51:05.188505975 +0000 UTC m=+2347.918406176" 
lastFinishedPulling="2026-03-13 09:51:05.602401531 +0000 UTC m=+2348.332301732" observedRunningTime="2026-03-13 09:51:06.253961959 +0000 UTC m=+2348.983862150" watchObservedRunningTime="2026-03-13 09:51:06.264635608 +0000 UTC m=+2348.994535809" Mar 13 09:51:12 crc kubenswrapper[4841]: I0313 09:51:12.994959 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:51:12 crc kubenswrapper[4841]: E0313 09:51:12.995667 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:51:24 crc kubenswrapper[4841]: I0313 09:51:24.996022 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:51:24 crc kubenswrapper[4841]: E0313 09:51:24.996851 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:51:38 crc kubenswrapper[4841]: I0313 09:51:38.995456 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:51:38 crc kubenswrapper[4841]: E0313 09:51:38.997454 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:51:49 crc kubenswrapper[4841]: I0313 09:51:49.994526 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:51:49 crc kubenswrapper[4841]: E0313 09:51:49.995339 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:52:00 crc kubenswrapper[4841]: I0313 09:52:00.150798 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556592-25g9w"] Mar 13 09:52:00 crc kubenswrapper[4841]: I0313 09:52:00.153290 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556592-25g9w" Mar 13 09:52:00 crc kubenswrapper[4841]: I0313 09:52:00.155719 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 09:52:00 crc kubenswrapper[4841]: I0313 09:52:00.156514 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 09:52:00 crc kubenswrapper[4841]: I0313 09:52:00.160542 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 09:52:00 crc kubenswrapper[4841]: I0313 09:52:00.165375 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556592-25g9w"] Mar 13 09:52:00 crc kubenswrapper[4841]: I0313 09:52:00.247831 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzgx9\" (UniqueName: \"kubernetes.io/projected/a0591f1c-f727-44d8-96d7-0c0ce0c06d8b-kube-api-access-lzgx9\") pod \"auto-csr-approver-29556592-25g9w\" (UID: \"a0591f1c-f727-44d8-96d7-0c0ce0c06d8b\") " pod="openshift-infra/auto-csr-approver-29556592-25g9w" Mar 13 09:52:00 crc kubenswrapper[4841]: I0313 09:52:00.349191 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzgx9\" (UniqueName: \"kubernetes.io/projected/a0591f1c-f727-44d8-96d7-0c0ce0c06d8b-kube-api-access-lzgx9\") pod \"auto-csr-approver-29556592-25g9w\" (UID: \"a0591f1c-f727-44d8-96d7-0c0ce0c06d8b\") " pod="openshift-infra/auto-csr-approver-29556592-25g9w" Mar 13 09:52:00 crc kubenswrapper[4841]: I0313 09:52:00.371407 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzgx9\" (UniqueName: \"kubernetes.io/projected/a0591f1c-f727-44d8-96d7-0c0ce0c06d8b-kube-api-access-lzgx9\") pod \"auto-csr-approver-29556592-25g9w\" (UID: \"a0591f1c-f727-44d8-96d7-0c0ce0c06d8b\") " 
pod="openshift-infra/auto-csr-approver-29556592-25g9w" Mar 13 09:52:00 crc kubenswrapper[4841]: I0313 09:52:00.486484 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556592-25g9w" Mar 13 09:52:00 crc kubenswrapper[4841]: I0313 09:52:00.920575 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556592-25g9w"] Mar 13 09:52:00 crc kubenswrapper[4841]: W0313 09:52:00.925799 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0591f1c_f727_44d8_96d7_0c0ce0c06d8b.slice/crio-19b047847fcc5fe82e17f3a7dba6f5b56774d6d26c21e0dbb8a446a9ec9f4e11 WatchSource:0}: Error finding container 19b047847fcc5fe82e17f3a7dba6f5b56774d6d26c21e0dbb8a446a9ec9f4e11: Status 404 returned error can't find the container with id 19b047847fcc5fe82e17f3a7dba6f5b56774d6d26c21e0dbb8a446a9ec9f4e11 Mar 13 09:52:01 crc kubenswrapper[4841]: I0313 09:52:01.755244 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556592-25g9w" event={"ID":"a0591f1c-f727-44d8-96d7-0c0ce0c06d8b","Type":"ContainerStarted","Data":"19b047847fcc5fe82e17f3a7dba6f5b56774d6d26c21e0dbb8a446a9ec9f4e11"} Mar 13 09:52:02 crc kubenswrapper[4841]: I0313 09:52:02.764469 4841 generic.go:334] "Generic (PLEG): container finished" podID="a0591f1c-f727-44d8-96d7-0c0ce0c06d8b" containerID="a397331b38775f9f7cc6f64c76ff17ce1493bf6541c58d088ba28693bfc2ee8b" exitCode=0 Mar 13 09:52:02 crc kubenswrapper[4841]: I0313 09:52:02.764586 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556592-25g9w" event={"ID":"a0591f1c-f727-44d8-96d7-0c0ce0c06d8b","Type":"ContainerDied","Data":"a397331b38775f9f7cc6f64c76ff17ce1493bf6541c58d088ba28693bfc2ee8b"} Mar 13 09:52:02 crc kubenswrapper[4841]: I0313 09:52:02.995439 4841 scope.go:117] "RemoveContainer" 
containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:52:02 crc kubenswrapper[4841]: E0313 09:52:02.995705 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:52:04 crc kubenswrapper[4841]: I0313 09:52:04.104836 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556592-25g9w" Mar 13 09:52:04 crc kubenswrapper[4841]: I0313 09:52:04.222941 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzgx9\" (UniqueName: \"kubernetes.io/projected/a0591f1c-f727-44d8-96d7-0c0ce0c06d8b-kube-api-access-lzgx9\") pod \"a0591f1c-f727-44d8-96d7-0c0ce0c06d8b\" (UID: \"a0591f1c-f727-44d8-96d7-0c0ce0c06d8b\") " Mar 13 09:52:04 crc kubenswrapper[4841]: I0313 09:52:04.230447 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0591f1c-f727-44d8-96d7-0c0ce0c06d8b-kube-api-access-lzgx9" (OuterVolumeSpecName: "kube-api-access-lzgx9") pod "a0591f1c-f727-44d8-96d7-0c0ce0c06d8b" (UID: "a0591f1c-f727-44d8-96d7-0c0ce0c06d8b"). InnerVolumeSpecName "kube-api-access-lzgx9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:52:04 crc kubenswrapper[4841]: I0313 09:52:04.324980 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzgx9\" (UniqueName: \"kubernetes.io/projected/a0591f1c-f727-44d8-96d7-0c0ce0c06d8b-kube-api-access-lzgx9\") on node \"crc\" DevicePath \"\"" Mar 13 09:52:04 crc kubenswrapper[4841]: I0313 09:52:04.786618 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556592-25g9w" event={"ID":"a0591f1c-f727-44d8-96d7-0c0ce0c06d8b","Type":"ContainerDied","Data":"19b047847fcc5fe82e17f3a7dba6f5b56774d6d26c21e0dbb8a446a9ec9f4e11"} Mar 13 09:52:04 crc kubenswrapper[4841]: I0313 09:52:04.786664 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19b047847fcc5fe82e17f3a7dba6f5b56774d6d26c21e0dbb8a446a9ec9f4e11" Mar 13 09:52:04 crc kubenswrapper[4841]: I0313 09:52:04.786697 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556592-25g9w" Mar 13 09:52:05 crc kubenswrapper[4841]: I0313 09:52:05.178226 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556586-rdpdd"] Mar 13 09:52:05 crc kubenswrapper[4841]: I0313 09:52:05.186259 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556586-rdpdd"] Mar 13 09:52:06 crc kubenswrapper[4841]: I0313 09:52:06.009921 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b34fcbdd-3170-44e1-a881-867750de01ec" path="/var/lib/kubelet/pods/b34fcbdd-3170-44e1-a881-867750de01ec/volumes" Mar 13 09:52:16 crc kubenswrapper[4841]: I0313 09:52:16.996020 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:52:16 crc kubenswrapper[4841]: E0313 09:52:16.996777 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:52:30 crc kubenswrapper[4841]: I0313 09:52:30.995691 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:52:30 crc kubenswrapper[4841]: E0313 09:52:30.996899 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:52:41 crc kubenswrapper[4841]: I0313 09:52:41.995222 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:52:41 crc kubenswrapper[4841]: E0313 09:52:41.996504 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:52:52 crc kubenswrapper[4841]: I0313 09:52:52.995881 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:52:52 crc kubenswrapper[4841]: E0313 09:52:52.996584 4841 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:53:04 crc kubenswrapper[4841]: I0313 09:53:04.581862 4841 scope.go:117] "RemoveContainer" containerID="c30102bcf27959a8b287ceaf8a530489b1f92abf956176d87db776b0023fccad" Mar 13 09:53:08 crc kubenswrapper[4841]: I0313 09:53:08.007386 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:53:08 crc kubenswrapper[4841]: E0313 09:53:08.008469 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:53:15 crc kubenswrapper[4841]: I0313 09:53:15.530445 4841 generic.go:334] "Generic (PLEG): container finished" podID="7f7ae341-a1c6-49f6-825c-4c47b14141f4" containerID="48006d2667902eaaa6755ccfae24a7e3241e606f5c93f9b7a7abd848abfa6aa2" exitCode=0 Mar 13 09:53:15 crc kubenswrapper[4841]: I0313 09:53:15.530561 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" event={"ID":"7f7ae341-a1c6-49f6-825c-4c47b14141f4","Type":"ContainerDied","Data":"48006d2667902eaaa6755ccfae24a7e3241e606f5c93f9b7a7abd848abfa6aa2"} Mar 13 09:53:16 crc kubenswrapper[4841]: I0313 09:53:16.931778 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.020135 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-extra-config-0\") pod \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.020210 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-migration-ssh-key-1\") pod \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.020231 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-combined-ca-bundle\") pod \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.020251 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-1\") pod \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.020283 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-inventory\") pod \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.020367 4841 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-0\") pod \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.020388 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-migration-ssh-key-0\") pod \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.020408 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-2\") pod \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.020488 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj78x\" (UniqueName: \"kubernetes.io/projected/7f7ae341-a1c6-49f6-825c-4c47b14141f4-kube-api-access-kj78x\") pod \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.020577 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-3\") pod \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.020616 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-ssh-key-openstack-edpm-ipam\") pod \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\" (UID: \"7f7ae341-a1c6-49f6-825c-4c47b14141f4\") " Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.026185 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7f7ae341-a1c6-49f6-825c-4c47b14141f4" (UID: "7f7ae341-a1c6-49f6-825c-4c47b14141f4"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.031428 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f7ae341-a1c6-49f6-825c-4c47b14141f4-kube-api-access-kj78x" (OuterVolumeSpecName: "kube-api-access-kj78x") pod "7f7ae341-a1c6-49f6-825c-4c47b14141f4" (UID: "7f7ae341-a1c6-49f6-825c-4c47b14141f4"). InnerVolumeSpecName "kube-api-access-kj78x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.049877 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "7f7ae341-a1c6-49f6-825c-4c47b14141f4" (UID: "7f7ae341-a1c6-49f6-825c-4c47b14141f4"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.054000 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "7f7ae341-a1c6-49f6-825c-4c47b14141f4" (UID: "7f7ae341-a1c6-49f6-825c-4c47b14141f4"). InnerVolumeSpecName "nova-cell1-compute-config-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.054541 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "7f7ae341-a1c6-49f6-825c-4c47b14141f4" (UID: "7f7ae341-a1c6-49f6-825c-4c47b14141f4"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.055800 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7f7ae341-a1c6-49f6-825c-4c47b14141f4" (UID: "7f7ae341-a1c6-49f6-825c-4c47b14141f4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.055941 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "7f7ae341-a1c6-49f6-825c-4c47b14141f4" (UID: "7f7ae341-a1c6-49f6-825c-4c47b14141f4"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.057775 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-inventory" (OuterVolumeSpecName: "inventory") pod "7f7ae341-a1c6-49f6-825c-4c47b14141f4" (UID: "7f7ae341-a1c6-49f6-825c-4c47b14141f4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.059489 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "7f7ae341-a1c6-49f6-825c-4c47b14141f4" (UID: "7f7ae341-a1c6-49f6-825c-4c47b14141f4"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.063934 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "7f7ae341-a1c6-49f6-825c-4c47b14141f4" (UID: "7f7ae341-a1c6-49f6-825c-4c47b14141f4"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.066821 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "7f7ae341-a1c6-49f6-825c-4c47b14141f4" (UID: "7f7ae341-a1c6-49f6-825c-4c47b14141f4"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.122893 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.122961 4841 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.122974 4841 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.122983 4841 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.122991 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj78x\" (UniqueName: \"kubernetes.io/projected/7f7ae341-a1c6-49f6-825c-4c47b14141f4-kube-api-access-kj78x\") on node \"crc\" DevicePath \"\"" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.123001 4841 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.123009 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.123044 4841 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.123053 4841 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.123061 4841 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.123069 4841 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7f7ae341-a1c6-49f6-825c-4c47b14141f4-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.554092 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" event={"ID":"7f7ae341-a1c6-49f6-825c-4c47b14141f4","Type":"ContainerDied","Data":"983defb13459b5b2d9959ee6491199e6cb127d6624c11a8730d9afc001a20c31"} Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.554443 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="983defb13459b5b2d9959ee6491199e6cb127d6624c11a8730d9afc001a20c31" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.554127 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-v22pp" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.685196 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh"] Mar 13 09:53:17 crc kubenswrapper[4841]: E0313 09:53:17.685824 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f7ae341-a1c6-49f6-825c-4c47b14141f4" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.685853 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f7ae341-a1c6-49f6-825c-4c47b14141f4" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 13 09:53:17 crc kubenswrapper[4841]: E0313 09:53:17.685868 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0591f1c-f727-44d8-96d7-0c0ce0c06d8b" containerName="oc" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.685879 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0591f1c-f727-44d8-96d7-0c0ce0c06d8b" containerName="oc" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.686201 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f7ae341-a1c6-49f6-825c-4c47b14141f4" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.686233 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0591f1c-f727-44d8-96d7-0c0ce0c06d8b" containerName="oc" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.687237 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.689985 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rv6jt" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.690242 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.690561 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.691825 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.691926 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.693582 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh"] Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.735016 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.735076 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ceilometer-compute-config-data-2\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.735210 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.735261 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.735308 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.735462 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2npkd\" (UniqueName: \"kubernetes.io/projected/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-kube-api-access-2npkd\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.735521 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.836680 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2npkd\" (UniqueName: \"kubernetes.io/projected/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-kube-api-access-2npkd\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.836726 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.836794 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.836818 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.836868 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.836890 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.836915 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.840890 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.841466 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.841848 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.842506 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.842654 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.842823 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:17 crc kubenswrapper[4841]: I0313 09:53:17.861921 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2npkd\" (UniqueName: \"kubernetes.io/projected/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-kube-api-access-2npkd\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:18 crc kubenswrapper[4841]: I0313 09:53:18.009612 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:53:18 crc kubenswrapper[4841]: I0313 09:53:18.568096 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh"] Mar 13 09:53:18 crc kubenswrapper[4841]: I0313 09:53:18.995719 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:53:18 crc kubenswrapper[4841]: E0313 09:53:18.996410 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:53:19 crc kubenswrapper[4841]: I0313 09:53:19.579073 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" event={"ID":"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac","Type":"ContainerStarted","Data":"f291cc8f3619463595d99653d091543b8678d35ff4951eaefceb6daa25b48390"} Mar 13 09:53:19 crc kubenswrapper[4841]: I0313 09:53:19.579140 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" event={"ID":"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac","Type":"ContainerStarted","Data":"071429a657cc961f891a7f252aefdeff189007457a676c2c24b98e5413aa634e"} Mar 13 09:53:19 crc kubenswrapper[4841]: I0313 09:53:19.605168 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" podStartSLOduration=2.177305816 podStartE2EDuration="2.605150084s" podCreationTimestamp="2026-03-13 09:53:17 +0000 UTC" firstStartedPulling="2026-03-13 
09:53:18.58189946 +0000 UTC m=+2481.311799651" lastFinishedPulling="2026-03-13 09:53:19.009743718 +0000 UTC m=+2481.739643919" observedRunningTime="2026-03-13 09:53:19.598344419 +0000 UTC m=+2482.328244630" watchObservedRunningTime="2026-03-13 09:53:19.605150084 +0000 UTC m=+2482.335050275" Mar 13 09:53:30 crc kubenswrapper[4841]: I0313 09:53:30.995613 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:53:30 crc kubenswrapper[4841]: E0313 09:53:30.996718 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:53:41 crc kubenswrapper[4841]: I0313 09:53:41.995620 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:53:41 crc kubenswrapper[4841]: E0313 09:53:41.996427 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:53:56 crc kubenswrapper[4841]: I0313 09:53:56.995342 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:53:56 crc kubenswrapper[4841]: E0313 09:53:56.996182 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:54:00 crc kubenswrapper[4841]: I0313 09:54:00.161478 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556594-zbfhh"] Mar 13 09:54:00 crc kubenswrapper[4841]: I0313 09:54:00.165131 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556594-zbfhh" Mar 13 09:54:00 crc kubenswrapper[4841]: I0313 09:54:00.167546 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 09:54:00 crc kubenswrapper[4841]: I0313 09:54:00.167553 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 09:54:00 crc kubenswrapper[4841]: I0313 09:54:00.167605 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 09:54:00 crc kubenswrapper[4841]: I0313 09:54:00.171828 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556594-zbfhh"] Mar 13 09:54:00 crc kubenswrapper[4841]: I0313 09:54:00.321337 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjpdp\" (UniqueName: \"kubernetes.io/projected/fbcc33bb-e899-4aee-8a7f-2f97beb83543-kube-api-access-bjpdp\") pod \"auto-csr-approver-29556594-zbfhh\" (UID: \"fbcc33bb-e899-4aee-8a7f-2f97beb83543\") " pod="openshift-infra/auto-csr-approver-29556594-zbfhh" Mar 13 09:54:00 crc kubenswrapper[4841]: I0313 09:54:00.424174 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjpdp\" (UniqueName: 
\"kubernetes.io/projected/fbcc33bb-e899-4aee-8a7f-2f97beb83543-kube-api-access-bjpdp\") pod \"auto-csr-approver-29556594-zbfhh\" (UID: \"fbcc33bb-e899-4aee-8a7f-2f97beb83543\") " pod="openshift-infra/auto-csr-approver-29556594-zbfhh" Mar 13 09:54:00 crc kubenswrapper[4841]: I0313 09:54:00.445009 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjpdp\" (UniqueName: \"kubernetes.io/projected/fbcc33bb-e899-4aee-8a7f-2f97beb83543-kube-api-access-bjpdp\") pod \"auto-csr-approver-29556594-zbfhh\" (UID: \"fbcc33bb-e899-4aee-8a7f-2f97beb83543\") " pod="openshift-infra/auto-csr-approver-29556594-zbfhh" Mar 13 09:54:00 crc kubenswrapper[4841]: I0313 09:54:00.483413 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556594-zbfhh" Mar 13 09:54:00 crc kubenswrapper[4841]: I0313 09:54:00.940981 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556594-zbfhh"] Mar 13 09:54:00 crc kubenswrapper[4841]: I0313 09:54:00.977470 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556594-zbfhh" event={"ID":"fbcc33bb-e899-4aee-8a7f-2f97beb83543","Type":"ContainerStarted","Data":"bb3990fe0e321c96c5bffce228a88d2c76fca99d4eb536b99fe27fee454afa6f"} Mar 13 09:54:03 crc kubenswrapper[4841]: I0313 09:54:03.001187 4841 generic.go:334] "Generic (PLEG): container finished" podID="fbcc33bb-e899-4aee-8a7f-2f97beb83543" containerID="06eb8fef5e582cbf6ca81120e42cdac1e2257a0ab9785b210a2ca494d491f9ca" exitCode=0 Mar 13 09:54:03 crc kubenswrapper[4841]: I0313 09:54:03.001292 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556594-zbfhh" event={"ID":"fbcc33bb-e899-4aee-8a7f-2f97beb83543","Type":"ContainerDied","Data":"06eb8fef5e582cbf6ca81120e42cdac1e2257a0ab9785b210a2ca494d491f9ca"} Mar 13 09:54:04 crc kubenswrapper[4841]: I0313 09:54:04.338061 4841 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556594-zbfhh" Mar 13 09:54:04 crc kubenswrapper[4841]: I0313 09:54:04.506092 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjpdp\" (UniqueName: \"kubernetes.io/projected/fbcc33bb-e899-4aee-8a7f-2f97beb83543-kube-api-access-bjpdp\") pod \"fbcc33bb-e899-4aee-8a7f-2f97beb83543\" (UID: \"fbcc33bb-e899-4aee-8a7f-2f97beb83543\") " Mar 13 09:54:04 crc kubenswrapper[4841]: I0313 09:54:04.510904 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbcc33bb-e899-4aee-8a7f-2f97beb83543-kube-api-access-bjpdp" (OuterVolumeSpecName: "kube-api-access-bjpdp") pod "fbcc33bb-e899-4aee-8a7f-2f97beb83543" (UID: "fbcc33bb-e899-4aee-8a7f-2f97beb83543"). InnerVolumeSpecName "kube-api-access-bjpdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:54:04 crc kubenswrapper[4841]: I0313 09:54:04.608157 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjpdp\" (UniqueName: \"kubernetes.io/projected/fbcc33bb-e899-4aee-8a7f-2f97beb83543-kube-api-access-bjpdp\") on node \"crc\" DevicePath \"\"" Mar 13 09:54:05 crc kubenswrapper[4841]: I0313 09:54:05.020071 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556594-zbfhh" event={"ID":"fbcc33bb-e899-4aee-8a7f-2f97beb83543","Type":"ContainerDied","Data":"bb3990fe0e321c96c5bffce228a88d2c76fca99d4eb536b99fe27fee454afa6f"} Mar 13 09:54:05 crc kubenswrapper[4841]: I0313 09:54:05.020112 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb3990fe0e321c96c5bffce228a88d2c76fca99d4eb536b99fe27fee454afa6f" Mar 13 09:54:05 crc kubenswrapper[4841]: I0313 09:54:05.020422 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556594-zbfhh" Mar 13 09:54:05 crc kubenswrapper[4841]: I0313 09:54:05.407738 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556588-x9hdr"] Mar 13 09:54:05 crc kubenswrapper[4841]: I0313 09:54:05.415512 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556588-x9hdr"] Mar 13 09:54:06 crc kubenswrapper[4841]: I0313 09:54:06.029549 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffe08ae1-505d-46ee-934a-815f0b81fd7b" path="/var/lib/kubelet/pods/ffe08ae1-505d-46ee-934a-815f0b81fd7b/volumes" Mar 13 09:54:09 crc kubenswrapper[4841]: I0313 09:54:09.994665 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:54:09 crc kubenswrapper[4841]: E0313 09:54:09.994894 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:54:20 crc kubenswrapper[4841]: I0313 09:54:20.995117 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:54:20 crc kubenswrapper[4841]: E0313 09:54:20.995950 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" 
podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:54:35 crc kubenswrapper[4841]: I0313 09:54:35.995395 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:54:35 crc kubenswrapper[4841]: E0313 09:54:35.996212 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:54:49 crc kubenswrapper[4841]: I0313 09:54:49.995088 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:54:49 crc kubenswrapper[4841]: E0313 09:54:49.996590 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 09:55:04 crc kubenswrapper[4841]: I0313 09:55:04.710923 4841 scope.go:117] "RemoveContainer" containerID="0333bc38be1b72015dc29b4c2dff0d5fdea6a77eb0a839107294b58b784d8fb0" Mar 13 09:55:04 crc kubenswrapper[4841]: I0313 09:55:04.995746 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:55:05 crc kubenswrapper[4841]: I0313 09:55:05.624766 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" 
event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerStarted","Data":"356a0566b4669fa620204281729f9d4a5c82961594bf8afbfe26e440c8bc1ad1"} Mar 13 09:55:39 crc kubenswrapper[4841]: E0313 09:55:39.546705 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bdd30cd_856d_44bd_8a1f_b68c7291b0ac.slice/crio-conmon-f291cc8f3619463595d99653d091543b8678d35ff4951eaefceb6daa25b48390.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bdd30cd_856d_44bd_8a1f_b68c7291b0ac.slice/crio-f291cc8f3619463595d99653d091543b8678d35ff4951eaefceb6daa25b48390.scope\": RecentStats: unable to find data in memory cache]" Mar 13 09:55:39 crc kubenswrapper[4841]: I0313 09:55:39.979624 4841 generic.go:334] "Generic (PLEG): container finished" podID="2bdd30cd-856d-44bd-8a1f-b68c7291b0ac" containerID="f291cc8f3619463595d99653d091543b8678d35ff4951eaefceb6daa25b48390" exitCode=0 Mar 13 09:55:39 crc kubenswrapper[4841]: I0313 09:55:39.979693 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" event={"ID":"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac","Type":"ContainerDied","Data":"f291cc8f3619463595d99653d091543b8678d35ff4951eaefceb6daa25b48390"} Mar 13 09:55:41 crc kubenswrapper[4841]: I0313 09:55:41.411501 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:55:41 crc kubenswrapper[4841]: I0313 09:55:41.506413 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2npkd\" (UniqueName: \"kubernetes.io/projected/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-kube-api-access-2npkd\") pod \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " Mar 13 09:55:41 crc kubenswrapper[4841]: I0313 09:55:41.506537 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ceilometer-compute-config-data-0\") pod \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " Mar 13 09:55:41 crc kubenswrapper[4841]: I0313 09:55:41.506656 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ceilometer-compute-config-data-1\") pod \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " Mar 13 09:55:41 crc kubenswrapper[4841]: I0313 09:55:41.506721 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-inventory\") pod \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " Mar 13 09:55:41 crc kubenswrapper[4841]: I0313 09:55:41.506765 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ssh-key-openstack-edpm-ipam\") pod \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " Mar 13 09:55:41 crc kubenswrapper[4841]: I0313 
09:55:41.506857 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ceilometer-compute-config-data-2\") pod \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " Mar 13 09:55:41 crc kubenswrapper[4841]: I0313 09:55:41.506906 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-telemetry-combined-ca-bundle\") pod \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\" (UID: \"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac\") " Mar 13 09:55:41 crc kubenswrapper[4841]: I0313 09:55:41.514199 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "2bdd30cd-856d-44bd-8a1f-b68c7291b0ac" (UID: "2bdd30cd-856d-44bd-8a1f-b68c7291b0ac"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:55:41 crc kubenswrapper[4841]: I0313 09:55:41.514200 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-kube-api-access-2npkd" (OuterVolumeSpecName: "kube-api-access-2npkd") pod "2bdd30cd-856d-44bd-8a1f-b68c7291b0ac" (UID: "2bdd30cd-856d-44bd-8a1f-b68c7291b0ac"). InnerVolumeSpecName "kube-api-access-2npkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:55:41 crc kubenswrapper[4841]: I0313 09:55:41.537951 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2bdd30cd-856d-44bd-8a1f-b68c7291b0ac" (UID: "2bdd30cd-856d-44bd-8a1f-b68c7291b0ac"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:55:41 crc kubenswrapper[4841]: I0313 09:55:41.538729 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "2bdd30cd-856d-44bd-8a1f-b68c7291b0ac" (UID: "2bdd30cd-856d-44bd-8a1f-b68c7291b0ac"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:55:41 crc kubenswrapper[4841]: I0313 09:55:41.539024 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-inventory" (OuterVolumeSpecName: "inventory") pod "2bdd30cd-856d-44bd-8a1f-b68c7291b0ac" (UID: "2bdd30cd-856d-44bd-8a1f-b68c7291b0ac"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:55:41 crc kubenswrapper[4841]: I0313 09:55:41.544688 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "2bdd30cd-856d-44bd-8a1f-b68c7291b0ac" (UID: "2bdd30cd-856d-44bd-8a1f-b68c7291b0ac"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:55:41 crc kubenswrapper[4841]: I0313 09:55:41.552006 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "2bdd30cd-856d-44bd-8a1f-b68c7291b0ac" (UID: "2bdd30cd-856d-44bd-8a1f-b68c7291b0ac"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:55:41 crc kubenswrapper[4841]: I0313 09:55:41.609422 4841 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 13 09:55:41 crc kubenswrapper[4841]: I0313 09:55:41.609647 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 09:55:41 crc kubenswrapper[4841]: I0313 09:55:41.609725 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 09:55:41 crc kubenswrapper[4841]: I0313 09:55:41.609799 4841 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 13 09:55:41 crc kubenswrapper[4841]: I0313 09:55:41.609900 4841 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:55:41 
crc kubenswrapper[4841]: I0313 09:55:41.609956 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2npkd\" (UniqueName: \"kubernetes.io/projected/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-kube-api-access-2npkd\") on node \"crc\" DevicePath \"\"" Mar 13 09:55:41 crc kubenswrapper[4841]: I0313 09:55:41.610016 4841 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2bdd30cd-856d-44bd-8a1f-b68c7291b0ac-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 13 09:55:42 crc kubenswrapper[4841]: I0313 09:55:42.002738 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" Mar 13 09:55:42 crc kubenswrapper[4841]: I0313 09:55:42.012296 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh" event={"ID":"2bdd30cd-856d-44bd-8a1f-b68c7291b0ac","Type":"ContainerDied","Data":"071429a657cc961f891a7f252aefdeff189007457a676c2c24b98e5413aa634e"} Mar 13 09:55:42 crc kubenswrapper[4841]: I0313 09:55:42.012359 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="071429a657cc961f891a7f252aefdeff189007457a676c2c24b98e5413aa634e" Mar 13 09:56:00 crc kubenswrapper[4841]: I0313 09:56:00.155576 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556596-nv6jl"] Mar 13 09:56:00 crc kubenswrapper[4841]: E0313 09:56:00.156599 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbcc33bb-e899-4aee-8a7f-2f97beb83543" containerName="oc" Mar 13 09:56:00 crc kubenswrapper[4841]: I0313 09:56:00.156614 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbcc33bb-e899-4aee-8a7f-2f97beb83543" containerName="oc" Mar 13 09:56:00 crc kubenswrapper[4841]: E0313 09:56:00.156629 4841 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="2bdd30cd-856d-44bd-8a1f-b68c7291b0ac" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 13 09:56:00 crc kubenswrapper[4841]: I0313 09:56:00.156640 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bdd30cd-856d-44bd-8a1f-b68c7291b0ac" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 13 09:56:00 crc kubenswrapper[4841]: I0313 09:56:00.156866 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bdd30cd-856d-44bd-8a1f-b68c7291b0ac" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 13 09:56:00 crc kubenswrapper[4841]: I0313 09:56:00.156881 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbcc33bb-e899-4aee-8a7f-2f97beb83543" containerName="oc" Mar 13 09:56:00 crc kubenswrapper[4841]: I0313 09:56:00.157615 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556596-nv6jl" Mar 13 09:56:00 crc kubenswrapper[4841]: I0313 09:56:00.160121 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 09:56:00 crc kubenswrapper[4841]: I0313 09:56:00.161684 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 09:56:00 crc kubenswrapper[4841]: I0313 09:56:00.161795 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 09:56:00 crc kubenswrapper[4841]: I0313 09:56:00.170162 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556596-nv6jl"] Mar 13 09:56:00 crc kubenswrapper[4841]: I0313 09:56:00.194590 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttqbr\" (UniqueName: \"kubernetes.io/projected/6f926f80-bd81-4cf9-991e-17c01455ee8a-kube-api-access-ttqbr\") pod \"auto-csr-approver-29556596-nv6jl\" (UID: 
\"6f926f80-bd81-4cf9-991e-17c01455ee8a\") " pod="openshift-infra/auto-csr-approver-29556596-nv6jl" Mar 13 09:56:00 crc kubenswrapper[4841]: I0313 09:56:00.296650 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttqbr\" (UniqueName: \"kubernetes.io/projected/6f926f80-bd81-4cf9-991e-17c01455ee8a-kube-api-access-ttqbr\") pod \"auto-csr-approver-29556596-nv6jl\" (UID: \"6f926f80-bd81-4cf9-991e-17c01455ee8a\") " pod="openshift-infra/auto-csr-approver-29556596-nv6jl" Mar 13 09:56:00 crc kubenswrapper[4841]: I0313 09:56:00.319527 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttqbr\" (UniqueName: \"kubernetes.io/projected/6f926f80-bd81-4cf9-991e-17c01455ee8a-kube-api-access-ttqbr\") pod \"auto-csr-approver-29556596-nv6jl\" (UID: \"6f926f80-bd81-4cf9-991e-17c01455ee8a\") " pod="openshift-infra/auto-csr-approver-29556596-nv6jl" Mar 13 09:56:00 crc kubenswrapper[4841]: I0313 09:56:00.491173 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556596-nv6jl" Mar 13 09:56:00 crc kubenswrapper[4841]: I0313 09:56:00.937341 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 09:56:00 crc kubenswrapper[4841]: I0313 09:56:00.943170 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556596-nv6jl"] Mar 13 09:56:01 crc kubenswrapper[4841]: I0313 09:56:01.193931 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556596-nv6jl" event={"ID":"6f926f80-bd81-4cf9-991e-17c01455ee8a","Type":"ContainerStarted","Data":"fa5846b3ea08993371d8fffb51b364e058b9c2a4eef8b06dc0bee299a8f5e15f"} Mar 13 09:56:02 crc kubenswrapper[4841]: I0313 09:56:02.203381 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556596-nv6jl" event={"ID":"6f926f80-bd81-4cf9-991e-17c01455ee8a","Type":"ContainerStarted","Data":"f972b828b5e15c0256a1e19ac41c80e97d71b89ae99f857d028f4c3b0b316aee"} Mar 13 09:56:02 crc kubenswrapper[4841]: I0313 09:56:02.231816 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556596-nv6jl" podStartSLOduration=1.432682046 podStartE2EDuration="2.231787852s" podCreationTimestamp="2026-03-13 09:56:00 +0000 UTC" firstStartedPulling="2026-03-13 09:56:00.936938877 +0000 UTC m=+2643.666839078" lastFinishedPulling="2026-03-13 09:56:01.736044703 +0000 UTC m=+2644.465944884" observedRunningTime="2026-03-13 09:56:02.21956914 +0000 UTC m=+2644.949469351" watchObservedRunningTime="2026-03-13 09:56:02.231787852 +0000 UTC m=+2644.961688063" Mar 13 09:56:03 crc kubenswrapper[4841]: I0313 09:56:03.216314 4841 generic.go:334] "Generic (PLEG): container finished" podID="6f926f80-bd81-4cf9-991e-17c01455ee8a" containerID="f972b828b5e15c0256a1e19ac41c80e97d71b89ae99f857d028f4c3b0b316aee" exitCode=0 Mar 13 09:56:03 crc kubenswrapper[4841]: 
I0313 09:56:03.216771 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556596-nv6jl" event={"ID":"6f926f80-bd81-4cf9-991e-17c01455ee8a","Type":"ContainerDied","Data":"f972b828b5e15c0256a1e19ac41c80e97d71b89ae99f857d028f4c3b0b316aee"} Mar 13 09:56:04 crc kubenswrapper[4841]: I0313 09:56:04.653713 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556596-nv6jl" Mar 13 09:56:04 crc kubenswrapper[4841]: I0313 09:56:04.687779 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttqbr\" (UniqueName: \"kubernetes.io/projected/6f926f80-bd81-4cf9-991e-17c01455ee8a-kube-api-access-ttqbr\") pod \"6f926f80-bd81-4cf9-991e-17c01455ee8a\" (UID: \"6f926f80-bd81-4cf9-991e-17c01455ee8a\") " Mar 13 09:56:04 crc kubenswrapper[4841]: I0313 09:56:04.727778 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f926f80-bd81-4cf9-991e-17c01455ee8a-kube-api-access-ttqbr" (OuterVolumeSpecName: "kube-api-access-ttqbr") pod "6f926f80-bd81-4cf9-991e-17c01455ee8a" (UID: "6f926f80-bd81-4cf9-991e-17c01455ee8a"). InnerVolumeSpecName "kube-api-access-ttqbr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:56:04 crc kubenswrapper[4841]: I0313 09:56:04.790503 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttqbr\" (UniqueName: \"kubernetes.io/projected/6f926f80-bd81-4cf9-991e-17c01455ee8a-kube-api-access-ttqbr\") on node \"crc\" DevicePath \"\"" Mar 13 09:56:05 crc kubenswrapper[4841]: I0313 09:56:05.236863 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556596-nv6jl" event={"ID":"6f926f80-bd81-4cf9-991e-17c01455ee8a","Type":"ContainerDied","Data":"fa5846b3ea08993371d8fffb51b364e058b9c2a4eef8b06dc0bee299a8f5e15f"} Mar 13 09:56:05 crc kubenswrapper[4841]: I0313 09:56:05.236898 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556596-nv6jl" Mar 13 09:56:05 crc kubenswrapper[4841]: I0313 09:56:05.236916 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa5846b3ea08993371d8fffb51b364e058b9c2a4eef8b06dc0bee299a8f5e15f" Mar 13 09:56:05 crc kubenswrapper[4841]: I0313 09:56:05.299401 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556590-kkm25"] Mar 13 09:56:05 crc kubenswrapper[4841]: I0313 09:56:05.307917 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556590-kkm25"] Mar 13 09:56:06 crc kubenswrapper[4841]: I0313 09:56:06.008668 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21765abc-b94f-4650-8aaf-3a917c3f655d" path="/var/lib/kubelet/pods/21765abc-b94f-4650-8aaf-3a917c3f655d/volumes" Mar 13 09:56:42 crc kubenswrapper[4841]: I0313 09:56:42.244784 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j7vxw"] Mar 13 09:56:42 crc kubenswrapper[4841]: E0313 09:56:42.248761 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6f926f80-bd81-4cf9-991e-17c01455ee8a" containerName="oc" Mar 13 09:56:42 crc kubenswrapper[4841]: I0313 09:56:42.248789 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f926f80-bd81-4cf9-991e-17c01455ee8a" containerName="oc" Mar 13 09:56:42 crc kubenswrapper[4841]: I0313 09:56:42.249113 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f926f80-bd81-4cf9-991e-17c01455ee8a" containerName="oc" Mar 13 09:56:42 crc kubenswrapper[4841]: I0313 09:56:42.251416 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j7vxw" Mar 13 09:56:42 crc kubenswrapper[4841]: I0313 09:56:42.261153 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j7vxw"] Mar 13 09:56:42 crc kubenswrapper[4841]: I0313 09:56:42.379971 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9-catalog-content\") pod \"redhat-operators-j7vxw\" (UID: \"e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9\") " pod="openshift-marketplace/redhat-operators-j7vxw" Mar 13 09:56:42 crc kubenswrapper[4841]: I0313 09:56:42.380258 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gqwn\" (UniqueName: \"kubernetes.io/projected/e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9-kube-api-access-9gqwn\") pod \"redhat-operators-j7vxw\" (UID: \"e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9\") " pod="openshift-marketplace/redhat-operators-j7vxw" Mar 13 09:56:42 crc kubenswrapper[4841]: I0313 09:56:42.380480 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9-utilities\") pod \"redhat-operators-j7vxw\" (UID: \"e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9\") " 
pod="openshift-marketplace/redhat-operators-j7vxw" Mar 13 09:56:42 crc kubenswrapper[4841]: I0313 09:56:42.482365 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9-catalog-content\") pod \"redhat-operators-j7vxw\" (UID: \"e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9\") " pod="openshift-marketplace/redhat-operators-j7vxw" Mar 13 09:56:42 crc kubenswrapper[4841]: I0313 09:56:42.482408 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gqwn\" (UniqueName: \"kubernetes.io/projected/e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9-kube-api-access-9gqwn\") pod \"redhat-operators-j7vxw\" (UID: \"e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9\") " pod="openshift-marketplace/redhat-operators-j7vxw" Mar 13 09:56:42 crc kubenswrapper[4841]: I0313 09:56:42.482499 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9-utilities\") pod \"redhat-operators-j7vxw\" (UID: \"e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9\") " pod="openshift-marketplace/redhat-operators-j7vxw" Mar 13 09:56:42 crc kubenswrapper[4841]: I0313 09:56:42.482887 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9-catalog-content\") pod \"redhat-operators-j7vxw\" (UID: \"e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9\") " pod="openshift-marketplace/redhat-operators-j7vxw" Mar 13 09:56:42 crc kubenswrapper[4841]: I0313 09:56:42.482904 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9-utilities\") pod \"redhat-operators-j7vxw\" (UID: \"e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9\") " pod="openshift-marketplace/redhat-operators-j7vxw" Mar 13 09:56:42 crc 
kubenswrapper[4841]: I0313 09:56:42.504217 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gqwn\" (UniqueName: \"kubernetes.io/projected/e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9-kube-api-access-9gqwn\") pod \"redhat-operators-j7vxw\" (UID: \"e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9\") " pod="openshift-marketplace/redhat-operators-j7vxw" Mar 13 09:56:42 crc kubenswrapper[4841]: I0313 09:56:42.588470 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j7vxw" Mar 13 09:56:43 crc kubenswrapper[4841]: I0313 09:56:43.056070 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j7vxw"] Mar 13 09:56:43 crc kubenswrapper[4841]: I0313 09:56:43.645405 4841 generic.go:334] "Generic (PLEG): container finished" podID="e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9" containerID="7e5e86317c5a78a73a2a2beec151747ed1c4d8bf04941670d591d178e70517be" exitCode=0 Mar 13 09:56:43 crc kubenswrapper[4841]: I0313 09:56:43.645449 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7vxw" event={"ID":"e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9","Type":"ContainerDied","Data":"7e5e86317c5a78a73a2a2beec151747ed1c4d8bf04941670d591d178e70517be"} Mar 13 09:56:43 crc kubenswrapper[4841]: I0313 09:56:43.645477 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7vxw" event={"ID":"e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9","Type":"ContainerStarted","Data":"fc4c915a02e204372f8f09693d1506a9beda87703c5c765c3ca4b987fb25ef92"} Mar 13 09:56:44 crc kubenswrapper[4841]: I0313 09:56:44.655029 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7vxw" event={"ID":"e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9","Type":"ContainerStarted","Data":"f89384e7d8062d771bd52b456f744df0150e7bb171362db8c3f4a6af699df42d"} Mar 13 09:56:45 crc kubenswrapper[4841]: I0313 
09:56:45.668853 4841 generic.go:334] "Generic (PLEG): container finished" podID="e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9" containerID="f89384e7d8062d771bd52b456f744df0150e7bb171362db8c3f4a6af699df42d" exitCode=0 Mar 13 09:56:45 crc kubenswrapper[4841]: I0313 09:56:45.668927 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7vxw" event={"ID":"e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9","Type":"ContainerDied","Data":"f89384e7d8062d771bd52b456f744df0150e7bb171362db8c3f4a6af699df42d"} Mar 13 09:56:46 crc kubenswrapper[4841]: I0313 09:56:46.680882 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7vxw" event={"ID":"e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9","Type":"ContainerStarted","Data":"18f48dd0e1840dde04063ba0e444962bf0334d672d1fc9e64e88dadd0d28708a"} Mar 13 09:56:46 crc kubenswrapper[4841]: I0313 09:56:46.720004 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j7vxw" podStartSLOduration=2.215850048 podStartE2EDuration="4.719979336s" podCreationTimestamp="2026-03-13 09:56:42 +0000 UTC" firstStartedPulling="2026-03-13 09:56:43.648820542 +0000 UTC m=+2686.378720733" lastFinishedPulling="2026-03-13 09:56:46.15294983 +0000 UTC m=+2688.882850021" observedRunningTime="2026-03-13 09:56:46.714547096 +0000 UTC m=+2689.444447287" watchObservedRunningTime="2026-03-13 09:56:46.719979336 +0000 UTC m=+2689.449879537" Mar 13 09:56:49 crc kubenswrapper[4841]: I0313 09:56:49.627366 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bb7jq"] Mar 13 09:56:49 crc kubenswrapper[4841]: I0313 09:56:49.629638 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bb7jq" Mar 13 09:56:49 crc kubenswrapper[4841]: I0313 09:56:49.647619 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bb7jq"] Mar 13 09:56:49 crc kubenswrapper[4841]: I0313 09:56:49.747475 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e04663b3-d1aa-40e2-b936-c26eb830a33a-catalog-content\") pod \"community-operators-bb7jq\" (UID: \"e04663b3-d1aa-40e2-b936-c26eb830a33a\") " pod="openshift-marketplace/community-operators-bb7jq" Mar 13 09:56:49 crc kubenswrapper[4841]: I0313 09:56:49.747600 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e04663b3-d1aa-40e2-b936-c26eb830a33a-utilities\") pod \"community-operators-bb7jq\" (UID: \"e04663b3-d1aa-40e2-b936-c26eb830a33a\") " pod="openshift-marketplace/community-operators-bb7jq" Mar 13 09:56:49 crc kubenswrapper[4841]: I0313 09:56:49.747645 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt84n\" (UniqueName: \"kubernetes.io/projected/e04663b3-d1aa-40e2-b936-c26eb830a33a-kube-api-access-qt84n\") pod \"community-operators-bb7jq\" (UID: \"e04663b3-d1aa-40e2-b936-c26eb830a33a\") " pod="openshift-marketplace/community-operators-bb7jq" Mar 13 09:56:49 crc kubenswrapper[4841]: I0313 09:56:49.849342 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e04663b3-d1aa-40e2-b936-c26eb830a33a-utilities\") pod \"community-operators-bb7jq\" (UID: \"e04663b3-d1aa-40e2-b936-c26eb830a33a\") " pod="openshift-marketplace/community-operators-bb7jq" Mar 13 09:56:49 crc kubenswrapper[4841]: I0313 09:56:49.849413 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qt84n\" (UniqueName: \"kubernetes.io/projected/e04663b3-d1aa-40e2-b936-c26eb830a33a-kube-api-access-qt84n\") pod \"community-operators-bb7jq\" (UID: \"e04663b3-d1aa-40e2-b936-c26eb830a33a\") " pod="openshift-marketplace/community-operators-bb7jq" Mar 13 09:56:49 crc kubenswrapper[4841]: I0313 09:56:49.849517 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e04663b3-d1aa-40e2-b936-c26eb830a33a-catalog-content\") pod \"community-operators-bb7jq\" (UID: \"e04663b3-d1aa-40e2-b936-c26eb830a33a\") " pod="openshift-marketplace/community-operators-bb7jq" Mar 13 09:56:49 crc kubenswrapper[4841]: I0313 09:56:49.849982 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e04663b3-d1aa-40e2-b936-c26eb830a33a-catalog-content\") pod \"community-operators-bb7jq\" (UID: \"e04663b3-d1aa-40e2-b936-c26eb830a33a\") " pod="openshift-marketplace/community-operators-bb7jq" Mar 13 09:56:49 crc kubenswrapper[4841]: I0313 09:56:49.850203 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e04663b3-d1aa-40e2-b936-c26eb830a33a-utilities\") pod \"community-operators-bb7jq\" (UID: \"e04663b3-d1aa-40e2-b936-c26eb830a33a\") " pod="openshift-marketplace/community-operators-bb7jq" Mar 13 09:56:49 crc kubenswrapper[4841]: I0313 09:56:49.872338 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt84n\" (UniqueName: \"kubernetes.io/projected/e04663b3-d1aa-40e2-b936-c26eb830a33a-kube-api-access-qt84n\") pod \"community-operators-bb7jq\" (UID: \"e04663b3-d1aa-40e2-b936-c26eb830a33a\") " pod="openshift-marketplace/community-operators-bb7jq" Mar 13 09:56:49 crc kubenswrapper[4841]: I0313 09:56:49.979381 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bb7jq" Mar 13 09:56:50 crc kubenswrapper[4841]: I0313 09:56:50.538326 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bb7jq"] Mar 13 09:56:50 crc kubenswrapper[4841]: I0313 09:56:50.710669 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bb7jq" event={"ID":"e04663b3-d1aa-40e2-b936-c26eb830a33a","Type":"ContainerStarted","Data":"31132ef917a817cf70e70fc10d5a60e72d06ba44aad64f834cbc032f6453eac0"} Mar 13 09:56:51 crc kubenswrapper[4841]: I0313 09:56:51.721775 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bb7jq" event={"ID":"e04663b3-d1aa-40e2-b936-c26eb830a33a","Type":"ContainerStarted","Data":"655b3e60486efdf227c8cac3f17c62922982a75847f0dc7f0d61eff49d343657"} Mar 13 09:56:52 crc kubenswrapper[4841]: I0313 09:56:52.589169 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j7vxw" Mar 13 09:56:52 crc kubenswrapper[4841]: I0313 09:56:52.589530 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j7vxw" Mar 13 09:56:52 crc kubenswrapper[4841]: I0313 09:56:52.735196 4841 generic.go:334] "Generic (PLEG): container finished" podID="e04663b3-d1aa-40e2-b936-c26eb830a33a" containerID="655b3e60486efdf227c8cac3f17c62922982a75847f0dc7f0d61eff49d343657" exitCode=0 Mar 13 09:56:52 crc kubenswrapper[4841]: I0313 09:56:52.735305 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bb7jq" event={"ID":"e04663b3-d1aa-40e2-b936-c26eb830a33a","Type":"ContainerDied","Data":"655b3e60486efdf227c8cac3f17c62922982a75847f0dc7f0d61eff49d343657"} Mar 13 09:56:53 crc kubenswrapper[4841]: I0313 09:56:53.635979 4841 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-j7vxw" podUID="e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9" containerName="registry-server" probeResult="failure" output=< Mar 13 09:56:53 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s Mar 13 09:56:53 crc kubenswrapper[4841]: > Mar 13 09:56:53 crc kubenswrapper[4841]: I0313 09:56:53.749040 4841 generic.go:334] "Generic (PLEG): container finished" podID="e04663b3-d1aa-40e2-b936-c26eb830a33a" containerID="6768fdc1e3a69a38649dbdd2988a9048c0bbe9a43161d86f52729bad423a772b" exitCode=0 Mar 13 09:56:53 crc kubenswrapper[4841]: I0313 09:56:53.749078 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bb7jq" event={"ID":"e04663b3-d1aa-40e2-b936-c26eb830a33a","Type":"ContainerDied","Data":"6768fdc1e3a69a38649dbdd2988a9048c0bbe9a43161d86f52729bad423a772b"} Mar 13 09:56:54 crc kubenswrapper[4841]: I0313 09:56:54.762912 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bb7jq" event={"ID":"e04663b3-d1aa-40e2-b936-c26eb830a33a","Type":"ContainerStarted","Data":"b417596e43e4f84904f5911ffaa90b5ad1516ffc84a0c835799c1502c420d967"} Mar 13 09:56:54 crc kubenswrapper[4841]: I0313 09:56:54.787439 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bb7jq" podStartSLOduration=4.381386985 podStartE2EDuration="5.787420528s" podCreationTimestamp="2026-03-13 09:56:49 +0000 UTC" firstStartedPulling="2026-03-13 09:56:52.737070065 +0000 UTC m=+2695.466970256" lastFinishedPulling="2026-03-13 09:56:54.143103588 +0000 UTC m=+2696.873003799" observedRunningTime="2026-03-13 09:56:54.778457087 +0000 UTC m=+2697.508357308" watchObservedRunningTime="2026-03-13 09:56:54.787420528 +0000 UTC m=+2697.517320719" Mar 13 09:56:59 crc kubenswrapper[4841]: I0313 09:56:59.980399 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-bb7jq" Mar 13 09:56:59 crc kubenswrapper[4841]: I0313 09:56:59.982382 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bb7jq" Mar 13 09:57:00 crc kubenswrapper[4841]: I0313 09:57:00.031708 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bb7jq" Mar 13 09:57:00 crc kubenswrapper[4841]: I0313 09:57:00.856612 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bb7jq" Mar 13 09:57:00 crc kubenswrapper[4841]: I0313 09:57:00.899558 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bb7jq"] Mar 13 09:57:02 crc kubenswrapper[4841]: I0313 09:57:02.630354 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j7vxw" Mar 13 09:57:02 crc kubenswrapper[4841]: I0313 09:57:02.679379 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j7vxw" Mar 13 09:57:02 crc kubenswrapper[4841]: I0313 09:57:02.833981 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bb7jq" podUID="e04663b3-d1aa-40e2-b936-c26eb830a33a" containerName="registry-server" containerID="cri-o://b417596e43e4f84904f5911ffaa90b5ad1516ffc84a0c835799c1502c420d967" gracePeriod=2 Mar 13 09:57:03 crc kubenswrapper[4841]: I0313 09:57:03.276439 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bb7jq" Mar 13 09:57:03 crc kubenswrapper[4841]: I0313 09:57:03.413886 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt84n\" (UniqueName: \"kubernetes.io/projected/e04663b3-d1aa-40e2-b936-c26eb830a33a-kube-api-access-qt84n\") pod \"e04663b3-d1aa-40e2-b936-c26eb830a33a\" (UID: \"e04663b3-d1aa-40e2-b936-c26eb830a33a\") " Mar 13 09:57:03 crc kubenswrapper[4841]: I0313 09:57:03.414033 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e04663b3-d1aa-40e2-b936-c26eb830a33a-utilities\") pod \"e04663b3-d1aa-40e2-b936-c26eb830a33a\" (UID: \"e04663b3-d1aa-40e2-b936-c26eb830a33a\") " Mar 13 09:57:03 crc kubenswrapper[4841]: I0313 09:57:03.414152 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e04663b3-d1aa-40e2-b936-c26eb830a33a-catalog-content\") pod \"e04663b3-d1aa-40e2-b936-c26eb830a33a\" (UID: \"e04663b3-d1aa-40e2-b936-c26eb830a33a\") " Mar 13 09:57:03 crc kubenswrapper[4841]: I0313 09:57:03.414933 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e04663b3-d1aa-40e2-b936-c26eb830a33a-utilities" (OuterVolumeSpecName: "utilities") pod "e04663b3-d1aa-40e2-b936-c26eb830a33a" (UID: "e04663b3-d1aa-40e2-b936-c26eb830a33a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:57:03 crc kubenswrapper[4841]: I0313 09:57:03.419883 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e04663b3-d1aa-40e2-b936-c26eb830a33a-kube-api-access-qt84n" (OuterVolumeSpecName: "kube-api-access-qt84n") pod "e04663b3-d1aa-40e2-b936-c26eb830a33a" (UID: "e04663b3-d1aa-40e2-b936-c26eb830a33a"). InnerVolumeSpecName "kube-api-access-qt84n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:57:03 crc kubenswrapper[4841]: I0313 09:57:03.476616 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e04663b3-d1aa-40e2-b936-c26eb830a33a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e04663b3-d1aa-40e2-b936-c26eb830a33a" (UID: "e04663b3-d1aa-40e2-b936-c26eb830a33a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:57:03 crc kubenswrapper[4841]: I0313 09:57:03.516876 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt84n\" (UniqueName: \"kubernetes.io/projected/e04663b3-d1aa-40e2-b936-c26eb830a33a-kube-api-access-qt84n\") on node \"crc\" DevicePath \"\"" Mar 13 09:57:03 crc kubenswrapper[4841]: I0313 09:57:03.516918 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e04663b3-d1aa-40e2-b936-c26eb830a33a-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:57:03 crc kubenswrapper[4841]: I0313 09:57:03.516932 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e04663b3-d1aa-40e2-b936-c26eb830a33a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:57:03 crc kubenswrapper[4841]: I0313 09:57:03.843690 4841 generic.go:334] "Generic (PLEG): container finished" podID="e04663b3-d1aa-40e2-b936-c26eb830a33a" containerID="b417596e43e4f84904f5911ffaa90b5ad1516ffc84a0c835799c1502c420d967" exitCode=0 Mar 13 09:57:03 crc kubenswrapper[4841]: I0313 09:57:03.843729 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bb7jq" event={"ID":"e04663b3-d1aa-40e2-b936-c26eb830a33a","Type":"ContainerDied","Data":"b417596e43e4f84904f5911ffaa90b5ad1516ffc84a0c835799c1502c420d967"} Mar 13 09:57:03 crc kubenswrapper[4841]: I0313 09:57:03.843751 4841 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-bb7jq" Mar 13 09:57:03 crc kubenswrapper[4841]: I0313 09:57:03.843771 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bb7jq" event={"ID":"e04663b3-d1aa-40e2-b936-c26eb830a33a","Type":"ContainerDied","Data":"31132ef917a817cf70e70fc10d5a60e72d06ba44aad64f834cbc032f6453eac0"} Mar 13 09:57:03 crc kubenswrapper[4841]: I0313 09:57:03.843788 4841 scope.go:117] "RemoveContainer" containerID="b417596e43e4f84904f5911ffaa90b5ad1516ffc84a0c835799c1502c420d967" Mar 13 09:57:03 crc kubenswrapper[4841]: I0313 09:57:03.876299 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j7vxw"] Mar 13 09:57:03 crc kubenswrapper[4841]: I0313 09:57:03.876777 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j7vxw" podUID="e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9" containerName="registry-server" containerID="cri-o://18f48dd0e1840dde04063ba0e444962bf0334d672d1fc9e64e88dadd0d28708a" gracePeriod=2 Mar 13 09:57:03 crc kubenswrapper[4841]: I0313 09:57:03.891984 4841 scope.go:117] "RemoveContainer" containerID="6768fdc1e3a69a38649dbdd2988a9048c0bbe9a43161d86f52729bad423a772b" Mar 13 09:57:03 crc kubenswrapper[4841]: I0313 09:57:03.892304 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bb7jq"] Mar 13 09:57:03 crc kubenswrapper[4841]: I0313 09:57:03.906085 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bb7jq"] Mar 13 09:57:03 crc kubenswrapper[4841]: I0313 09:57:03.913426 4841 scope.go:117] "RemoveContainer" containerID="655b3e60486efdf227c8cac3f17c62922982a75847f0dc7f0d61eff49d343657" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.006380 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e04663b3-d1aa-40e2-b936-c26eb830a33a" path="/var/lib/kubelet/pods/e04663b3-d1aa-40e2-b936-c26eb830a33a/volumes" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.077169 4841 scope.go:117] "RemoveContainer" containerID="b417596e43e4f84904f5911ffaa90b5ad1516ffc84a0c835799c1502c420d967" Mar 13 09:57:04 crc kubenswrapper[4841]: E0313 09:57:04.079702 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b417596e43e4f84904f5911ffaa90b5ad1516ffc84a0c835799c1502c420d967\": container with ID starting with b417596e43e4f84904f5911ffaa90b5ad1516ffc84a0c835799c1502c420d967 not found: ID does not exist" containerID="b417596e43e4f84904f5911ffaa90b5ad1516ffc84a0c835799c1502c420d967" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.079735 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b417596e43e4f84904f5911ffaa90b5ad1516ffc84a0c835799c1502c420d967"} err="failed to get container status \"b417596e43e4f84904f5911ffaa90b5ad1516ffc84a0c835799c1502c420d967\": rpc error: code = NotFound desc = could not find container \"b417596e43e4f84904f5911ffaa90b5ad1516ffc84a0c835799c1502c420d967\": container with ID starting with b417596e43e4f84904f5911ffaa90b5ad1516ffc84a0c835799c1502c420d967 not found: ID does not exist" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.079764 4841 scope.go:117] "RemoveContainer" containerID="6768fdc1e3a69a38649dbdd2988a9048c0bbe9a43161d86f52729bad423a772b" Mar 13 09:57:04 crc kubenswrapper[4841]: E0313 09:57:04.080133 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6768fdc1e3a69a38649dbdd2988a9048c0bbe9a43161d86f52729bad423a772b\": container with ID starting with 6768fdc1e3a69a38649dbdd2988a9048c0bbe9a43161d86f52729bad423a772b not found: ID does not exist" 
containerID="6768fdc1e3a69a38649dbdd2988a9048c0bbe9a43161d86f52729bad423a772b" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.080166 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6768fdc1e3a69a38649dbdd2988a9048c0bbe9a43161d86f52729bad423a772b"} err="failed to get container status \"6768fdc1e3a69a38649dbdd2988a9048c0bbe9a43161d86f52729bad423a772b\": rpc error: code = NotFound desc = could not find container \"6768fdc1e3a69a38649dbdd2988a9048c0bbe9a43161d86f52729bad423a772b\": container with ID starting with 6768fdc1e3a69a38649dbdd2988a9048c0bbe9a43161d86f52729bad423a772b not found: ID does not exist" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.080188 4841 scope.go:117] "RemoveContainer" containerID="655b3e60486efdf227c8cac3f17c62922982a75847f0dc7f0d61eff49d343657" Mar 13 09:57:04 crc kubenswrapper[4841]: E0313 09:57:04.080427 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"655b3e60486efdf227c8cac3f17c62922982a75847f0dc7f0d61eff49d343657\": container with ID starting with 655b3e60486efdf227c8cac3f17c62922982a75847f0dc7f0d61eff49d343657 not found: ID does not exist" containerID="655b3e60486efdf227c8cac3f17c62922982a75847f0dc7f0d61eff49d343657" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.080454 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"655b3e60486efdf227c8cac3f17c62922982a75847f0dc7f0d61eff49d343657"} err="failed to get container status \"655b3e60486efdf227c8cac3f17c62922982a75847f0dc7f0d61eff49d343657\": rpc error: code = NotFound desc = could not find container \"655b3e60486efdf227c8cac3f17c62922982a75847f0dc7f0d61eff49d343657\": container with ID starting with 655b3e60486efdf227c8cac3f17c62922982a75847f0dc7f0d61eff49d343657 not found: ID does not exist" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.333579 4841 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j7vxw" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.435219 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9-catalog-content\") pod \"e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9\" (UID: \"e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9\") " Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.435652 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gqwn\" (UniqueName: \"kubernetes.io/projected/e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9-kube-api-access-9gqwn\") pod \"e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9\" (UID: \"e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9\") " Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.435998 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9-utilities\") pod \"e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9\" (UID: \"e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9\") " Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.436639 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9-utilities" (OuterVolumeSpecName: "utilities") pod "e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9" (UID: "e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.440982 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9-kube-api-access-9gqwn" (OuterVolumeSpecName: "kube-api-access-9gqwn") pod "e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9" (UID: "e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9"). 
InnerVolumeSpecName "kube-api-access-9gqwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.539588 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.539766 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gqwn\" (UniqueName: \"kubernetes.io/projected/e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9-kube-api-access-9gqwn\") on node \"crc\" DevicePath \"\"" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.564606 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9" (UID: "e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.641796 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.810924 4841 scope.go:117] "RemoveContainer" containerID="365c91209487e7b880cac1843eff8258eea0bd4895e662608043e84e222444c3" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.856497 4841 generic.go:334] "Generic (PLEG): container finished" podID="e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9" containerID="18f48dd0e1840dde04063ba0e444962bf0334d672d1fc9e64e88dadd0d28708a" exitCode=0 Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.856570 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7vxw" event={"ID":"e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9","Type":"ContainerDied","Data":"18f48dd0e1840dde04063ba0e444962bf0334d672d1fc9e64e88dadd0d28708a"} Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.856603 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j7vxw" event={"ID":"e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9","Type":"ContainerDied","Data":"fc4c915a02e204372f8f09693d1506a9beda87703c5c765c3ca4b987fb25ef92"} Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.856608 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j7vxw" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.856625 4841 scope.go:117] "RemoveContainer" containerID="18f48dd0e1840dde04063ba0e444962bf0334d672d1fc9e64e88dadd0d28708a" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.892849 4841 scope.go:117] "RemoveContainer" containerID="f89384e7d8062d771bd52b456f744df0150e7bb171362db8c3f4a6af699df42d" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.897901 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j7vxw"] Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.910421 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j7vxw"] Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.928054 4841 scope.go:117] "RemoveContainer" containerID="7e5e86317c5a78a73a2a2beec151747ed1c4d8bf04941670d591d178e70517be" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.946118 4841 scope.go:117] "RemoveContainer" containerID="18f48dd0e1840dde04063ba0e444962bf0334d672d1fc9e64e88dadd0d28708a" Mar 13 09:57:04 crc kubenswrapper[4841]: E0313 09:57:04.946541 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18f48dd0e1840dde04063ba0e444962bf0334d672d1fc9e64e88dadd0d28708a\": container with ID starting with 18f48dd0e1840dde04063ba0e444962bf0334d672d1fc9e64e88dadd0d28708a not found: ID does not exist" containerID="18f48dd0e1840dde04063ba0e444962bf0334d672d1fc9e64e88dadd0d28708a" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.946587 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18f48dd0e1840dde04063ba0e444962bf0334d672d1fc9e64e88dadd0d28708a"} err="failed to get container status \"18f48dd0e1840dde04063ba0e444962bf0334d672d1fc9e64e88dadd0d28708a\": rpc error: code = NotFound desc = could not find container 
\"18f48dd0e1840dde04063ba0e444962bf0334d672d1fc9e64e88dadd0d28708a\": container with ID starting with 18f48dd0e1840dde04063ba0e444962bf0334d672d1fc9e64e88dadd0d28708a not found: ID does not exist" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.946611 4841 scope.go:117] "RemoveContainer" containerID="f89384e7d8062d771bd52b456f744df0150e7bb171362db8c3f4a6af699df42d" Mar 13 09:57:04 crc kubenswrapper[4841]: E0313 09:57:04.946991 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f89384e7d8062d771bd52b456f744df0150e7bb171362db8c3f4a6af699df42d\": container with ID starting with f89384e7d8062d771bd52b456f744df0150e7bb171362db8c3f4a6af699df42d not found: ID does not exist" containerID="f89384e7d8062d771bd52b456f744df0150e7bb171362db8c3f4a6af699df42d" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.947026 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f89384e7d8062d771bd52b456f744df0150e7bb171362db8c3f4a6af699df42d"} err="failed to get container status \"f89384e7d8062d771bd52b456f744df0150e7bb171362db8c3f4a6af699df42d\": rpc error: code = NotFound desc = could not find container \"f89384e7d8062d771bd52b456f744df0150e7bb171362db8c3f4a6af699df42d\": container with ID starting with f89384e7d8062d771bd52b456f744df0150e7bb171362db8c3f4a6af699df42d not found: ID does not exist" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.947051 4841 scope.go:117] "RemoveContainer" containerID="7e5e86317c5a78a73a2a2beec151747ed1c4d8bf04941670d591d178e70517be" Mar 13 09:57:04 crc kubenswrapper[4841]: E0313 09:57:04.947372 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e5e86317c5a78a73a2a2beec151747ed1c4d8bf04941670d591d178e70517be\": container with ID starting with 7e5e86317c5a78a73a2a2beec151747ed1c4d8bf04941670d591d178e70517be not found: ID does not exist" 
containerID="7e5e86317c5a78a73a2a2beec151747ed1c4d8bf04941670d591d178e70517be" Mar 13 09:57:04 crc kubenswrapper[4841]: I0313 09:57:04.947395 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e5e86317c5a78a73a2a2beec151747ed1c4d8bf04941670d591d178e70517be"} err="failed to get container status \"7e5e86317c5a78a73a2a2beec151747ed1c4d8bf04941670d591d178e70517be\": rpc error: code = NotFound desc = could not find container \"7e5e86317c5a78a73a2a2beec151747ed1c4d8bf04941670d591d178e70517be\": container with ID starting with 7e5e86317c5a78a73a2a2beec151747ed1c4d8bf04941670d591d178e70517be not found: ID does not exist" Mar 13 09:57:06 crc kubenswrapper[4841]: I0313 09:57:06.004825 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9" path="/var/lib/kubelet/pods/e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9/volumes" Mar 13 09:57:34 crc kubenswrapper[4841]: I0313 09:57:34.407534 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:57:34 crc kubenswrapper[4841]: I0313 09:57:34.408045 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:57:37 crc kubenswrapper[4841]: I0313 09:57:37.791564 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cjvml"] Mar 13 09:57:37 crc kubenswrapper[4841]: E0313 09:57:37.792313 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e04663b3-d1aa-40e2-b936-c26eb830a33a" containerName="registry-server" Mar 13 09:57:37 crc kubenswrapper[4841]: I0313 09:57:37.792330 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e04663b3-d1aa-40e2-b936-c26eb830a33a" containerName="registry-server" Mar 13 09:57:37 crc kubenswrapper[4841]: E0313 09:57:37.792352 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9" containerName="registry-server" Mar 13 09:57:37 crc kubenswrapper[4841]: I0313 09:57:37.792360 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9" containerName="registry-server" Mar 13 09:57:37 crc kubenswrapper[4841]: E0313 09:57:37.792378 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9" containerName="extract-content" Mar 13 09:57:37 crc kubenswrapper[4841]: I0313 09:57:37.792386 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9" containerName="extract-content" Mar 13 09:57:37 crc kubenswrapper[4841]: E0313 09:57:37.792403 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e04663b3-d1aa-40e2-b936-c26eb830a33a" containerName="extract-utilities" Mar 13 09:57:37 crc kubenswrapper[4841]: I0313 09:57:37.792410 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e04663b3-d1aa-40e2-b936-c26eb830a33a" containerName="extract-utilities" Mar 13 09:57:37 crc kubenswrapper[4841]: E0313 09:57:37.792419 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e04663b3-d1aa-40e2-b936-c26eb830a33a" containerName="extract-content" Mar 13 09:57:37 crc kubenswrapper[4841]: I0313 09:57:37.792427 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e04663b3-d1aa-40e2-b936-c26eb830a33a" containerName="extract-content" Mar 13 09:57:37 crc kubenswrapper[4841]: E0313 09:57:37.792451 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9" containerName="extract-utilities" Mar 13 09:57:37 crc kubenswrapper[4841]: I0313 09:57:37.792460 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9" containerName="extract-utilities" Mar 13 09:57:37 crc kubenswrapper[4841]: I0313 09:57:37.792679 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6cc0f32-ca87-4885-b06b-ef10b6d6f3e9" containerName="registry-server" Mar 13 09:57:37 crc kubenswrapper[4841]: I0313 09:57:37.792699 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e04663b3-d1aa-40e2-b936-c26eb830a33a" containerName="registry-server" Mar 13 09:57:37 crc kubenswrapper[4841]: I0313 09:57:37.794207 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cjvml" Mar 13 09:57:37 crc kubenswrapper[4841]: I0313 09:57:37.830685 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cjvml"] Mar 13 09:57:37 crc kubenswrapper[4841]: I0313 09:57:37.924027 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238-catalog-content\") pod \"certified-operators-cjvml\" (UID: \"7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238\") " pod="openshift-marketplace/certified-operators-cjvml" Mar 13 09:57:37 crc kubenswrapper[4841]: I0313 09:57:37.924208 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27zjm\" (UniqueName: \"kubernetes.io/projected/7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238-kube-api-access-27zjm\") pod \"certified-operators-cjvml\" (UID: \"7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238\") " pod="openshift-marketplace/certified-operators-cjvml" Mar 13 09:57:37 crc kubenswrapper[4841]: I0313 09:57:37.924295 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238-utilities\") pod \"certified-operators-cjvml\" (UID: \"7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238\") " pod="openshift-marketplace/certified-operators-cjvml" Mar 13 09:57:38 crc kubenswrapper[4841]: I0313 09:57:38.026515 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27zjm\" (UniqueName: \"kubernetes.io/projected/7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238-kube-api-access-27zjm\") pod \"certified-operators-cjvml\" (UID: \"7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238\") " pod="openshift-marketplace/certified-operators-cjvml" Mar 13 09:57:38 crc kubenswrapper[4841]: I0313 09:57:38.026560 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238-utilities\") pod \"certified-operators-cjvml\" (UID: \"7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238\") " pod="openshift-marketplace/certified-operators-cjvml" Mar 13 09:57:38 crc kubenswrapper[4841]: I0313 09:57:38.026684 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238-catalog-content\") pod \"certified-operators-cjvml\" (UID: \"7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238\") " pod="openshift-marketplace/certified-operators-cjvml" Mar 13 09:57:38 crc kubenswrapper[4841]: I0313 09:57:38.027273 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238-utilities\") pod \"certified-operators-cjvml\" (UID: \"7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238\") " pod="openshift-marketplace/certified-operators-cjvml" Mar 13 09:57:38 crc kubenswrapper[4841]: I0313 09:57:38.027307 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238-catalog-content\") pod \"certified-operators-cjvml\" (UID: \"7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238\") " pod="openshift-marketplace/certified-operators-cjvml" Mar 13 09:57:38 crc kubenswrapper[4841]: I0313 09:57:38.046340 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27zjm\" (UniqueName: \"kubernetes.io/projected/7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238-kube-api-access-27zjm\") pod \"certified-operators-cjvml\" (UID: \"7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238\") " pod="openshift-marketplace/certified-operators-cjvml" Mar 13 09:57:38 crc kubenswrapper[4841]: I0313 09:57:38.120482 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cjvml" Mar 13 09:57:38 crc kubenswrapper[4841]: I0313 09:57:38.587149 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cjvml"] Mar 13 09:57:39 crc kubenswrapper[4841]: I0313 09:57:39.252652 4841 generic.go:334] "Generic (PLEG): container finished" podID="7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238" containerID="a6bc8fd7946dcf37ae6566009bc64a7d9c1d57970d5ae51ccb3bdb1991e648f6" exitCode=0 Mar 13 09:57:39 crc kubenswrapper[4841]: I0313 09:57:39.253450 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjvml" event={"ID":"7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238","Type":"ContainerDied","Data":"a6bc8fd7946dcf37ae6566009bc64a7d9c1d57970d5ae51ccb3bdb1991e648f6"} Mar 13 09:57:39 crc kubenswrapper[4841]: I0313 09:57:39.254690 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjvml" event={"ID":"7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238","Type":"ContainerStarted","Data":"fa7bdab29af82e97663b339fd6f52b3732d8686af37184b0b3558ffc6c0f5c2b"} Mar 13 09:57:41 crc kubenswrapper[4841]: I0313 
09:57:41.281639 4841 generic.go:334] "Generic (PLEG): container finished" podID="7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238" containerID="41307615e1fc40d3877035034cc69a1b1d517b0d57d2192f59cd5485331766ef" exitCode=0 Mar 13 09:57:41 crc kubenswrapper[4841]: I0313 09:57:41.281722 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjvml" event={"ID":"7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238","Type":"ContainerDied","Data":"41307615e1fc40d3877035034cc69a1b1d517b0d57d2192f59cd5485331766ef"} Mar 13 09:57:42 crc kubenswrapper[4841]: I0313 09:57:42.299611 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjvml" event={"ID":"7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238","Type":"ContainerStarted","Data":"78476a9654be4e1fa7969c7d0a86cad933856c1e9a7a00de32ee68e223d7779e"} Mar 13 09:57:42 crc kubenswrapper[4841]: I0313 09:57:42.331349 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cjvml" podStartSLOduration=2.84360115 podStartE2EDuration="5.331318252s" podCreationTimestamp="2026-03-13 09:57:37 +0000 UTC" firstStartedPulling="2026-03-13 09:57:39.255289046 +0000 UTC m=+2741.985189237" lastFinishedPulling="2026-03-13 09:57:41.743006148 +0000 UTC m=+2744.472906339" observedRunningTime="2026-03-13 09:57:42.321326389 +0000 UTC m=+2745.051226580" watchObservedRunningTime="2026-03-13 09:57:42.331318252 +0000 UTC m=+2745.061218453" Mar 13 09:57:48 crc kubenswrapper[4841]: I0313 09:57:48.121469 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cjvml" Mar 13 09:57:48 crc kubenswrapper[4841]: I0313 09:57:48.122129 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cjvml" Mar 13 09:57:48 crc kubenswrapper[4841]: I0313 09:57:48.191337 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/certified-operators-cjvml" Mar 13 09:57:48 crc kubenswrapper[4841]: I0313 09:57:48.460304 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cjvml" Mar 13 09:57:48 crc kubenswrapper[4841]: I0313 09:57:48.506187 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cjvml"] Mar 13 09:57:50 crc kubenswrapper[4841]: I0313 09:57:50.380382 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cjvml" podUID="7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238" containerName="registry-server" containerID="cri-o://78476a9654be4e1fa7969c7d0a86cad933856c1e9a7a00de32ee68e223d7779e" gracePeriod=2 Mar 13 09:57:50 crc kubenswrapper[4841]: I0313 09:57:50.895370 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cjvml" Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.028538 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238-catalog-content\") pod \"7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238\" (UID: \"7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238\") " Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.028597 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27zjm\" (UniqueName: \"kubernetes.io/projected/7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238-kube-api-access-27zjm\") pod \"7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238\" (UID: \"7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238\") " Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.028843 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238-utilities\") pod 
\"7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238\" (UID: \"7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238\") " Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.029867 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238-utilities" (OuterVolumeSpecName: "utilities") pod "7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238" (UID: "7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.038144 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238-kube-api-access-27zjm" (OuterVolumeSpecName: "kube-api-access-27zjm") pod "7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238" (UID: "7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238"). InnerVolumeSpecName "kube-api-access-27zjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.131218 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.131255 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27zjm\" (UniqueName: \"kubernetes.io/projected/7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238-kube-api-access-27zjm\") on node \"crc\" DevicePath \"\"" Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.370652 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238" (UID: "7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.395326 4841 generic.go:334] "Generic (PLEG): container finished" podID="7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238" containerID="78476a9654be4e1fa7969c7d0a86cad933856c1e9a7a00de32ee68e223d7779e" exitCode=0 Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.395409 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjvml" event={"ID":"7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238","Type":"ContainerDied","Data":"78476a9654be4e1fa7969c7d0a86cad933856c1e9a7a00de32ee68e223d7779e"} Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.395440 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cjvml" Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.395461 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjvml" event={"ID":"7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238","Type":"ContainerDied","Data":"fa7bdab29af82e97663b339fd6f52b3732d8686af37184b0b3558ffc6c0f5c2b"} Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.395502 4841 scope.go:117] "RemoveContainer" containerID="78476a9654be4e1fa7969c7d0a86cad933856c1e9a7a00de32ee68e223d7779e" Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.439052 4841 scope.go:117] "RemoveContainer" containerID="41307615e1fc40d3877035034cc69a1b1d517b0d57d2192f59cd5485331766ef" Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.439556 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.460112 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cjvml"] Mar 13 09:57:51 crc kubenswrapper[4841]: 
I0313 09:57:51.471427 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cjvml"] Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.501697 4841 scope.go:117] "RemoveContainer" containerID="a6bc8fd7946dcf37ae6566009bc64a7d9c1d57970d5ae51ccb3bdb1991e648f6" Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.543019 4841 scope.go:117] "RemoveContainer" containerID="78476a9654be4e1fa7969c7d0a86cad933856c1e9a7a00de32ee68e223d7779e" Mar 13 09:57:51 crc kubenswrapper[4841]: E0313 09:57:51.543697 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78476a9654be4e1fa7969c7d0a86cad933856c1e9a7a00de32ee68e223d7779e\": container with ID starting with 78476a9654be4e1fa7969c7d0a86cad933856c1e9a7a00de32ee68e223d7779e not found: ID does not exist" containerID="78476a9654be4e1fa7969c7d0a86cad933856c1e9a7a00de32ee68e223d7779e" Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.543727 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78476a9654be4e1fa7969c7d0a86cad933856c1e9a7a00de32ee68e223d7779e"} err="failed to get container status \"78476a9654be4e1fa7969c7d0a86cad933856c1e9a7a00de32ee68e223d7779e\": rpc error: code = NotFound desc = could not find container \"78476a9654be4e1fa7969c7d0a86cad933856c1e9a7a00de32ee68e223d7779e\": container with ID starting with 78476a9654be4e1fa7969c7d0a86cad933856c1e9a7a00de32ee68e223d7779e not found: ID does not exist" Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.543749 4841 scope.go:117] "RemoveContainer" containerID="41307615e1fc40d3877035034cc69a1b1d517b0d57d2192f59cd5485331766ef" Mar 13 09:57:51 crc kubenswrapper[4841]: E0313 09:57:51.544498 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41307615e1fc40d3877035034cc69a1b1d517b0d57d2192f59cd5485331766ef\": container 
with ID starting with 41307615e1fc40d3877035034cc69a1b1d517b0d57d2192f59cd5485331766ef not found: ID does not exist" containerID="41307615e1fc40d3877035034cc69a1b1d517b0d57d2192f59cd5485331766ef" Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.544520 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41307615e1fc40d3877035034cc69a1b1d517b0d57d2192f59cd5485331766ef"} err="failed to get container status \"41307615e1fc40d3877035034cc69a1b1d517b0d57d2192f59cd5485331766ef\": rpc error: code = NotFound desc = could not find container \"41307615e1fc40d3877035034cc69a1b1d517b0d57d2192f59cd5485331766ef\": container with ID starting with 41307615e1fc40d3877035034cc69a1b1d517b0d57d2192f59cd5485331766ef not found: ID does not exist" Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.544532 4841 scope.go:117] "RemoveContainer" containerID="a6bc8fd7946dcf37ae6566009bc64a7d9c1d57970d5ae51ccb3bdb1991e648f6" Mar 13 09:57:51 crc kubenswrapper[4841]: E0313 09:57:51.545018 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6bc8fd7946dcf37ae6566009bc64a7d9c1d57970d5ae51ccb3bdb1991e648f6\": container with ID starting with a6bc8fd7946dcf37ae6566009bc64a7d9c1d57970d5ae51ccb3bdb1991e648f6 not found: ID does not exist" containerID="a6bc8fd7946dcf37ae6566009bc64a7d9c1d57970d5ae51ccb3bdb1991e648f6" Mar 13 09:57:51 crc kubenswrapper[4841]: I0313 09:57:51.545035 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6bc8fd7946dcf37ae6566009bc64a7d9c1d57970d5ae51ccb3bdb1991e648f6"} err="failed to get container status \"a6bc8fd7946dcf37ae6566009bc64a7d9c1d57970d5ae51ccb3bdb1991e648f6\": rpc error: code = NotFound desc = could not find container \"a6bc8fd7946dcf37ae6566009bc64a7d9c1d57970d5ae51ccb3bdb1991e648f6\": container with ID starting with a6bc8fd7946dcf37ae6566009bc64a7d9c1d57970d5ae51ccb3bdb1991e648f6 not 
found: ID does not exist" Mar 13 09:57:52 crc kubenswrapper[4841]: I0313 09:57:52.015124 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238" path="/var/lib/kubelet/pods/7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238/volumes" Mar 13 09:58:00 crc kubenswrapper[4841]: I0313 09:58:00.160373 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556598-r26f7"] Mar 13 09:58:00 crc kubenswrapper[4841]: E0313 09:58:00.162008 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238" containerName="registry-server" Mar 13 09:58:00 crc kubenswrapper[4841]: I0313 09:58:00.162028 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238" containerName="registry-server" Mar 13 09:58:00 crc kubenswrapper[4841]: E0313 09:58:00.162395 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238" containerName="extract-content" Mar 13 09:58:00 crc kubenswrapper[4841]: I0313 09:58:00.162463 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238" containerName="extract-content" Mar 13 09:58:00 crc kubenswrapper[4841]: E0313 09:58:00.162520 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238" containerName="extract-utilities" Mar 13 09:58:00 crc kubenswrapper[4841]: I0313 09:58:00.162545 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238" containerName="extract-utilities" Mar 13 09:58:00 crc kubenswrapper[4841]: I0313 09:58:00.163194 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c2e6869-2ac4-4f01-b1ec-fbd8d0d93238" containerName="registry-server" Mar 13 09:58:00 crc kubenswrapper[4841]: I0313 09:58:00.164153 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556598-r26f7" Mar 13 09:58:00 crc kubenswrapper[4841]: I0313 09:58:00.167816 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 09:58:00 crc kubenswrapper[4841]: I0313 09:58:00.168055 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 09:58:00 crc kubenswrapper[4841]: I0313 09:58:00.168063 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 09:58:00 crc kubenswrapper[4841]: I0313 09:58:00.186159 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556598-r26f7"] Mar 13 09:58:00 crc kubenswrapper[4841]: I0313 09:58:00.319251 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk9rp\" (UniqueName: \"kubernetes.io/projected/c91b4ea0-e020-4088-ad70-aa8b6d6a7c15-kube-api-access-xk9rp\") pod \"auto-csr-approver-29556598-r26f7\" (UID: \"c91b4ea0-e020-4088-ad70-aa8b6d6a7c15\") " pod="openshift-infra/auto-csr-approver-29556598-r26f7" Mar 13 09:58:00 crc kubenswrapper[4841]: I0313 09:58:00.421013 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk9rp\" (UniqueName: \"kubernetes.io/projected/c91b4ea0-e020-4088-ad70-aa8b6d6a7c15-kube-api-access-xk9rp\") pod \"auto-csr-approver-29556598-r26f7\" (UID: \"c91b4ea0-e020-4088-ad70-aa8b6d6a7c15\") " pod="openshift-infra/auto-csr-approver-29556598-r26f7" Mar 13 09:58:00 crc kubenswrapper[4841]: I0313 09:58:00.447338 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk9rp\" (UniqueName: \"kubernetes.io/projected/c91b4ea0-e020-4088-ad70-aa8b6d6a7c15-kube-api-access-xk9rp\") pod \"auto-csr-approver-29556598-r26f7\" (UID: \"c91b4ea0-e020-4088-ad70-aa8b6d6a7c15\") " 
pod="openshift-infra/auto-csr-approver-29556598-r26f7" Mar 13 09:58:00 crc kubenswrapper[4841]: I0313 09:58:00.484356 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556598-r26f7" Mar 13 09:58:00 crc kubenswrapper[4841]: I0313 09:58:00.965414 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556598-r26f7"] Mar 13 09:58:01 crc kubenswrapper[4841]: I0313 09:58:01.495974 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556598-r26f7" event={"ID":"c91b4ea0-e020-4088-ad70-aa8b6d6a7c15","Type":"ContainerStarted","Data":"fc8b5d6ba6c2b0a92544edc26e5e555da00c297dc7e579c521628af38ec740d8"} Mar 13 09:58:02 crc kubenswrapper[4841]: I0313 09:58:02.509498 4841 generic.go:334] "Generic (PLEG): container finished" podID="c91b4ea0-e020-4088-ad70-aa8b6d6a7c15" containerID="be07db9d729e7d306a78aac9d35d087678f75114132b0f88a705b1aa5d17a28a" exitCode=0 Mar 13 09:58:02 crc kubenswrapper[4841]: I0313 09:58:02.509603 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556598-r26f7" event={"ID":"c91b4ea0-e020-4088-ad70-aa8b6d6a7c15","Type":"ContainerDied","Data":"be07db9d729e7d306a78aac9d35d087678f75114132b0f88a705b1aa5d17a28a"} Mar 13 09:58:03 crc kubenswrapper[4841]: I0313 09:58:03.815749 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556598-r26f7" Mar 13 09:58:03 crc kubenswrapper[4841]: I0313 09:58:03.889241 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk9rp\" (UniqueName: \"kubernetes.io/projected/c91b4ea0-e020-4088-ad70-aa8b6d6a7c15-kube-api-access-xk9rp\") pod \"c91b4ea0-e020-4088-ad70-aa8b6d6a7c15\" (UID: \"c91b4ea0-e020-4088-ad70-aa8b6d6a7c15\") " Mar 13 09:58:03 crc kubenswrapper[4841]: I0313 09:58:03.896604 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c91b4ea0-e020-4088-ad70-aa8b6d6a7c15-kube-api-access-xk9rp" (OuterVolumeSpecName: "kube-api-access-xk9rp") pod "c91b4ea0-e020-4088-ad70-aa8b6d6a7c15" (UID: "c91b4ea0-e020-4088-ad70-aa8b6d6a7c15"). InnerVolumeSpecName "kube-api-access-xk9rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:58:03 crc kubenswrapper[4841]: I0313 09:58:03.992135 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk9rp\" (UniqueName: \"kubernetes.io/projected/c91b4ea0-e020-4088-ad70-aa8b6d6a7c15-kube-api-access-xk9rp\") on node \"crc\" DevicePath \"\"" Mar 13 09:58:04 crc kubenswrapper[4841]: I0313 09:58:04.407180 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:58:04 crc kubenswrapper[4841]: I0313 09:58:04.407298 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:58:04 crc kubenswrapper[4841]: I0313 
09:58:04.527535 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556598-r26f7" event={"ID":"c91b4ea0-e020-4088-ad70-aa8b6d6a7c15","Type":"ContainerDied","Data":"fc8b5d6ba6c2b0a92544edc26e5e555da00c297dc7e579c521628af38ec740d8"} Mar 13 09:58:04 crc kubenswrapper[4841]: I0313 09:58:04.527580 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc8b5d6ba6c2b0a92544edc26e5e555da00c297dc7e579c521628af38ec740d8" Mar 13 09:58:04 crc kubenswrapper[4841]: I0313 09:58:04.527585 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556598-r26f7" Mar 13 09:58:04 crc kubenswrapper[4841]: I0313 09:58:04.903885 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556592-25g9w"] Mar 13 09:58:04 crc kubenswrapper[4841]: I0313 09:58:04.916707 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556592-25g9w"] Mar 13 09:58:06 crc kubenswrapper[4841]: I0313 09:58:06.010401 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0591f1c-f727-44d8-96d7-0c0ce0c06d8b" path="/var/lib/kubelet/pods/a0591f1c-f727-44d8-96d7-0c0ce0c06d8b/volumes" Mar 13 09:58:32 crc kubenswrapper[4841]: I0313 09:58:32.706180 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-8485bdb9db-mf5lp_17824e5f-18b3-46c0-910a-56e5529e09c3/manager/0.log" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.407127 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.407436 4841 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.407485 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.408175 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"356a0566b4669fa620204281729f9d4a5c82961594bf8afbfe26e440c8bc1ad1"} pod="openshift-machine-config-operator/machine-config-daemon-h227v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.408232 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" containerID="cri-o://356a0566b4669fa620204281729f9d4a5c82961594bf8afbfe26e440c8bc1ad1" gracePeriod=600 Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.489640 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.489967 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="df3b0fd2-0003-42f1-b746-72231cfad7a0" containerName="openstackclient" containerID="cri-o://21f80dc2ef13fcf4a80b162545cceef4071732964b2ce78b2aa957765d00dd46" gracePeriod=2 Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.497624 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 13 09:58:34 crc 
kubenswrapper[4841]: I0313 09:58:34.534376 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 13 09:58:34 crc kubenswrapper[4841]: E0313 09:58:34.534758 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c91b4ea0-e020-4088-ad70-aa8b6d6a7c15" containerName="oc" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.534774 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c91b4ea0-e020-4088-ad70-aa8b6d6a7c15" containerName="oc" Mar 13 09:58:34 crc kubenswrapper[4841]: E0313 09:58:34.534790 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df3b0fd2-0003-42f1-b746-72231cfad7a0" containerName="openstackclient" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.534796 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="df3b0fd2-0003-42f1-b746-72231cfad7a0" containerName="openstackclient" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.535004 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c91b4ea0-e020-4088-ad70-aa8b6d6a7c15" containerName="oc" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.535019 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="df3b0fd2-0003-42f1-b746-72231cfad7a0" containerName="openstackclient" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.535612 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.549230 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s662g\" (UniqueName: \"kubernetes.io/projected/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-kube-api-access-s662g\") pod \"openstackclient\" (UID: \"aef80b2d-e277-4d2c-8b3a-a80c19f46e36\") " pod="openstack/openstackclient" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.550533 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-openstack-config-secret\") pod \"openstackclient\" (UID: \"aef80b2d-e277-4d2c-8b3a-a80c19f46e36\") " pod="openstack/openstackclient" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.550606 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-openstack-config\") pod \"openstackclient\" (UID: \"aef80b2d-e277-4d2c-8b3a-a80c19f46e36\") " pod="openstack/openstackclient" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.550690 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-combined-ca-bundle\") pod \"openstackclient\" (UID: \"aef80b2d-e277-4d2c-8b3a-a80c19f46e36\") " pod="openstack/openstackclient" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.564394 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.565157 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" 
oldPodUID="df3b0fd2-0003-42f1-b746-72231cfad7a0" podUID="aef80b2d-e277-4d2c-8b3a-a80c19f46e36" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.652566 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-openstack-config-secret\") pod \"openstackclient\" (UID: \"aef80b2d-e277-4d2c-8b3a-a80c19f46e36\") " pod="openstack/openstackclient" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.652633 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-openstack-config\") pod \"openstackclient\" (UID: \"aef80b2d-e277-4d2c-8b3a-a80c19f46e36\") " pod="openstack/openstackclient" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.652690 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-combined-ca-bundle\") pod \"openstackclient\" (UID: \"aef80b2d-e277-4d2c-8b3a-a80c19f46e36\") " pod="openstack/openstackclient" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.652791 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s662g\" (UniqueName: \"kubernetes.io/projected/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-kube-api-access-s662g\") pod \"openstackclient\" (UID: \"aef80b2d-e277-4d2c-8b3a-a80c19f46e36\") " pod="openstack/openstackclient" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.653925 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-openstack-config\") pod \"openstackclient\" (UID: \"aef80b2d-e277-4d2c-8b3a-a80c19f46e36\") " pod="openstack/openstackclient" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 
09:58:34.662960 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-openstack-config-secret\") pod \"openstackclient\" (UID: \"aef80b2d-e277-4d2c-8b3a-a80c19f46e36\") " pod="openstack/openstackclient" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.669005 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-combined-ca-bundle\") pod \"openstackclient\" (UID: \"aef80b2d-e277-4d2c-8b3a-a80c19f46e36\") " pod="openstack/openstackclient" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.671337 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s662g\" (UniqueName: \"kubernetes.io/projected/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-kube-api-access-s662g\") pod \"openstackclient\" (UID: \"aef80b2d-e277-4d2c-8b3a-a80c19f46e36\") " pod="openstack/openstackclient" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.833312 4841 generic.go:334] "Generic (PLEG): container finished" podID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerID="356a0566b4669fa620204281729f9d4a5c82961594bf8afbfe26e440c8bc1ad1" exitCode=0 Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.833367 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerDied","Data":"356a0566b4669fa620204281729f9d4a5c82961594bf8afbfe26e440c8bc1ad1"} Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.833428 4841 scope.go:117] "RemoveContainer" containerID="c9b70d2711a2752c32179f4c51015c37e7f0c38eb5252d07a992ba5528c529b5" Mar 13 09:58:34 crc kubenswrapper[4841]: I0313 09:58:34.881497 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 09:58:35 crc kubenswrapper[4841]: I0313 09:58:35.498476 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 09:58:35 crc kubenswrapper[4841]: W0313 09:58:35.502207 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaef80b2d_e277_4d2c_8b3a_a80c19f46e36.slice/crio-6cc2a5787675aa54af58b82ccac10fb8e185121b30c810615b19beb58d1b1b81 WatchSource:0}: Error finding container 6cc2a5787675aa54af58b82ccac10fb8e185121b30c810615b19beb58d1b1b81: Status 404 returned error can't find the container with id 6cc2a5787675aa54af58b82ccac10fb8e185121b30c810615b19beb58d1b1b81 Mar 13 09:58:35 crc kubenswrapper[4841]: I0313 09:58:35.714203 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-tkl8g"] Mar 13 09:58:35 crc kubenswrapper[4841]: I0313 09:58:35.715649 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-tkl8g" Mar 13 09:58:35 crc kubenswrapper[4841]: I0313 09:58:35.725668 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-tkl8g"] Mar 13 09:58:35 crc kubenswrapper[4841]: I0313 09:58:35.803559 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f-operator-scripts\") pod \"aodh-db-create-tkl8g\" (UID: \"5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f\") " pod="openstack/aodh-db-create-tkl8g" Mar 13 09:58:35 crc kubenswrapper[4841]: I0313 09:58:35.804002 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zslx7\" (UniqueName: \"kubernetes.io/projected/5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f-kube-api-access-zslx7\") pod \"aodh-db-create-tkl8g\" (UID: \"5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f\") " pod="openstack/aodh-db-create-tkl8g" Mar 13 09:58:35 crc kubenswrapper[4841]: I0313 09:58:35.818639 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-7ae0-account-create-update-qtssk"] Mar 13 09:58:35 crc kubenswrapper[4841]: I0313 09:58:35.819906 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-7ae0-account-create-update-qtssk" Mar 13 09:58:35 crc kubenswrapper[4841]: I0313 09:58:35.822129 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Mar 13 09:58:35 crc kubenswrapper[4841]: I0313 09:58:35.835840 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-7ae0-account-create-update-qtssk"] Mar 13 09:58:35 crc kubenswrapper[4841]: I0313 09:58:35.849430 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"aef80b2d-e277-4d2c-8b3a-a80c19f46e36","Type":"ContainerStarted","Data":"f2a271a45dbbddce4fc52202f72ade12d6c81716de77b7d079e05c30e483fad2"} Mar 13 09:58:35 crc kubenswrapper[4841]: I0313 09:58:35.849502 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"aef80b2d-e277-4d2c-8b3a-a80c19f46e36","Type":"ContainerStarted","Data":"6cc2a5787675aa54af58b82ccac10fb8e185121b30c810615b19beb58d1b1b81"} Mar 13 09:58:35 crc kubenswrapper[4841]: I0313 09:58:35.853077 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerStarted","Data":"e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b"} Mar 13 09:58:35 crc kubenswrapper[4841]: I0313 09:58:35.871674 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.871653408 podStartE2EDuration="1.871653408s" podCreationTimestamp="2026-03-13 09:58:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 09:58:35.863257724 +0000 UTC m=+2798.593157935" watchObservedRunningTime="2026-03-13 09:58:35.871653408 +0000 UTC m=+2798.601553599" Mar 13 09:58:35 crc kubenswrapper[4841]: I0313 09:58:35.906295 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zslx7\" (UniqueName: \"kubernetes.io/projected/5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f-kube-api-access-zslx7\") pod \"aodh-db-create-tkl8g\" (UID: \"5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f\") " pod="openstack/aodh-db-create-tkl8g" Mar 13 09:58:35 crc kubenswrapper[4841]: I0313 09:58:35.906481 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd7e92b5-60a2-4b89-94c5-97003da1aefe-operator-scripts\") pod \"aodh-7ae0-account-create-update-qtssk\" (UID: \"cd7e92b5-60a2-4b89-94c5-97003da1aefe\") " pod="openstack/aodh-7ae0-account-create-update-qtssk" Mar 13 09:58:35 crc kubenswrapper[4841]: I0313 09:58:35.906588 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f-operator-scripts\") pod \"aodh-db-create-tkl8g\" (UID: \"5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f\") " pod="openstack/aodh-db-create-tkl8g" Mar 13 09:58:35 crc kubenswrapper[4841]: I0313 09:58:35.906653 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhdw5\" (UniqueName: \"kubernetes.io/projected/cd7e92b5-60a2-4b89-94c5-97003da1aefe-kube-api-access-fhdw5\") pod \"aodh-7ae0-account-create-update-qtssk\" (UID: \"cd7e92b5-60a2-4b89-94c5-97003da1aefe\") " pod="openstack/aodh-7ae0-account-create-update-qtssk" Mar 13 09:58:35 crc kubenswrapper[4841]: I0313 09:58:35.910165 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f-operator-scripts\") pod \"aodh-db-create-tkl8g\" (UID: \"5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f\") " pod="openstack/aodh-db-create-tkl8g" Mar 13 09:58:35 crc kubenswrapper[4841]: I0313 09:58:35.929524 
4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zslx7\" (UniqueName: \"kubernetes.io/projected/5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f-kube-api-access-zslx7\") pod \"aodh-db-create-tkl8g\" (UID: \"5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f\") " pod="openstack/aodh-db-create-tkl8g" Mar 13 09:58:36 crc kubenswrapper[4841]: I0313 09:58:36.008927 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhdw5\" (UniqueName: \"kubernetes.io/projected/cd7e92b5-60a2-4b89-94c5-97003da1aefe-kube-api-access-fhdw5\") pod \"aodh-7ae0-account-create-update-qtssk\" (UID: \"cd7e92b5-60a2-4b89-94c5-97003da1aefe\") " pod="openstack/aodh-7ae0-account-create-update-qtssk" Mar 13 09:58:36 crc kubenswrapper[4841]: I0313 09:58:36.009063 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd7e92b5-60a2-4b89-94c5-97003da1aefe-operator-scripts\") pod \"aodh-7ae0-account-create-update-qtssk\" (UID: \"cd7e92b5-60a2-4b89-94c5-97003da1aefe\") " pod="openstack/aodh-7ae0-account-create-update-qtssk" Mar 13 09:58:36 crc kubenswrapper[4841]: I0313 09:58:36.010476 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd7e92b5-60a2-4b89-94c5-97003da1aefe-operator-scripts\") pod \"aodh-7ae0-account-create-update-qtssk\" (UID: \"cd7e92b5-60a2-4b89-94c5-97003da1aefe\") " pod="openstack/aodh-7ae0-account-create-update-qtssk" Mar 13 09:58:36 crc kubenswrapper[4841]: I0313 09:58:36.033984 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhdw5\" (UniqueName: \"kubernetes.io/projected/cd7e92b5-60a2-4b89-94c5-97003da1aefe-kube-api-access-fhdw5\") pod \"aodh-7ae0-account-create-update-qtssk\" (UID: \"cd7e92b5-60a2-4b89-94c5-97003da1aefe\") " pod="openstack/aodh-7ae0-account-create-update-qtssk" Mar 13 09:58:36 crc 
kubenswrapper[4841]: I0313 09:58:36.036140 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-tkl8g" Mar 13 09:58:36 crc kubenswrapper[4841]: I0313 09:58:36.143292 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-7ae0-account-create-update-qtssk" Mar 13 09:58:36 crc kubenswrapper[4841]: I0313 09:58:36.589754 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-tkl8g"] Mar 13 09:58:36 crc kubenswrapper[4841]: I0313 09:58:36.662175 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-7ae0-account-create-update-qtssk"] Mar 13 09:58:36 crc kubenswrapper[4841]: W0313 09:58:36.674514 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd7e92b5_60a2_4b89_94c5_97003da1aefe.slice/crio-77f1d2dd1b5c6afe67d9bbc13be37c2ebeaf6fc6424336e97afecc820094c702 WatchSource:0}: Error finding container 77f1d2dd1b5c6afe67d9bbc13be37c2ebeaf6fc6424336e97afecc820094c702: Status 404 returned error can't find the container with id 77f1d2dd1b5c6afe67d9bbc13be37c2ebeaf6fc6424336e97afecc820094c702 Mar 13 09:58:36 crc kubenswrapper[4841]: I0313 09:58:36.863348 4841 generic.go:334] "Generic (PLEG): container finished" podID="df3b0fd2-0003-42f1-b746-72231cfad7a0" containerID="21f80dc2ef13fcf4a80b162545cceef4071732964b2ce78b2aa957765d00dd46" exitCode=137 Mar 13 09:58:36 crc kubenswrapper[4841]: I0313 09:58:36.863423 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c651b40a189421f36bdafcb2c072b348fe0e2235150fc6da68407cf699c9a49" Mar 13 09:58:36 crc kubenswrapper[4841]: I0313 09:58:36.864958 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-7ae0-account-create-update-qtssk" 
event={"ID":"cd7e92b5-60a2-4b89-94c5-97003da1aefe","Type":"ContainerStarted","Data":"77f1d2dd1b5c6afe67d9bbc13be37c2ebeaf6fc6424336e97afecc820094c702"} Mar 13 09:58:36 crc kubenswrapper[4841]: I0313 09:58:36.866044 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-tkl8g" event={"ID":"5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f","Type":"ContainerStarted","Data":"ed36b486be8505d0bffec427c5948f40b728056d368d8a33efb34e06c2f445a5"} Mar 13 09:58:36 crc kubenswrapper[4841]: I0313 09:58:36.911500 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 09:58:36 crc kubenswrapper[4841]: I0313 09:58:36.927886 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrdsw\" (UniqueName: \"kubernetes.io/projected/df3b0fd2-0003-42f1-b746-72231cfad7a0-kube-api-access-xrdsw\") pod \"df3b0fd2-0003-42f1-b746-72231cfad7a0\" (UID: \"df3b0fd2-0003-42f1-b746-72231cfad7a0\") " Mar 13 09:58:36 crc kubenswrapper[4841]: I0313 09:58:36.928103 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3b0fd2-0003-42f1-b746-72231cfad7a0-combined-ca-bundle\") pod \"df3b0fd2-0003-42f1-b746-72231cfad7a0\" (UID: \"df3b0fd2-0003-42f1-b746-72231cfad7a0\") " Mar 13 09:58:36 crc kubenswrapper[4841]: I0313 09:58:36.928379 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/df3b0fd2-0003-42f1-b746-72231cfad7a0-openstack-config\") pod \"df3b0fd2-0003-42f1-b746-72231cfad7a0\" (UID: \"df3b0fd2-0003-42f1-b746-72231cfad7a0\") " Mar 13 09:58:36 crc kubenswrapper[4841]: I0313 09:58:36.928529 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/df3b0fd2-0003-42f1-b746-72231cfad7a0-openstack-config-secret\") pod 
\"df3b0fd2-0003-42f1-b746-72231cfad7a0\" (UID: \"df3b0fd2-0003-42f1-b746-72231cfad7a0\") " Mar 13 09:58:36 crc kubenswrapper[4841]: I0313 09:58:36.945783 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df3b0fd2-0003-42f1-b746-72231cfad7a0-kube-api-access-xrdsw" (OuterVolumeSpecName: "kube-api-access-xrdsw") pod "df3b0fd2-0003-42f1-b746-72231cfad7a0" (UID: "df3b0fd2-0003-42f1-b746-72231cfad7a0"). InnerVolumeSpecName "kube-api-access-xrdsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:58:36 crc kubenswrapper[4841]: I0313 09:58:36.981917 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df3b0fd2-0003-42f1-b746-72231cfad7a0-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "df3b0fd2-0003-42f1-b746-72231cfad7a0" (UID: "df3b0fd2-0003-42f1-b746-72231cfad7a0"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:58:36 crc kubenswrapper[4841]: I0313 09:58:36.991038 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df3b0fd2-0003-42f1-b746-72231cfad7a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df3b0fd2-0003-42f1-b746-72231cfad7a0" (UID: "df3b0fd2-0003-42f1-b746-72231cfad7a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:58:37 crc kubenswrapper[4841]: I0313 09:58:37.025025 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df3b0fd2-0003-42f1-b746-72231cfad7a0-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "df3b0fd2-0003-42f1-b746-72231cfad7a0" (UID: "df3b0fd2-0003-42f1-b746-72231cfad7a0"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:58:37 crc kubenswrapper[4841]: I0313 09:58:37.031372 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/df3b0fd2-0003-42f1-b746-72231cfad7a0-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 13 09:58:37 crc kubenswrapper[4841]: I0313 09:58:37.031407 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/df3b0fd2-0003-42f1-b746-72231cfad7a0-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 13 09:58:37 crc kubenswrapper[4841]: I0313 09:58:37.031418 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrdsw\" (UniqueName: \"kubernetes.io/projected/df3b0fd2-0003-42f1-b746-72231cfad7a0-kube-api-access-xrdsw\") on node \"crc\" DevicePath \"\"" Mar 13 09:58:37 crc kubenswrapper[4841]: I0313 09:58:37.031427 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3b0fd2-0003-42f1-b746-72231cfad7a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:58:37 crc kubenswrapper[4841]: I0313 09:58:37.881556 4841 generic.go:334] "Generic (PLEG): container finished" podID="cd7e92b5-60a2-4b89-94c5-97003da1aefe" containerID="08e94ccf8739c2711f5a40db95804705fd6bd2b3b9c7f0516f80f7dc287bdaec" exitCode=0 Mar 13 09:58:37 crc kubenswrapper[4841]: I0313 09:58:37.881670 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-7ae0-account-create-update-qtssk" event={"ID":"cd7e92b5-60a2-4b89-94c5-97003da1aefe","Type":"ContainerDied","Data":"08e94ccf8739c2711f5a40db95804705fd6bd2b3b9c7f0516f80f7dc287bdaec"} Mar 13 09:58:37 crc kubenswrapper[4841]: I0313 09:58:37.885576 4841 generic.go:334] "Generic (PLEG): container finished" podID="5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f" containerID="ae8ba7502a6814dfa6ebe4b4581589e13e4de42a7a2f6eb987cbe547da6f614f" exitCode=0 Mar 
13 09:58:37 crc kubenswrapper[4841]: I0313 09:58:37.885669 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-tkl8g" event={"ID":"5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f","Type":"ContainerDied","Data":"ae8ba7502a6814dfa6ebe4b4581589e13e4de42a7a2f6eb987cbe547da6f614f"} Mar 13 09:58:37 crc kubenswrapper[4841]: I0313 09:58:37.885756 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 09:58:37 crc kubenswrapper[4841]: I0313 09:58:37.920193 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="df3b0fd2-0003-42f1-b746-72231cfad7a0" podUID="aef80b2d-e277-4d2c-8b3a-a80c19f46e36" Mar 13 09:58:38 crc kubenswrapper[4841]: I0313 09:58:38.009405 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df3b0fd2-0003-42f1-b746-72231cfad7a0" path="/var/lib/kubelet/pods/df3b0fd2-0003-42f1-b746-72231cfad7a0/volumes" Mar 13 09:58:39 crc kubenswrapper[4841]: I0313 09:58:39.295334 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-tkl8g" Mar 13 09:58:39 crc kubenswrapper[4841]: I0313 09:58:39.301212 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-7ae0-account-create-update-qtssk" Mar 13 09:58:39 crc kubenswrapper[4841]: I0313 09:58:39.478939 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhdw5\" (UniqueName: \"kubernetes.io/projected/cd7e92b5-60a2-4b89-94c5-97003da1aefe-kube-api-access-fhdw5\") pod \"cd7e92b5-60a2-4b89-94c5-97003da1aefe\" (UID: \"cd7e92b5-60a2-4b89-94c5-97003da1aefe\") " Mar 13 09:58:39 crc kubenswrapper[4841]: I0313 09:58:39.479083 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zslx7\" (UniqueName: \"kubernetes.io/projected/5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f-kube-api-access-zslx7\") pod \"5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f\" (UID: \"5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f\") " Mar 13 09:58:39 crc kubenswrapper[4841]: I0313 09:58:39.479121 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd7e92b5-60a2-4b89-94c5-97003da1aefe-operator-scripts\") pod \"cd7e92b5-60a2-4b89-94c5-97003da1aefe\" (UID: \"cd7e92b5-60a2-4b89-94c5-97003da1aefe\") " Mar 13 09:58:39 crc kubenswrapper[4841]: I0313 09:58:39.479207 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f-operator-scripts\") pod \"5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f\" (UID: \"5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f\") " Mar 13 09:58:39 crc kubenswrapper[4841]: I0313 09:58:39.481146 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd7e92b5-60a2-4b89-94c5-97003da1aefe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cd7e92b5-60a2-4b89-94c5-97003da1aefe" (UID: "cd7e92b5-60a2-4b89-94c5-97003da1aefe"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:58:39 crc kubenswrapper[4841]: I0313 09:58:39.481170 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f" (UID: "5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 09:58:39 crc kubenswrapper[4841]: I0313 09:58:39.489197 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd7e92b5-60a2-4b89-94c5-97003da1aefe-kube-api-access-fhdw5" (OuterVolumeSpecName: "kube-api-access-fhdw5") pod "cd7e92b5-60a2-4b89-94c5-97003da1aefe" (UID: "cd7e92b5-60a2-4b89-94c5-97003da1aefe"). InnerVolumeSpecName "kube-api-access-fhdw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:58:39 crc kubenswrapper[4841]: I0313 09:58:39.491551 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f-kube-api-access-zslx7" (OuterVolumeSpecName: "kube-api-access-zslx7") pod "5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f" (UID: "5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f"). InnerVolumeSpecName "kube-api-access-zslx7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:58:39 crc kubenswrapper[4841]: I0313 09:58:39.584419 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhdw5\" (UniqueName: \"kubernetes.io/projected/cd7e92b5-60a2-4b89-94c5-97003da1aefe-kube-api-access-fhdw5\") on node \"crc\" DevicePath \"\"" Mar 13 09:58:39 crc kubenswrapper[4841]: I0313 09:58:39.584714 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zslx7\" (UniqueName: \"kubernetes.io/projected/5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f-kube-api-access-zslx7\") on node \"crc\" DevicePath \"\"" Mar 13 09:58:39 crc kubenswrapper[4841]: I0313 09:58:39.584874 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd7e92b5-60a2-4b89-94c5-97003da1aefe-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:58:39 crc kubenswrapper[4841]: I0313 09:58:39.585071 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:58:39 crc kubenswrapper[4841]: I0313 09:58:39.903328 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-7ae0-account-create-update-qtssk" event={"ID":"cd7e92b5-60a2-4b89-94c5-97003da1aefe","Type":"ContainerDied","Data":"77f1d2dd1b5c6afe67d9bbc13be37c2ebeaf6fc6424336e97afecc820094c702"} Mar 13 09:58:39 crc kubenswrapper[4841]: I0313 09:58:39.903360 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-7ae0-account-create-update-qtssk" Mar 13 09:58:39 crc kubenswrapper[4841]: I0313 09:58:39.903386 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77f1d2dd1b5c6afe67d9bbc13be37c2ebeaf6fc6424336e97afecc820094c702" Mar 13 09:58:39 crc kubenswrapper[4841]: I0313 09:58:39.905197 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-tkl8g" event={"ID":"5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f","Type":"ContainerDied","Data":"ed36b486be8505d0bffec427c5948f40b728056d368d8a33efb34e06c2f445a5"} Mar 13 09:58:39 crc kubenswrapper[4841]: I0313 09:58:39.905236 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed36b486be8505d0bffec427c5948f40b728056d368d8a33efb34e06c2f445a5" Mar 13 09:58:39 crc kubenswrapper[4841]: I0313 09:58:39.905281 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-tkl8g" Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.261831 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-fpfgj"] Mar 13 09:58:41 crc kubenswrapper[4841]: E0313 09:58:41.262636 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f" containerName="mariadb-database-create" Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.262652 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f" containerName="mariadb-database-create" Mar 13 09:58:41 crc kubenswrapper[4841]: E0313 09:58:41.262676 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd7e92b5-60a2-4b89-94c5-97003da1aefe" containerName="mariadb-account-create-update" Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.262684 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd7e92b5-60a2-4b89-94c5-97003da1aefe" containerName="mariadb-account-create-update" Mar 13 
09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.262855 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f" containerName="mariadb-database-create" Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.262873 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd7e92b5-60a2-4b89-94c5-97003da1aefe" containerName="mariadb-account-create-update" Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.264389 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-fpfgj" Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.267763 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.267776 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.269055 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.272448 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-7tx6j" Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.275034 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-fpfgj"] Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.318309 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-scripts\") pod \"aodh-db-sync-fpfgj\" (UID: \"e882fdd9-3b5a-4835-9a63-239f15ce9ea1\") " pod="openstack/aodh-db-sync-fpfgj" Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.318622 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrxb6\" (UniqueName: 
\"kubernetes.io/projected/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-kube-api-access-rrxb6\") pod \"aodh-db-sync-fpfgj\" (UID: \"e882fdd9-3b5a-4835-9a63-239f15ce9ea1\") " pod="openstack/aodh-db-sync-fpfgj" Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.318863 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-config-data\") pod \"aodh-db-sync-fpfgj\" (UID: \"e882fdd9-3b5a-4835-9a63-239f15ce9ea1\") " pod="openstack/aodh-db-sync-fpfgj" Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.318981 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-combined-ca-bundle\") pod \"aodh-db-sync-fpfgj\" (UID: \"e882fdd9-3b5a-4835-9a63-239f15ce9ea1\") " pod="openstack/aodh-db-sync-fpfgj" Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.420302 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-config-data\") pod \"aodh-db-sync-fpfgj\" (UID: \"e882fdd9-3b5a-4835-9a63-239f15ce9ea1\") " pod="openstack/aodh-db-sync-fpfgj" Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.420373 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-combined-ca-bundle\") pod \"aodh-db-sync-fpfgj\" (UID: \"e882fdd9-3b5a-4835-9a63-239f15ce9ea1\") " pod="openstack/aodh-db-sync-fpfgj" Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.420413 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-scripts\") pod \"aodh-db-sync-fpfgj\" (UID: 
\"e882fdd9-3b5a-4835-9a63-239f15ce9ea1\") " pod="openstack/aodh-db-sync-fpfgj" Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.420445 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrxb6\" (UniqueName: \"kubernetes.io/projected/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-kube-api-access-rrxb6\") pod \"aodh-db-sync-fpfgj\" (UID: \"e882fdd9-3b5a-4835-9a63-239f15ce9ea1\") " pod="openstack/aodh-db-sync-fpfgj" Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.425709 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-scripts\") pod \"aodh-db-sync-fpfgj\" (UID: \"e882fdd9-3b5a-4835-9a63-239f15ce9ea1\") " pod="openstack/aodh-db-sync-fpfgj" Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.426758 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-config-data\") pod \"aodh-db-sync-fpfgj\" (UID: \"e882fdd9-3b5a-4835-9a63-239f15ce9ea1\") " pod="openstack/aodh-db-sync-fpfgj" Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.437572 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-combined-ca-bundle\") pod \"aodh-db-sync-fpfgj\" (UID: \"e882fdd9-3b5a-4835-9a63-239f15ce9ea1\") " pod="openstack/aodh-db-sync-fpfgj" Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.443766 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrxb6\" (UniqueName: \"kubernetes.io/projected/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-kube-api-access-rrxb6\") pod \"aodh-db-sync-fpfgj\" (UID: \"e882fdd9-3b5a-4835-9a63-239f15ce9ea1\") " pod="openstack/aodh-db-sync-fpfgj" Mar 13 09:58:41 crc kubenswrapper[4841]: I0313 09:58:41.582693 4841 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/aodh-db-sync-fpfgj" Mar 13 09:58:42 crc kubenswrapper[4841]: I0313 09:58:42.055405 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-fpfgj"] Mar 13 09:58:42 crc kubenswrapper[4841]: I0313 09:58:42.934006 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-fpfgj" event={"ID":"e882fdd9-3b5a-4835-9a63-239f15ce9ea1","Type":"ContainerStarted","Data":"36e3e4a0aa114cd37c989ca9510722555daad39687e95c4b8bd6f7530b60c993"} Mar 13 09:58:45 crc kubenswrapper[4841]: I0313 09:58:45.969180 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-fpfgj" event={"ID":"e882fdd9-3b5a-4835-9a63-239f15ce9ea1","Type":"ContainerStarted","Data":"c2166581ac140f285937a0942049fbac994c9823d0c6587762c03e29320709a0"} Mar 13 09:58:45 crc kubenswrapper[4841]: I0313 09:58:45.986708 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-fpfgj" podStartSLOduration=1.389255265 podStartE2EDuration="4.986691556s" podCreationTimestamp="2026-03-13 09:58:41 +0000 UTC" firstStartedPulling="2026-03-13 09:58:42.059872173 +0000 UTC m=+2804.789772364" lastFinishedPulling="2026-03-13 09:58:45.657308464 +0000 UTC m=+2808.387208655" observedRunningTime="2026-03-13 09:58:45.985887491 +0000 UTC m=+2808.715787692" watchObservedRunningTime="2026-03-13 09:58:45.986691556 +0000 UTC m=+2808.716591747" Mar 13 09:58:49 crc kubenswrapper[4841]: I0313 09:58:49.012479 4841 generic.go:334] "Generic (PLEG): container finished" podID="e882fdd9-3b5a-4835-9a63-239f15ce9ea1" containerID="c2166581ac140f285937a0942049fbac994c9823d0c6587762c03e29320709a0" exitCode=0 Mar 13 09:58:49 crc kubenswrapper[4841]: I0313 09:58:49.012749 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-fpfgj" 
event={"ID":"e882fdd9-3b5a-4835-9a63-239f15ce9ea1","Type":"ContainerDied","Data":"c2166581ac140f285937a0942049fbac994c9823d0c6587762c03e29320709a0"} Mar 13 09:58:50 crc kubenswrapper[4841]: I0313 09:58:50.339565 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-fpfgj" Mar 13 09:58:50 crc kubenswrapper[4841]: I0313 09:58:50.427150 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-config-data\") pod \"e882fdd9-3b5a-4835-9a63-239f15ce9ea1\" (UID: \"e882fdd9-3b5a-4835-9a63-239f15ce9ea1\") " Mar 13 09:58:50 crc kubenswrapper[4841]: I0313 09:58:50.427242 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-scripts\") pod \"e882fdd9-3b5a-4835-9a63-239f15ce9ea1\" (UID: \"e882fdd9-3b5a-4835-9a63-239f15ce9ea1\") " Mar 13 09:58:50 crc kubenswrapper[4841]: I0313 09:58:50.427320 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrxb6\" (UniqueName: \"kubernetes.io/projected/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-kube-api-access-rrxb6\") pod \"e882fdd9-3b5a-4835-9a63-239f15ce9ea1\" (UID: \"e882fdd9-3b5a-4835-9a63-239f15ce9ea1\") " Mar 13 09:58:50 crc kubenswrapper[4841]: I0313 09:58:50.427538 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-combined-ca-bundle\") pod \"e882fdd9-3b5a-4835-9a63-239f15ce9ea1\" (UID: \"e882fdd9-3b5a-4835-9a63-239f15ce9ea1\") " Mar 13 09:58:50 crc kubenswrapper[4841]: I0313 09:58:50.435488 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-kube-api-access-rrxb6" (OuterVolumeSpecName: 
"kube-api-access-rrxb6") pod "e882fdd9-3b5a-4835-9a63-239f15ce9ea1" (UID: "e882fdd9-3b5a-4835-9a63-239f15ce9ea1"). InnerVolumeSpecName "kube-api-access-rrxb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:58:50 crc kubenswrapper[4841]: I0313 09:58:50.443416 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-scripts" (OuterVolumeSpecName: "scripts") pod "e882fdd9-3b5a-4835-9a63-239f15ce9ea1" (UID: "e882fdd9-3b5a-4835-9a63-239f15ce9ea1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:58:50 crc kubenswrapper[4841]: I0313 09:58:50.454041 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-config-data" (OuterVolumeSpecName: "config-data") pod "e882fdd9-3b5a-4835-9a63-239f15ce9ea1" (UID: "e882fdd9-3b5a-4835-9a63-239f15ce9ea1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:58:50 crc kubenswrapper[4841]: I0313 09:58:50.454624 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e882fdd9-3b5a-4835-9a63-239f15ce9ea1" (UID: "e882fdd9-3b5a-4835-9a63-239f15ce9ea1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:58:50 crc kubenswrapper[4841]: I0313 09:58:50.529549 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:58:50 crc kubenswrapper[4841]: I0313 09:58:50.529581 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:58:50 crc kubenswrapper[4841]: I0313 09:58:50.529590 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrxb6\" (UniqueName: \"kubernetes.io/projected/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-kube-api-access-rrxb6\") on node \"crc\" DevicePath \"\"" Mar 13 09:58:50 crc kubenswrapper[4841]: I0313 09:58:50.529601 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e882fdd9-3b5a-4835-9a63-239f15ce9ea1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:58:51 crc kubenswrapper[4841]: I0313 09:58:51.031467 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-fpfgj" event={"ID":"e882fdd9-3b5a-4835-9a63-239f15ce9ea1","Type":"ContainerDied","Data":"36e3e4a0aa114cd37c989ca9510722555daad39687e95c4b8bd6f7530b60c993"} Mar 13 09:58:51 crc kubenswrapper[4841]: I0313 09:58:51.031491 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-fpfgj" Mar 13 09:58:51 crc kubenswrapper[4841]: I0313 09:58:51.031510 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36e3e4a0aa114cd37c989ca9510722555daad39687e95c4b8bd6f7530b60c993" Mar 13 09:58:55 crc kubenswrapper[4841]: I0313 09:58:55.723884 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 13 09:58:55 crc kubenswrapper[4841]: E0313 09:58:55.725099 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e882fdd9-3b5a-4835-9a63-239f15ce9ea1" containerName="aodh-db-sync" Mar 13 09:58:55 crc kubenswrapper[4841]: I0313 09:58:55.725125 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e882fdd9-3b5a-4835-9a63-239f15ce9ea1" containerName="aodh-db-sync" Mar 13 09:58:55 crc kubenswrapper[4841]: I0313 09:58:55.726602 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e882fdd9-3b5a-4835-9a63-239f15ce9ea1" containerName="aodh-db-sync" Mar 13 09:58:55 crc kubenswrapper[4841]: I0313 09:58:55.730043 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 13 09:58:55 crc kubenswrapper[4841]: I0313 09:58:55.740634 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 13 09:58:55 crc kubenswrapper[4841]: I0313 09:58:55.741236 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-7tx6j" Mar 13 09:58:55 crc kubenswrapper[4841]: I0313 09:58:55.741553 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 13 09:58:55 crc kubenswrapper[4841]: I0313 09:58:55.753247 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 13 09:58:55 crc kubenswrapper[4841]: I0313 09:58:55.838632 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p5fv\" (UniqueName: \"kubernetes.io/projected/699a2f8b-cca8-4679-8794-7aa4e2f23c38-kube-api-access-4p5fv\") pod \"aodh-0\" (UID: \"699a2f8b-cca8-4679-8794-7aa4e2f23c38\") " pod="openstack/aodh-0" Mar 13 09:58:55 crc kubenswrapper[4841]: I0313 09:58:55.838769 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699a2f8b-cca8-4679-8794-7aa4e2f23c38-combined-ca-bundle\") pod \"aodh-0\" (UID: \"699a2f8b-cca8-4679-8794-7aa4e2f23c38\") " pod="openstack/aodh-0" Mar 13 09:58:55 crc kubenswrapper[4841]: I0313 09:58:55.838844 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699a2f8b-cca8-4679-8794-7aa4e2f23c38-config-data\") pod \"aodh-0\" (UID: \"699a2f8b-cca8-4679-8794-7aa4e2f23c38\") " pod="openstack/aodh-0" Mar 13 09:58:55 crc kubenswrapper[4841]: I0313 09:58:55.838884 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/699a2f8b-cca8-4679-8794-7aa4e2f23c38-scripts\") pod \"aodh-0\" (UID: \"699a2f8b-cca8-4679-8794-7aa4e2f23c38\") " pod="openstack/aodh-0" Mar 13 09:58:55 crc kubenswrapper[4841]: I0313 09:58:55.942367 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699a2f8b-cca8-4679-8794-7aa4e2f23c38-config-data\") pod \"aodh-0\" (UID: \"699a2f8b-cca8-4679-8794-7aa4e2f23c38\") " pod="openstack/aodh-0" Mar 13 09:58:55 crc kubenswrapper[4841]: I0313 09:58:55.942440 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/699a2f8b-cca8-4679-8794-7aa4e2f23c38-scripts\") pod \"aodh-0\" (UID: \"699a2f8b-cca8-4679-8794-7aa4e2f23c38\") " pod="openstack/aodh-0" Mar 13 09:58:55 crc kubenswrapper[4841]: I0313 09:58:55.942675 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p5fv\" (UniqueName: \"kubernetes.io/projected/699a2f8b-cca8-4679-8794-7aa4e2f23c38-kube-api-access-4p5fv\") pod \"aodh-0\" (UID: \"699a2f8b-cca8-4679-8794-7aa4e2f23c38\") " pod="openstack/aodh-0" Mar 13 09:58:55 crc kubenswrapper[4841]: I0313 09:58:55.942796 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699a2f8b-cca8-4679-8794-7aa4e2f23c38-combined-ca-bundle\") pod \"aodh-0\" (UID: \"699a2f8b-cca8-4679-8794-7aa4e2f23c38\") " pod="openstack/aodh-0" Mar 13 09:58:55 crc kubenswrapper[4841]: I0313 09:58:55.958196 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699a2f8b-cca8-4679-8794-7aa4e2f23c38-combined-ca-bundle\") pod \"aodh-0\" (UID: \"699a2f8b-cca8-4679-8794-7aa4e2f23c38\") " pod="openstack/aodh-0" Mar 13 09:58:55 crc kubenswrapper[4841]: I0313 09:58:55.961886 4841 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/699a2f8b-cca8-4679-8794-7aa4e2f23c38-scripts\") pod \"aodh-0\" (UID: \"699a2f8b-cca8-4679-8794-7aa4e2f23c38\") " pod="openstack/aodh-0" Mar 13 09:58:55 crc kubenswrapper[4841]: I0313 09:58:55.974253 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699a2f8b-cca8-4679-8794-7aa4e2f23c38-config-data\") pod \"aodh-0\" (UID: \"699a2f8b-cca8-4679-8794-7aa4e2f23c38\") " pod="openstack/aodh-0" Mar 13 09:58:56 crc kubenswrapper[4841]: I0313 09:58:56.002967 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p5fv\" (UniqueName: \"kubernetes.io/projected/699a2f8b-cca8-4679-8794-7aa4e2f23c38-kube-api-access-4p5fv\") pod \"aodh-0\" (UID: \"699a2f8b-cca8-4679-8794-7aa4e2f23c38\") " pod="openstack/aodh-0" Mar 13 09:58:56 crc kubenswrapper[4841]: I0313 09:58:56.050610 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 13 09:58:56 crc kubenswrapper[4841]: I0313 09:58:56.553039 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 13 09:58:57 crc kubenswrapper[4841]: I0313 09:58:57.085850 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"699a2f8b-cca8-4679-8794-7aa4e2f23c38","Type":"ContainerStarted","Data":"beb34a3166be53562efbc56c4f800e42f4904f65bf2ccc28ae1f79052d5643b4"} Mar 13 09:58:58 crc kubenswrapper[4841]: I0313 09:58:58.034057 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:58:58 crc kubenswrapper[4841]: I0313 09:58:58.035106 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="198a2488-dbe2-4045-8346-800c44f750f5" containerName="ceilometer-central-agent" containerID="cri-o://d01afbba145f30b14fe565f2a4840decce17f7a9eccd56ecf11f8e95e9c5ed10" gracePeriod=30 Mar 13 09:58:58 crc 
kubenswrapper[4841]: I0313 09:58:58.035527 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="198a2488-dbe2-4045-8346-800c44f750f5" containerName="proxy-httpd" containerID="cri-o://16649722d35459ff69d48a89727c97d468607dd329e060f36b7739ea916e9ea4" gracePeriod=30 Mar 13 09:58:58 crc kubenswrapper[4841]: I0313 09:58:58.035588 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="198a2488-dbe2-4045-8346-800c44f750f5" containerName="sg-core" containerID="cri-o://2e4097706ffba97714052d8f658e5be9111dce5b385085cca5b96432f508fdeb" gracePeriod=30 Mar 13 09:58:58 crc kubenswrapper[4841]: I0313 09:58:58.035632 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="198a2488-dbe2-4045-8346-800c44f750f5" containerName="ceilometer-notification-agent" containerID="cri-o://55d23c2ce883f55d02551502e6aaa93a91d6c4d5ab1834e34149d797af1e85de" gracePeriod=30 Mar 13 09:58:58 crc kubenswrapper[4841]: I0313 09:58:58.112765 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"699a2f8b-cca8-4679-8794-7aa4e2f23c38","Type":"ContainerStarted","Data":"0b478c5825dcb6fcc7190314eca1c34d4b4aeea11f6989f48b6301e69bab2aa4"} Mar 13 09:58:58 crc kubenswrapper[4841]: I0313 09:58:58.724393 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.125773 4841 generic.go:334] "Generic (PLEG): container finished" podID="198a2488-dbe2-4045-8346-800c44f750f5" containerID="16649722d35459ff69d48a89727c97d468607dd329e060f36b7739ea916e9ea4" exitCode=0 Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.125802 4841 generic.go:334] "Generic (PLEG): container finished" podID="198a2488-dbe2-4045-8346-800c44f750f5" containerID="2e4097706ffba97714052d8f658e5be9111dce5b385085cca5b96432f508fdeb" exitCode=2 Mar 13 09:58:59 crc 
kubenswrapper[4841]: I0313 09:58:59.125811 4841 generic.go:334] "Generic (PLEG): container finished" podID="198a2488-dbe2-4045-8346-800c44f750f5" containerID="55d23c2ce883f55d02551502e6aaa93a91d6c4d5ab1834e34149d797af1e85de" exitCode=0 Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.125818 4841 generic.go:334] "Generic (PLEG): container finished" podID="198a2488-dbe2-4045-8346-800c44f750f5" containerID="d01afbba145f30b14fe565f2a4840decce17f7a9eccd56ecf11f8e95e9c5ed10" exitCode=0 Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.125837 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"198a2488-dbe2-4045-8346-800c44f750f5","Type":"ContainerDied","Data":"16649722d35459ff69d48a89727c97d468607dd329e060f36b7739ea916e9ea4"} Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.125860 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"198a2488-dbe2-4045-8346-800c44f750f5","Type":"ContainerDied","Data":"2e4097706ffba97714052d8f658e5be9111dce5b385085cca5b96432f508fdeb"} Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.125870 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"198a2488-dbe2-4045-8346-800c44f750f5","Type":"ContainerDied","Data":"55d23c2ce883f55d02551502e6aaa93a91d6c4d5ab1834e34149d797af1e85de"} Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.125879 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"198a2488-dbe2-4045-8346-800c44f750f5","Type":"ContainerDied","Data":"d01afbba145f30b14fe565f2a4840decce17f7a9eccd56ecf11f8e95e9c5ed10"} Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.499999 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.626777 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-combined-ca-bundle\") pod \"198a2488-dbe2-4045-8346-800c44f750f5\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.626930 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-ceilometer-tls-certs\") pod \"198a2488-dbe2-4045-8346-800c44f750f5\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.627079 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/198a2488-dbe2-4045-8346-800c44f750f5-run-httpd\") pod \"198a2488-dbe2-4045-8346-800c44f750f5\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.627117 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rj94\" (UniqueName: \"kubernetes.io/projected/198a2488-dbe2-4045-8346-800c44f750f5-kube-api-access-6rj94\") pod \"198a2488-dbe2-4045-8346-800c44f750f5\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.627150 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/198a2488-dbe2-4045-8346-800c44f750f5-log-httpd\") pod \"198a2488-dbe2-4045-8346-800c44f750f5\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.627214 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-config-data\") pod \"198a2488-dbe2-4045-8346-800c44f750f5\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.627247 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-sg-core-conf-yaml\") pod \"198a2488-dbe2-4045-8346-800c44f750f5\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.627346 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-scripts\") pod \"198a2488-dbe2-4045-8346-800c44f750f5\" (UID: \"198a2488-dbe2-4045-8346-800c44f750f5\") " Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.628978 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/198a2488-dbe2-4045-8346-800c44f750f5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "198a2488-dbe2-4045-8346-800c44f750f5" (UID: "198a2488-dbe2-4045-8346-800c44f750f5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.629240 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/198a2488-dbe2-4045-8346-800c44f750f5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "198a2488-dbe2-4045-8346-800c44f750f5" (UID: "198a2488-dbe2-4045-8346-800c44f750f5"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.634485 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/198a2488-dbe2-4045-8346-800c44f750f5-kube-api-access-6rj94" (OuterVolumeSpecName: "kube-api-access-6rj94") pod "198a2488-dbe2-4045-8346-800c44f750f5" (UID: "198a2488-dbe2-4045-8346-800c44f750f5"). InnerVolumeSpecName "kube-api-access-6rj94". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.636449 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-scripts" (OuterVolumeSpecName: "scripts") pod "198a2488-dbe2-4045-8346-800c44f750f5" (UID: "198a2488-dbe2-4045-8346-800c44f750f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.680134 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "198a2488-dbe2-4045-8346-800c44f750f5" (UID: "198a2488-dbe2-4045-8346-800c44f750f5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.707466 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "198a2488-dbe2-4045-8346-800c44f750f5" (UID: "198a2488-dbe2-4045-8346-800c44f750f5"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.731001 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.731043 4841 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.731055 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/198a2488-dbe2-4045-8346-800c44f750f5-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.731066 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rj94\" (UniqueName: \"kubernetes.io/projected/198a2488-dbe2-4045-8346-800c44f750f5-kube-api-access-6rj94\") on node \"crc\" DevicePath \"\"" Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.731077 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/198a2488-dbe2-4045-8346-800c44f750f5-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.731087 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.756719 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "198a2488-dbe2-4045-8346-800c44f750f5" (UID: 
"198a2488-dbe2-4045-8346-800c44f750f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.783577 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-config-data" (OuterVolumeSpecName: "config-data") pod "198a2488-dbe2-4045-8346-800c44f750f5" (UID: "198a2488-dbe2-4045-8346-800c44f750f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.832505 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:58:59 crc kubenswrapper[4841]: I0313 09:58:59.832546 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/198a2488-dbe2-4045-8346-800c44f750f5-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.138159 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"699a2f8b-cca8-4679-8794-7aa4e2f23c38","Type":"ContainerStarted","Data":"39ebd75d971e14fe50973336a61b1ac2b7b4bf9329dc1fb7ee378171dae859fd"} Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.141415 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"198a2488-dbe2-4045-8346-800c44f750f5","Type":"ContainerDied","Data":"5303981305fff448795c4125dc443c685df4d2db12a547d13c0d80e797c10802"} Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.141467 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.141481 4841 scope.go:117] "RemoveContainer" containerID="16649722d35459ff69d48a89727c97d468607dd329e060f36b7739ea916e9ea4" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.185807 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.188497 4841 scope.go:117] "RemoveContainer" containerID="2e4097706ffba97714052d8f658e5be9111dce5b385085cca5b96432f508fdeb" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.204286 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.222243 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:59:00 crc kubenswrapper[4841]: E0313 09:59:00.222735 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="198a2488-dbe2-4045-8346-800c44f750f5" containerName="ceilometer-central-agent" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.222755 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="198a2488-dbe2-4045-8346-800c44f750f5" containerName="ceilometer-central-agent" Mar 13 09:59:00 crc kubenswrapper[4841]: E0313 09:59:00.222788 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="198a2488-dbe2-4045-8346-800c44f750f5" containerName="sg-core" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.222796 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="198a2488-dbe2-4045-8346-800c44f750f5" containerName="sg-core" Mar 13 09:59:00 crc kubenswrapper[4841]: E0313 09:59:00.222809 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="198a2488-dbe2-4045-8346-800c44f750f5" containerName="ceilometer-notification-agent" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.222817 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="198a2488-dbe2-4045-8346-800c44f750f5" containerName="ceilometer-notification-agent" Mar 13 09:59:00 crc kubenswrapper[4841]: E0313 09:59:00.222834 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="198a2488-dbe2-4045-8346-800c44f750f5" containerName="proxy-httpd" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.222841 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="198a2488-dbe2-4045-8346-800c44f750f5" containerName="proxy-httpd" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.223054 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="198a2488-dbe2-4045-8346-800c44f750f5" containerName="sg-core" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.223086 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="198a2488-dbe2-4045-8346-800c44f750f5" containerName="ceilometer-central-agent" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.223110 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="198a2488-dbe2-4045-8346-800c44f750f5" containerName="ceilometer-notification-agent" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.223129 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="198a2488-dbe2-4045-8346-800c44f750f5" containerName="proxy-httpd" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.225155 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.227942 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.228173 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.228378 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.244855 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.245797 4841 scope.go:117] "RemoveContainer" containerID="55d23c2ce883f55d02551502e6aaa93a91d6c4d5ab1834e34149d797af1e85de" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.256376 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:59:00 crc kubenswrapper[4841]: E0313 09:59:00.258196 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-5hlm7 log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-5hlm7 log-httpd run-httpd scripts sg-core-conf-yaml]: context canceled" pod="openstack/ceilometer-0" podUID="958ff077-8beb-4e31-9129-1cb770c47e08" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.289468 4841 scope.go:117] "RemoveContainer" containerID="d01afbba145f30b14fe565f2a4840decce17f7a9eccd56ecf11f8e95e9c5ed10" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.340744 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.340790 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-scripts\") pod \"ceilometer-0\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.340879 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hlm7\" (UniqueName: \"kubernetes.io/projected/958ff077-8beb-4e31-9129-1cb770c47e08-kube-api-access-5hlm7\") pod \"ceilometer-0\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.340927 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/958ff077-8beb-4e31-9129-1cb770c47e08-log-httpd\") pod \"ceilometer-0\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.340964 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.341001 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.341038 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-config-data\") pod \"ceilometer-0\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.341090 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/958ff077-8beb-4e31-9129-1cb770c47e08-run-httpd\") pod \"ceilometer-0\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.443134 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-config-data\") pod \"ceilometer-0\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.443212 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/958ff077-8beb-4e31-9129-1cb770c47e08-run-httpd\") pod \"ceilometer-0\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.443323 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.443345 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-scripts\") pod \"ceilometer-0\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.443383 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hlm7\" (UniqueName: \"kubernetes.io/projected/958ff077-8beb-4e31-9129-1cb770c47e08-kube-api-access-5hlm7\") pod \"ceilometer-0\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.443414 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/958ff077-8beb-4e31-9129-1cb770c47e08-log-httpd\") pod \"ceilometer-0\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.443441 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.443470 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.444367 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/958ff077-8beb-4e31-9129-1cb770c47e08-log-httpd\") pod \"ceilometer-0\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 
09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.444550 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/958ff077-8beb-4e31-9129-1cb770c47e08-run-httpd\") pod \"ceilometer-0\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.449748 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.450130 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.450364 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-scripts\") pod \"ceilometer-0\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.450375 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.453582 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-config-data\") pod \"ceilometer-0\" (UID: 
\"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:00 crc kubenswrapper[4841]: I0313 09:59:00.462390 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hlm7\" (UniqueName: \"kubernetes.io/projected/958ff077-8beb-4e31-9129-1cb770c47e08-kube-api-access-5hlm7\") pod \"ceilometer-0\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " pod="openstack/ceilometer-0" Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.158321 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.161465 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"699a2f8b-cca8-4679-8794-7aa4e2f23c38","Type":"ContainerStarted","Data":"e427227142ab3c76af19b4a0143e5680263d75eea154f07ebae1cae7351c8cf5"} Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.169898 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.257042 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/958ff077-8beb-4e31-9129-1cb770c47e08-run-httpd\") pod \"958ff077-8beb-4e31-9129-1cb770c47e08\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.257358 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/958ff077-8beb-4e31-9129-1cb770c47e08-log-httpd\") pod \"958ff077-8beb-4e31-9129-1cb770c47e08\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.257604 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-ceilometer-tls-certs\") pod \"958ff077-8beb-4e31-9129-1cb770c47e08\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.257516 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/958ff077-8beb-4e31-9129-1cb770c47e08-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "958ff077-8beb-4e31-9129-1cb770c47e08" (UID: "958ff077-8beb-4e31-9129-1cb770c47e08"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.257763 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-config-data\") pod \"958ff077-8beb-4e31-9129-1cb770c47e08\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.257819 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hlm7\" (UniqueName: \"kubernetes.io/projected/958ff077-8beb-4e31-9129-1cb770c47e08-kube-api-access-5hlm7\") pod \"958ff077-8beb-4e31-9129-1cb770c47e08\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.257697 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/958ff077-8beb-4e31-9129-1cb770c47e08-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "958ff077-8beb-4e31-9129-1cb770c47e08" (UID: "958ff077-8beb-4e31-9129-1cb770c47e08"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.257891 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-scripts\") pod \"958ff077-8beb-4e31-9129-1cb770c47e08\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.257963 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-sg-core-conf-yaml\") pod \"958ff077-8beb-4e31-9129-1cb770c47e08\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.257988 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-combined-ca-bundle\") pod \"958ff077-8beb-4e31-9129-1cb770c47e08\" (UID: \"958ff077-8beb-4e31-9129-1cb770c47e08\") " Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.258852 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/958ff077-8beb-4e31-9129-1cb770c47e08-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.258874 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/958ff077-8beb-4e31-9129-1cb770c47e08-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.263118 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "958ff077-8beb-4e31-9129-1cb770c47e08" (UID: "958ff077-8beb-4e31-9129-1cb770c47e08"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.264806 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-config-data" (OuterVolumeSpecName: "config-data") pod "958ff077-8beb-4e31-9129-1cb770c47e08" (UID: "958ff077-8beb-4e31-9129-1cb770c47e08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.266482 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/958ff077-8beb-4e31-9129-1cb770c47e08-kube-api-access-5hlm7" (OuterVolumeSpecName: "kube-api-access-5hlm7") pod "958ff077-8beb-4e31-9129-1cb770c47e08" (UID: "958ff077-8beb-4e31-9129-1cb770c47e08"). InnerVolumeSpecName "kube-api-access-5hlm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.269501 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "958ff077-8beb-4e31-9129-1cb770c47e08" (UID: "958ff077-8beb-4e31-9129-1cb770c47e08"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.270281 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "958ff077-8beb-4e31-9129-1cb770c47e08" (UID: "958ff077-8beb-4e31-9129-1cb770c47e08"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.279325 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-scripts" (OuterVolumeSpecName: "scripts") pod "958ff077-8beb-4e31-9129-1cb770c47e08" (UID: "958ff077-8beb-4e31-9129-1cb770c47e08"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.360317 4841 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.360356 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.360366 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hlm7\" (UniqueName: \"kubernetes.io/projected/958ff077-8beb-4e31-9129-1cb770c47e08-kube-api-access-5hlm7\") on node \"crc\" DevicePath \"\"" Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.360376 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.360386 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 09:59:01 crc kubenswrapper[4841]: I0313 09:59:01.360394 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/958ff077-8beb-4e31-9129-1cb770c47e08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.018983 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="198a2488-dbe2-4045-8346-800c44f750f5" path="/var/lib/kubelet/pods/198a2488-dbe2-4045-8346-800c44f750f5/volumes" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.168164 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.231559 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.244432 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.255287 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.257625 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.260351 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.260701 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.260832 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.268105 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.390609 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/661cdaa4-34e4-47df-9bdb-95d67c012cff-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.390673 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/661cdaa4-34e4-47df-9bdb-95d67c012cff-scripts\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.390745 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/661cdaa4-34e4-47df-9bdb-95d67c012cff-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.390776 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661cdaa4-34e4-47df-9bdb-95d67c012cff-run-httpd\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.390809 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661cdaa4-34e4-47df-9bdb-95d67c012cff-log-httpd\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.390883 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/661cdaa4-34e4-47df-9bdb-95d67c012cff-config-data\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.390925 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrht2\" (UniqueName: \"kubernetes.io/projected/661cdaa4-34e4-47df-9bdb-95d67c012cff-kube-api-access-vrht2\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.390946 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/661cdaa4-34e4-47df-9bdb-95d67c012cff-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.492469 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/661cdaa4-34e4-47df-9bdb-95d67c012cff-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.493109 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/661cdaa4-34e4-47df-9bdb-95d67c012cff-scripts\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.493190 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/661cdaa4-34e4-47df-9bdb-95d67c012cff-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.493229 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661cdaa4-34e4-47df-9bdb-95d67c012cff-run-httpd\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.493285 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661cdaa4-34e4-47df-9bdb-95d67c012cff-log-httpd\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.493375 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/661cdaa4-34e4-47df-9bdb-95d67c012cff-config-data\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.493417 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrht2\" (UniqueName: 
\"kubernetes.io/projected/661cdaa4-34e4-47df-9bdb-95d67c012cff-kube-api-access-vrht2\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.493441 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/661cdaa4-34e4-47df-9bdb-95d67c012cff-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.494873 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661cdaa4-34e4-47df-9bdb-95d67c012cff-run-httpd\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.497141 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661cdaa4-34e4-47df-9bdb-95d67c012cff-log-httpd\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.498295 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/661cdaa4-34e4-47df-9bdb-95d67c012cff-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.513155 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/661cdaa4-34e4-47df-9bdb-95d67c012cff-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 
09:59:02.515094 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/661cdaa4-34e4-47df-9bdb-95d67c012cff-config-data\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.515861 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/661cdaa4-34e4-47df-9bdb-95d67c012cff-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.532910 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/661cdaa4-34e4-47df-9bdb-95d67c012cff-scripts\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.534975 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrht2\" (UniqueName: \"kubernetes.io/projected/661cdaa4-34e4-47df-9bdb-95d67c012cff-kube-api-access-vrht2\") pod \"ceilometer-0\" (UID: \"661cdaa4-34e4-47df-9bdb-95d67c012cff\") " pod="openstack/ceilometer-0" Mar 13 09:59:02 crc kubenswrapper[4841]: I0313 09:59:02.575493 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 09:59:03 crc kubenswrapper[4841]: I0313 09:59:03.113741 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 09:59:03 crc kubenswrapper[4841]: W0313 09:59:03.122618 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod661cdaa4_34e4_47df_9bdb_95d67c012cff.slice/crio-46e1be5c35e1d9a662d312d91a391b2d55698a1267bf05fffee259d6dfff85c8 WatchSource:0}: Error finding container 46e1be5c35e1d9a662d312d91a391b2d55698a1267bf05fffee259d6dfff85c8: Status 404 returned error can't find the container with id 46e1be5c35e1d9a662d312d91a391b2d55698a1267bf05fffee259d6dfff85c8 Mar 13 09:59:03 crc kubenswrapper[4841]: I0313 09:59:03.180937 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661cdaa4-34e4-47df-9bdb-95d67c012cff","Type":"ContainerStarted","Data":"46e1be5c35e1d9a662d312d91a391b2d55698a1267bf05fffee259d6dfff85c8"} Mar 13 09:59:03 crc kubenswrapper[4841]: I0313 09:59:03.183793 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"699a2f8b-cca8-4679-8794-7aa4e2f23c38","Type":"ContainerStarted","Data":"442e9f5248a03987beb872d3b67f4c02496392208690230da8c92d42455990ed"} Mar 13 09:59:03 crc kubenswrapper[4841]: I0313 09:59:03.183962 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="699a2f8b-cca8-4679-8794-7aa4e2f23c38" containerName="aodh-api" containerID="cri-o://0b478c5825dcb6fcc7190314eca1c34d4b4aeea11f6989f48b6301e69bab2aa4" gracePeriod=30 Mar 13 09:59:03 crc kubenswrapper[4841]: I0313 09:59:03.184018 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="699a2f8b-cca8-4679-8794-7aa4e2f23c38" containerName="aodh-listener" containerID="cri-o://442e9f5248a03987beb872d3b67f4c02496392208690230da8c92d42455990ed" 
gracePeriod=30 Mar 13 09:59:03 crc kubenswrapper[4841]: I0313 09:59:03.184132 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="699a2f8b-cca8-4679-8794-7aa4e2f23c38" containerName="aodh-evaluator" containerID="cri-o://39ebd75d971e14fe50973336a61b1ac2b7b4bf9329dc1fb7ee378171dae859fd" gracePeriod=30 Mar 13 09:59:03 crc kubenswrapper[4841]: I0313 09:59:03.184113 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="699a2f8b-cca8-4679-8794-7aa4e2f23c38" containerName="aodh-notifier" containerID="cri-o://e427227142ab3c76af19b4a0143e5680263d75eea154f07ebae1cae7351c8cf5" gracePeriod=30 Mar 13 09:59:03 crc kubenswrapper[4841]: I0313 09:59:03.268161 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.336214049 podStartE2EDuration="8.268134836s" podCreationTimestamp="2026-03-13 09:58:55 +0000 UTC" firstStartedPulling="2026-03-13 09:58:56.554256931 +0000 UTC m=+2819.284157122" lastFinishedPulling="2026-03-13 09:59:02.486177718 +0000 UTC m=+2825.216077909" observedRunningTime="2026-03-13 09:59:03.255769287 +0000 UTC m=+2825.985669478" watchObservedRunningTime="2026-03-13 09:59:03.268134836 +0000 UTC m=+2825.998035027" Mar 13 09:59:04 crc kubenswrapper[4841]: I0313 09:59:04.018430 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="958ff077-8beb-4e31-9129-1cb770c47e08" path="/var/lib/kubelet/pods/958ff077-8beb-4e31-9129-1cb770c47e08/volumes" Mar 13 09:59:04 crc kubenswrapper[4841]: I0313 09:59:04.200166 4841 generic.go:334] "Generic (PLEG): container finished" podID="699a2f8b-cca8-4679-8794-7aa4e2f23c38" containerID="39ebd75d971e14fe50973336a61b1ac2b7b4bf9329dc1fb7ee378171dae859fd" exitCode=0 Mar 13 09:59:04 crc kubenswrapper[4841]: I0313 09:59:04.200221 4841 generic.go:334] "Generic (PLEG): container finished" podID="699a2f8b-cca8-4679-8794-7aa4e2f23c38" 
containerID="0b478c5825dcb6fcc7190314eca1c34d4b4aeea11f6989f48b6301e69bab2aa4" exitCode=0 Mar 13 09:59:04 crc kubenswrapper[4841]: I0313 09:59:04.200245 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"699a2f8b-cca8-4679-8794-7aa4e2f23c38","Type":"ContainerDied","Data":"39ebd75d971e14fe50973336a61b1ac2b7b4bf9329dc1fb7ee378171dae859fd"} Mar 13 09:59:04 crc kubenswrapper[4841]: I0313 09:59:04.200341 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"699a2f8b-cca8-4679-8794-7aa4e2f23c38","Type":"ContainerDied","Data":"0b478c5825dcb6fcc7190314eca1c34d4b4aeea11f6989f48b6301e69bab2aa4"} Mar 13 09:59:04 crc kubenswrapper[4841]: I0313 09:59:04.204141 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661cdaa4-34e4-47df-9bdb-95d67c012cff","Type":"ContainerStarted","Data":"6cf64863d58f817720dba31648eee2729c0acadc668594c3d1b289bd92cf009f"} Mar 13 09:59:04 crc kubenswrapper[4841]: I0313 09:59:04.977518 4841 scope.go:117] "RemoveContainer" containerID="a397331b38775f9f7cc6f64c76ff17ce1493bf6541c58d088ba28693bfc2ee8b" Mar 13 09:59:05 crc kubenswrapper[4841]: I0313 09:59:05.021963 4841 scope.go:117] "RemoveContainer" containerID="21f80dc2ef13fcf4a80b162545cceef4071732964b2ce78b2aa957765d00dd46" Mar 13 09:59:05 crc kubenswrapper[4841]: I0313 09:59:05.221083 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661cdaa4-34e4-47df-9bdb-95d67c012cff","Type":"ContainerStarted","Data":"2bb3eb4006b5d8229f05426c429d9e4872c2a2848c7524645469dd876713700a"} Mar 13 09:59:06 crc kubenswrapper[4841]: I0313 09:59:06.231877 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661cdaa4-34e4-47df-9bdb-95d67c012cff","Type":"ContainerStarted","Data":"84608ab55feb813bcdf13437e22e8e6c250a4fe7447dbc33ea352f7fecd41be1"} Mar 13 09:59:08 crc kubenswrapper[4841]: I0313 09:59:08.260794 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661cdaa4-34e4-47df-9bdb-95d67c012cff","Type":"ContainerStarted","Data":"9fc77bd9d1d058726562361e27b29c4a997125f0b3d6065b9c0baa0c1733f239"} Mar 13 09:59:08 crc kubenswrapper[4841]: I0313 09:59:08.262958 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 09:59:08 crc kubenswrapper[4841]: I0313 09:59:08.288858 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8486577149999999 podStartE2EDuration="6.28883384s" podCreationTimestamp="2026-03-13 09:59:02 +0000 UTC" firstStartedPulling="2026-03-13 09:59:03.124880642 +0000 UTC m=+2825.854780833" lastFinishedPulling="2026-03-13 09:59:07.565056767 +0000 UTC m=+2830.294956958" observedRunningTime="2026-03-13 09:59:08.282633796 +0000 UTC m=+2831.012533987" watchObservedRunningTime="2026-03-13 09:59:08.28883384 +0000 UTC m=+2831.018734031" Mar 13 09:59:32 crc kubenswrapper[4841]: I0313 09:59:32.588457 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 13 09:59:33 crc kubenswrapper[4841]: I0313 09:59:33.512391 4841 generic.go:334] "Generic (PLEG): container finished" podID="699a2f8b-cca8-4679-8794-7aa4e2f23c38" containerID="442e9f5248a03987beb872d3b67f4c02496392208690230da8c92d42455990ed" exitCode=137 Mar 13 09:59:33 crc kubenswrapper[4841]: I0313 09:59:33.512627 4841 generic.go:334] "Generic (PLEG): container finished" podID="699a2f8b-cca8-4679-8794-7aa4e2f23c38" containerID="e427227142ab3c76af19b4a0143e5680263d75eea154f07ebae1cae7351c8cf5" exitCode=137 Mar 13 09:59:33 crc kubenswrapper[4841]: I0313 09:59:33.512661 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"699a2f8b-cca8-4679-8794-7aa4e2f23c38","Type":"ContainerDied","Data":"442e9f5248a03987beb872d3b67f4c02496392208690230da8c92d42455990ed"} Mar 13 09:59:33 crc 
kubenswrapper[4841]: I0313 09:59:33.512686 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"699a2f8b-cca8-4679-8794-7aa4e2f23c38","Type":"ContainerDied","Data":"e427227142ab3c76af19b4a0143e5680263d75eea154f07ebae1cae7351c8cf5"} Mar 13 09:59:33 crc kubenswrapper[4841]: I0313 09:59:33.710795 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 13 09:59:33 crc kubenswrapper[4841]: I0313 09:59:33.830743 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p5fv\" (UniqueName: \"kubernetes.io/projected/699a2f8b-cca8-4679-8794-7aa4e2f23c38-kube-api-access-4p5fv\") pod \"699a2f8b-cca8-4679-8794-7aa4e2f23c38\" (UID: \"699a2f8b-cca8-4679-8794-7aa4e2f23c38\") " Mar 13 09:59:33 crc kubenswrapper[4841]: I0313 09:59:33.831127 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699a2f8b-cca8-4679-8794-7aa4e2f23c38-combined-ca-bundle\") pod \"699a2f8b-cca8-4679-8794-7aa4e2f23c38\" (UID: \"699a2f8b-cca8-4679-8794-7aa4e2f23c38\") " Mar 13 09:59:33 crc kubenswrapper[4841]: I0313 09:59:33.831223 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699a2f8b-cca8-4679-8794-7aa4e2f23c38-config-data\") pod \"699a2f8b-cca8-4679-8794-7aa4e2f23c38\" (UID: \"699a2f8b-cca8-4679-8794-7aa4e2f23c38\") " Mar 13 09:59:33 crc kubenswrapper[4841]: I0313 09:59:33.831393 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/699a2f8b-cca8-4679-8794-7aa4e2f23c38-scripts\") pod \"699a2f8b-cca8-4679-8794-7aa4e2f23c38\" (UID: \"699a2f8b-cca8-4679-8794-7aa4e2f23c38\") " Mar 13 09:59:33 crc kubenswrapper[4841]: I0313 09:59:33.838722 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/699a2f8b-cca8-4679-8794-7aa4e2f23c38-kube-api-access-4p5fv" (OuterVolumeSpecName: "kube-api-access-4p5fv") pod "699a2f8b-cca8-4679-8794-7aa4e2f23c38" (UID: "699a2f8b-cca8-4679-8794-7aa4e2f23c38"). InnerVolumeSpecName "kube-api-access-4p5fv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 09:59:33 crc kubenswrapper[4841]: I0313 09:59:33.881620 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/699a2f8b-cca8-4679-8794-7aa4e2f23c38-scripts" (OuterVolumeSpecName: "scripts") pod "699a2f8b-cca8-4679-8794-7aa4e2f23c38" (UID: "699a2f8b-cca8-4679-8794-7aa4e2f23c38"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:59:33 crc kubenswrapper[4841]: I0313 09:59:33.933508 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p5fv\" (UniqueName: \"kubernetes.io/projected/699a2f8b-cca8-4679-8794-7aa4e2f23c38-kube-api-access-4p5fv\") on node \"crc\" DevicePath \"\"" Mar 13 09:59:33 crc kubenswrapper[4841]: I0313 09:59:33.933537 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/699a2f8b-cca8-4679-8794-7aa4e2f23c38-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 09:59:33 crc kubenswrapper[4841]: I0313 09:59:33.961705 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/699a2f8b-cca8-4679-8794-7aa4e2f23c38-config-data" (OuterVolumeSpecName: "config-data") pod "699a2f8b-cca8-4679-8794-7aa4e2f23c38" (UID: "699a2f8b-cca8-4679-8794-7aa4e2f23c38"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:59:33 crc kubenswrapper[4841]: I0313 09:59:33.971552 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/699a2f8b-cca8-4679-8794-7aa4e2f23c38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "699a2f8b-cca8-4679-8794-7aa4e2f23c38" (UID: "699a2f8b-cca8-4679-8794-7aa4e2f23c38"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.035632 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699a2f8b-cca8-4679-8794-7aa4e2f23c38-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.035672 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699a2f8b-cca8-4679-8794-7aa4e2f23c38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.523154 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"699a2f8b-cca8-4679-8794-7aa4e2f23c38","Type":"ContainerDied","Data":"beb34a3166be53562efbc56c4f800e42f4904f65bf2ccc28ae1f79052d5643b4"} Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.523217 4841 scope.go:117] "RemoveContainer" containerID="442e9f5248a03987beb872d3b67f4c02496392208690230da8c92d42455990ed" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.523237 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.545330 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.552813 4841 scope.go:117] "RemoveContainer" containerID="e427227142ab3c76af19b4a0143e5680263d75eea154f07ebae1cae7351c8cf5" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.552937 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.572861 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 13 09:59:34 crc kubenswrapper[4841]: E0313 09:59:34.573541 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699a2f8b-cca8-4679-8794-7aa4e2f23c38" containerName="aodh-listener" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.573560 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="699a2f8b-cca8-4679-8794-7aa4e2f23c38" containerName="aodh-listener" Mar 13 09:59:34 crc kubenswrapper[4841]: E0313 09:59:34.573577 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699a2f8b-cca8-4679-8794-7aa4e2f23c38" containerName="aodh-notifier" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.573586 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="699a2f8b-cca8-4679-8794-7aa4e2f23c38" containerName="aodh-notifier" Mar 13 09:59:34 crc kubenswrapper[4841]: E0313 09:59:34.573601 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699a2f8b-cca8-4679-8794-7aa4e2f23c38" containerName="aodh-api" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.573607 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="699a2f8b-cca8-4679-8794-7aa4e2f23c38" containerName="aodh-api" Mar 13 09:59:34 crc kubenswrapper[4841]: E0313 09:59:34.573623 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699a2f8b-cca8-4679-8794-7aa4e2f23c38" 
containerName="aodh-evaluator" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.573631 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="699a2f8b-cca8-4679-8794-7aa4e2f23c38" containerName="aodh-evaluator" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.573790 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="699a2f8b-cca8-4679-8794-7aa4e2f23c38" containerName="aodh-evaluator" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.573802 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="699a2f8b-cca8-4679-8794-7aa4e2f23c38" containerName="aodh-notifier" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.573813 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="699a2f8b-cca8-4679-8794-7aa4e2f23c38" containerName="aodh-listener" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.573833 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="699a2f8b-cca8-4679-8794-7aa4e2f23c38" containerName="aodh-api" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.575733 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.577953 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.578608 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-7tx6j" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.579236 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.579478 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.583681 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.586032 4841 scope.go:117] "RemoveContainer" containerID="39ebd75d971e14fe50973336a61b1ac2b7b4bf9329dc1fb7ee378171dae859fd" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.586610 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.622717 4841 scope.go:117] "RemoveContainer" containerID="0b478c5825dcb6fcc7190314eca1c34d4b4aeea11f6989f48b6301e69bab2aa4" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.748000 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-scripts\") pod \"aodh-0\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " pod="openstack/aodh-0" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.748126 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqzxd\" (UniqueName: 
\"kubernetes.io/projected/c4e8e7c7-61ec-4f17-aa35-66780c71735a-kube-api-access-vqzxd\") pod \"aodh-0\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " pod="openstack/aodh-0" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.748174 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-internal-tls-certs\") pod \"aodh-0\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " pod="openstack/aodh-0" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.748237 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " pod="openstack/aodh-0" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.748285 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-config-data\") pod \"aodh-0\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " pod="openstack/aodh-0" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.748337 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-public-tls-certs\") pod \"aodh-0\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " pod="openstack/aodh-0" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.849491 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-scripts\") pod \"aodh-0\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " pod="openstack/aodh-0" Mar 13 09:59:34 crc 
kubenswrapper[4841]: I0313 09:59:34.849540 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqzxd\" (UniqueName: \"kubernetes.io/projected/c4e8e7c7-61ec-4f17-aa35-66780c71735a-kube-api-access-vqzxd\") pod \"aodh-0\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " pod="openstack/aodh-0" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.849566 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-internal-tls-certs\") pod \"aodh-0\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " pod="openstack/aodh-0" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.849606 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " pod="openstack/aodh-0" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.849627 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-config-data\") pod \"aodh-0\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " pod="openstack/aodh-0" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.849660 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-public-tls-certs\") pod \"aodh-0\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " pod="openstack/aodh-0" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.853161 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-internal-tls-certs\") pod \"aodh-0\" (UID: 
\"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " pod="openstack/aodh-0" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.853381 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-scripts\") pod \"aodh-0\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " pod="openstack/aodh-0" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.853890 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " pod="openstack/aodh-0" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.857818 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-public-tls-certs\") pod \"aodh-0\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " pod="openstack/aodh-0" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.858612 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-config-data\") pod \"aodh-0\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " pod="openstack/aodh-0" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.864989 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqzxd\" (UniqueName: \"kubernetes.io/projected/c4e8e7c7-61ec-4f17-aa35-66780c71735a-kube-api-access-vqzxd\") pod \"aodh-0\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " pod="openstack/aodh-0" Mar 13 09:59:34 crc kubenswrapper[4841]: I0313 09:59:34.901081 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 13 09:59:35 crc kubenswrapper[4841]: I0313 09:59:35.357669 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 13 09:59:35 crc kubenswrapper[4841]: I0313 09:59:35.533677 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c4e8e7c7-61ec-4f17-aa35-66780c71735a","Type":"ContainerStarted","Data":"6cf0270f2ff8e6ced4f65ed944ebcda21ed221dcc4a4c284f4cd1ec27901f13c"} Mar 13 09:59:36 crc kubenswrapper[4841]: I0313 09:59:36.005924 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="699a2f8b-cca8-4679-8794-7aa4e2f23c38" path="/var/lib/kubelet/pods/699a2f8b-cca8-4679-8794-7aa4e2f23c38/volumes" Mar 13 09:59:36 crc kubenswrapper[4841]: I0313 09:59:36.543965 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c4e8e7c7-61ec-4f17-aa35-66780c71735a","Type":"ContainerStarted","Data":"73835459515e92b6fecc0b2b0b6798f41fedf7faff0780f2db183d4b540ebff6"} Mar 13 09:59:37 crc kubenswrapper[4841]: I0313 09:59:37.556543 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c4e8e7c7-61ec-4f17-aa35-66780c71735a","Type":"ContainerStarted","Data":"c5f39dd7085c62ebf12a9ffc76f9361cec35b3449ce3d5620b567c775aaeebe1"} Mar 13 09:59:37 crc kubenswrapper[4841]: I0313 09:59:37.556869 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c4e8e7c7-61ec-4f17-aa35-66780c71735a","Type":"ContainerStarted","Data":"01148df9a621eb905ab6c92e2b7827e6590aa358e16603c785656acaaee5c402"} Mar 13 09:59:38 crc kubenswrapper[4841]: I0313 09:59:38.567072 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c4e8e7c7-61ec-4f17-aa35-66780c71735a","Type":"ContainerStarted","Data":"d1b17ba81a867d2473cf0859557d2418bfbc4051fa4f1d552488545d4fb5eef7"} Mar 13 09:59:38 crc kubenswrapper[4841]: I0313 09:59:38.588996 4841 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.103031091 podStartE2EDuration="4.588973708s" podCreationTimestamp="2026-03-13 09:59:34 +0000 UTC" firstStartedPulling="2026-03-13 09:59:35.36113328 +0000 UTC m=+2858.091033471" lastFinishedPulling="2026-03-13 09:59:37.847075897 +0000 UTC m=+2860.576976088" observedRunningTime="2026-03-13 09:59:38.583230848 +0000 UTC m=+2861.313131039" watchObservedRunningTime="2026-03-13 09:59:38.588973708 +0000 UTC m=+2861.318873899" Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.163041 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556600-mjvph"] Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.173719 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556600-mjvph" Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.176099 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.176157 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.177584 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.179911 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556600-mjvph"] Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.249574 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556600-vs4mk"] Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.250887 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556600-vs4mk" Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.252910 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.252911 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.302087 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556600-vs4mk"] Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.354226 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42df74eb-bc5c-4ce2-b6ed-fe2297b57e94-secret-volume\") pod \"collect-profiles-29556600-vs4mk\" (UID: \"42df74eb-bc5c-4ce2-b6ed-fe2297b57e94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556600-vs4mk" Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.354316 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb79d\" (UniqueName: \"kubernetes.io/projected/42df74eb-bc5c-4ce2-b6ed-fe2297b57e94-kube-api-access-lb79d\") pod \"collect-profiles-29556600-vs4mk\" (UID: \"42df74eb-bc5c-4ce2-b6ed-fe2297b57e94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556600-vs4mk" Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.354370 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bfng\" (UniqueName: \"kubernetes.io/projected/dff89fe5-b7c2-4581-8840-c12adc8826bd-kube-api-access-9bfng\") pod \"auto-csr-approver-29556600-mjvph\" (UID: \"dff89fe5-b7c2-4581-8840-c12adc8826bd\") " 
pod="openshift-infra/auto-csr-approver-29556600-mjvph" Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.354754 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42df74eb-bc5c-4ce2-b6ed-fe2297b57e94-config-volume\") pod \"collect-profiles-29556600-vs4mk\" (UID: \"42df74eb-bc5c-4ce2-b6ed-fe2297b57e94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556600-vs4mk" Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.456992 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42df74eb-bc5c-4ce2-b6ed-fe2297b57e94-config-volume\") pod \"collect-profiles-29556600-vs4mk\" (UID: \"42df74eb-bc5c-4ce2-b6ed-fe2297b57e94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556600-vs4mk" Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.457075 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42df74eb-bc5c-4ce2-b6ed-fe2297b57e94-secret-volume\") pod \"collect-profiles-29556600-vs4mk\" (UID: \"42df74eb-bc5c-4ce2-b6ed-fe2297b57e94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556600-vs4mk" Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.457115 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb79d\" (UniqueName: \"kubernetes.io/projected/42df74eb-bc5c-4ce2-b6ed-fe2297b57e94-kube-api-access-lb79d\") pod \"collect-profiles-29556600-vs4mk\" (UID: \"42df74eb-bc5c-4ce2-b6ed-fe2297b57e94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556600-vs4mk" Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.457178 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bfng\" (UniqueName: 
\"kubernetes.io/projected/dff89fe5-b7c2-4581-8840-c12adc8826bd-kube-api-access-9bfng\") pod \"auto-csr-approver-29556600-mjvph\" (UID: \"dff89fe5-b7c2-4581-8840-c12adc8826bd\") " pod="openshift-infra/auto-csr-approver-29556600-mjvph" Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.458071 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42df74eb-bc5c-4ce2-b6ed-fe2297b57e94-config-volume\") pod \"collect-profiles-29556600-vs4mk\" (UID: \"42df74eb-bc5c-4ce2-b6ed-fe2297b57e94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556600-vs4mk" Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.464018 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42df74eb-bc5c-4ce2-b6ed-fe2297b57e94-secret-volume\") pod \"collect-profiles-29556600-vs4mk\" (UID: \"42df74eb-bc5c-4ce2-b6ed-fe2297b57e94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556600-vs4mk" Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.475968 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bfng\" (UniqueName: \"kubernetes.io/projected/dff89fe5-b7c2-4581-8840-c12adc8826bd-kube-api-access-9bfng\") pod \"auto-csr-approver-29556600-mjvph\" (UID: \"dff89fe5-b7c2-4581-8840-c12adc8826bd\") " pod="openshift-infra/auto-csr-approver-29556600-mjvph" Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.478078 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb79d\" (UniqueName: \"kubernetes.io/projected/42df74eb-bc5c-4ce2-b6ed-fe2297b57e94-kube-api-access-lb79d\") pod \"collect-profiles-29556600-vs4mk\" (UID: \"42df74eb-bc5c-4ce2-b6ed-fe2297b57e94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556600-vs4mk" Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.534964 4841 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556600-mjvph" Mar 13 10:00:00 crc kubenswrapper[4841]: I0313 10:00:00.571095 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556600-vs4mk" Mar 13 10:00:01 crc kubenswrapper[4841]: I0313 10:00:01.052808 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556600-vs4mk"] Mar 13 10:00:01 crc kubenswrapper[4841]: W0313 10:00:01.056852 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42df74eb_bc5c_4ce2_b6ed_fe2297b57e94.slice/crio-c8d1374dd9789f07f0fd4df78bae04b10cbdab880e9ac38458a3652a281e5588 WatchSource:0}: Error finding container c8d1374dd9789f07f0fd4df78bae04b10cbdab880e9ac38458a3652a281e5588: Status 404 returned error can't find the container with id c8d1374dd9789f07f0fd4df78bae04b10cbdab880e9ac38458a3652a281e5588 Mar 13 10:00:01 crc kubenswrapper[4841]: I0313 10:00:01.162831 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556600-mjvph"] Mar 13 10:00:01 crc kubenswrapper[4841]: W0313 10:00:01.164936 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddff89fe5_b7c2_4581_8840_c12adc8826bd.slice/crio-ff5a2946f57e5dd4d223f2560362e3c455bf66ddb91b52d486cd8de76960f7d4 WatchSource:0}: Error finding container ff5a2946f57e5dd4d223f2560362e3c455bf66ddb91b52d486cd8de76960f7d4: Status 404 returned error can't find the container with id ff5a2946f57e5dd4d223f2560362e3c455bf66ddb91b52d486cd8de76960f7d4 Mar 13 10:00:01 crc kubenswrapper[4841]: I0313 10:00:01.832950 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556600-mjvph" 
event={"ID":"dff89fe5-b7c2-4581-8840-c12adc8826bd","Type":"ContainerStarted","Data":"ff5a2946f57e5dd4d223f2560362e3c455bf66ddb91b52d486cd8de76960f7d4"} Mar 13 10:00:01 crc kubenswrapper[4841]: I0313 10:00:01.834874 4841 generic.go:334] "Generic (PLEG): container finished" podID="42df74eb-bc5c-4ce2-b6ed-fe2297b57e94" containerID="a341597d6cd89f3cac5b740f16383d98a68101318cb66824598cc1ddf417a85b" exitCode=0 Mar 13 10:00:01 crc kubenswrapper[4841]: I0313 10:00:01.834934 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556600-vs4mk" event={"ID":"42df74eb-bc5c-4ce2-b6ed-fe2297b57e94","Type":"ContainerDied","Data":"a341597d6cd89f3cac5b740f16383d98a68101318cb66824598cc1ddf417a85b"} Mar 13 10:00:01 crc kubenswrapper[4841]: I0313 10:00:01.834973 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556600-vs4mk" event={"ID":"42df74eb-bc5c-4ce2-b6ed-fe2297b57e94","Type":"ContainerStarted","Data":"c8d1374dd9789f07f0fd4df78bae04b10cbdab880e9ac38458a3652a281e5588"} Mar 13 10:00:03 crc kubenswrapper[4841]: I0313 10:00:03.201446 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556600-vs4mk" Mar 13 10:00:03 crc kubenswrapper[4841]: I0313 10:00:03.350621 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42df74eb-bc5c-4ce2-b6ed-fe2297b57e94-secret-volume\") pod \"42df74eb-bc5c-4ce2-b6ed-fe2297b57e94\" (UID: \"42df74eb-bc5c-4ce2-b6ed-fe2297b57e94\") " Mar 13 10:00:03 crc kubenswrapper[4841]: I0313 10:00:03.350849 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb79d\" (UniqueName: \"kubernetes.io/projected/42df74eb-bc5c-4ce2-b6ed-fe2297b57e94-kube-api-access-lb79d\") pod \"42df74eb-bc5c-4ce2-b6ed-fe2297b57e94\" (UID: \"42df74eb-bc5c-4ce2-b6ed-fe2297b57e94\") " Mar 13 10:00:03 crc kubenswrapper[4841]: I0313 10:00:03.350945 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42df74eb-bc5c-4ce2-b6ed-fe2297b57e94-config-volume\") pod \"42df74eb-bc5c-4ce2-b6ed-fe2297b57e94\" (UID: \"42df74eb-bc5c-4ce2-b6ed-fe2297b57e94\") " Mar 13 10:00:03 crc kubenswrapper[4841]: I0313 10:00:03.351774 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42df74eb-bc5c-4ce2-b6ed-fe2297b57e94-config-volume" (OuterVolumeSpecName: "config-volume") pod "42df74eb-bc5c-4ce2-b6ed-fe2297b57e94" (UID: "42df74eb-bc5c-4ce2-b6ed-fe2297b57e94"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:00:03 crc kubenswrapper[4841]: I0313 10:00:03.357439 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42df74eb-bc5c-4ce2-b6ed-fe2297b57e94-kube-api-access-lb79d" (OuterVolumeSpecName: "kube-api-access-lb79d") pod "42df74eb-bc5c-4ce2-b6ed-fe2297b57e94" (UID: "42df74eb-bc5c-4ce2-b6ed-fe2297b57e94"). 
InnerVolumeSpecName "kube-api-access-lb79d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:00:03 crc kubenswrapper[4841]: I0313 10:00:03.358482 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42df74eb-bc5c-4ce2-b6ed-fe2297b57e94-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "42df74eb-bc5c-4ce2-b6ed-fe2297b57e94" (UID: "42df74eb-bc5c-4ce2-b6ed-fe2297b57e94"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:00:03 crc kubenswrapper[4841]: I0313 10:00:03.452993 4841 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42df74eb-bc5c-4ce2-b6ed-fe2297b57e94-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 10:00:03 crc kubenswrapper[4841]: I0313 10:00:03.453283 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb79d\" (UniqueName: \"kubernetes.io/projected/42df74eb-bc5c-4ce2-b6ed-fe2297b57e94-kube-api-access-lb79d\") on node \"crc\" DevicePath \"\"" Mar 13 10:00:03 crc kubenswrapper[4841]: I0313 10:00:03.453302 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42df74eb-bc5c-4ce2-b6ed-fe2297b57e94-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 10:00:03 crc kubenswrapper[4841]: I0313 10:00:03.857692 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556600-vs4mk" event={"ID":"42df74eb-bc5c-4ce2-b6ed-fe2297b57e94","Type":"ContainerDied","Data":"c8d1374dd9789f07f0fd4df78bae04b10cbdab880e9ac38458a3652a281e5588"} Mar 13 10:00:03 crc kubenswrapper[4841]: I0313 10:00:03.857735 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8d1374dd9789f07f0fd4df78bae04b10cbdab880e9ac38458a3652a281e5588" Mar 13 10:00:03 crc kubenswrapper[4841]: I0313 10:00:03.857819 4841 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556600-vs4mk" Mar 13 10:00:04 crc kubenswrapper[4841]: I0313 10:00:04.290717 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4"] Mar 13 10:00:04 crc kubenswrapper[4841]: I0313 10:00:04.299146 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556555-zbvz4"] Mar 13 10:00:04 crc kubenswrapper[4841]: I0313 10:00:04.866092 4841 generic.go:334] "Generic (PLEG): container finished" podID="dff89fe5-b7c2-4581-8840-c12adc8826bd" containerID="058d372d8e7682703b4bfb70c0ddb88de134c1d740462ca826ab1f9e6fbbc1ef" exitCode=0 Mar 13 10:00:04 crc kubenswrapper[4841]: I0313 10:00:04.866148 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556600-mjvph" event={"ID":"dff89fe5-b7c2-4581-8840-c12adc8826bd","Type":"ContainerDied","Data":"058d372d8e7682703b4bfb70c0ddb88de134c1d740462ca826ab1f9e6fbbc1ef"} Mar 13 10:00:06 crc kubenswrapper[4841]: I0313 10:00:06.019215 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1745fdf3-fbb9-4736-a4da-b534d8c208bd" path="/var/lib/kubelet/pods/1745fdf3-fbb9-4736-a4da-b534d8c208bd/volumes" Mar 13 10:00:06 crc kubenswrapper[4841]: I0313 10:00:06.192520 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556600-mjvph" Mar 13 10:00:06 crc kubenswrapper[4841]: I0313 10:00:06.307735 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bfng\" (UniqueName: \"kubernetes.io/projected/dff89fe5-b7c2-4581-8840-c12adc8826bd-kube-api-access-9bfng\") pod \"dff89fe5-b7c2-4581-8840-c12adc8826bd\" (UID: \"dff89fe5-b7c2-4581-8840-c12adc8826bd\") " Mar 13 10:00:06 crc kubenswrapper[4841]: I0313 10:00:06.314875 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dff89fe5-b7c2-4581-8840-c12adc8826bd-kube-api-access-9bfng" (OuterVolumeSpecName: "kube-api-access-9bfng") pod "dff89fe5-b7c2-4581-8840-c12adc8826bd" (UID: "dff89fe5-b7c2-4581-8840-c12adc8826bd"). InnerVolumeSpecName "kube-api-access-9bfng". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:00:06 crc kubenswrapper[4841]: I0313 10:00:06.410212 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bfng\" (UniqueName: \"kubernetes.io/projected/dff89fe5-b7c2-4581-8840-c12adc8826bd-kube-api-access-9bfng\") on node \"crc\" DevicePath \"\"" Mar 13 10:00:06 crc kubenswrapper[4841]: I0313 10:00:06.883030 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556600-mjvph" event={"ID":"dff89fe5-b7c2-4581-8840-c12adc8826bd","Type":"ContainerDied","Data":"ff5a2946f57e5dd4d223f2560362e3c455bf66ddb91b52d486cd8de76960f7d4"} Mar 13 10:00:06 crc kubenswrapper[4841]: I0313 10:00:06.883071 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff5a2946f57e5dd4d223f2560362e3c455bf66ddb91b52d486cd8de76960f7d4" Mar 13 10:00:06 crc kubenswrapper[4841]: I0313 10:00:06.883140 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556600-mjvph" Mar 13 10:00:07 crc kubenswrapper[4841]: I0313 10:00:07.253724 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556594-zbfhh"] Mar 13 10:00:07 crc kubenswrapper[4841]: I0313 10:00:07.262441 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556594-zbfhh"] Mar 13 10:00:08 crc kubenswrapper[4841]: I0313 10:00:08.003346 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbcc33bb-e899-4aee-8a7f-2f97beb83543" path="/var/lib/kubelet/pods/fbcc33bb-e899-4aee-8a7f-2f97beb83543/volumes" Mar 13 10:00:33 crc kubenswrapper[4841]: I0313 10:00:33.336814 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fd4cr"] Mar 13 10:00:33 crc kubenswrapper[4841]: E0313 10:00:33.337900 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff89fe5-b7c2-4581-8840-c12adc8826bd" containerName="oc" Mar 13 10:00:33 crc kubenswrapper[4841]: I0313 10:00:33.337913 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff89fe5-b7c2-4581-8840-c12adc8826bd" containerName="oc" Mar 13 10:00:33 crc kubenswrapper[4841]: E0313 10:00:33.337939 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42df74eb-bc5c-4ce2-b6ed-fe2297b57e94" containerName="collect-profiles" Mar 13 10:00:33 crc kubenswrapper[4841]: I0313 10:00:33.337945 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="42df74eb-bc5c-4ce2-b6ed-fe2297b57e94" containerName="collect-profiles" Mar 13 10:00:33 crc kubenswrapper[4841]: I0313 10:00:33.338125 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="42df74eb-bc5c-4ce2-b6ed-fe2297b57e94" containerName="collect-profiles" Mar 13 10:00:33 crc kubenswrapper[4841]: I0313 10:00:33.338143 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="dff89fe5-b7c2-4581-8840-c12adc8826bd" 
containerName="oc" Mar 13 10:00:33 crc kubenswrapper[4841]: I0313 10:00:33.339589 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fd4cr" Mar 13 10:00:33 crc kubenswrapper[4841]: I0313 10:00:33.366535 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fd4cr"] Mar 13 10:00:33 crc kubenswrapper[4841]: I0313 10:00:33.428776 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3996dfde-0955-4341-84a1-fa0515c99a0e-utilities\") pod \"redhat-marketplace-fd4cr\" (UID: \"3996dfde-0955-4341-84a1-fa0515c99a0e\") " pod="openshift-marketplace/redhat-marketplace-fd4cr" Mar 13 10:00:33 crc kubenswrapper[4841]: I0313 10:00:33.428845 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3996dfde-0955-4341-84a1-fa0515c99a0e-catalog-content\") pod \"redhat-marketplace-fd4cr\" (UID: \"3996dfde-0955-4341-84a1-fa0515c99a0e\") " pod="openshift-marketplace/redhat-marketplace-fd4cr" Mar 13 10:00:33 crc kubenswrapper[4841]: I0313 10:00:33.429437 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtqlx\" (UniqueName: \"kubernetes.io/projected/3996dfde-0955-4341-84a1-fa0515c99a0e-kube-api-access-qtqlx\") pod \"redhat-marketplace-fd4cr\" (UID: \"3996dfde-0955-4341-84a1-fa0515c99a0e\") " pod="openshift-marketplace/redhat-marketplace-fd4cr" Mar 13 10:00:33 crc kubenswrapper[4841]: I0313 10:00:33.531190 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3996dfde-0955-4341-84a1-fa0515c99a0e-utilities\") pod \"redhat-marketplace-fd4cr\" (UID: \"3996dfde-0955-4341-84a1-fa0515c99a0e\") " 
pod="openshift-marketplace/redhat-marketplace-fd4cr" Mar 13 10:00:33 crc kubenswrapper[4841]: I0313 10:00:33.531256 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3996dfde-0955-4341-84a1-fa0515c99a0e-catalog-content\") pod \"redhat-marketplace-fd4cr\" (UID: \"3996dfde-0955-4341-84a1-fa0515c99a0e\") " pod="openshift-marketplace/redhat-marketplace-fd4cr" Mar 13 10:00:33 crc kubenswrapper[4841]: I0313 10:00:33.531397 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtqlx\" (UniqueName: \"kubernetes.io/projected/3996dfde-0955-4341-84a1-fa0515c99a0e-kube-api-access-qtqlx\") pod \"redhat-marketplace-fd4cr\" (UID: \"3996dfde-0955-4341-84a1-fa0515c99a0e\") " pod="openshift-marketplace/redhat-marketplace-fd4cr" Mar 13 10:00:33 crc kubenswrapper[4841]: I0313 10:00:33.532103 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3996dfde-0955-4341-84a1-fa0515c99a0e-utilities\") pod \"redhat-marketplace-fd4cr\" (UID: \"3996dfde-0955-4341-84a1-fa0515c99a0e\") " pod="openshift-marketplace/redhat-marketplace-fd4cr" Mar 13 10:00:33 crc kubenswrapper[4841]: I0313 10:00:33.532138 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3996dfde-0955-4341-84a1-fa0515c99a0e-catalog-content\") pod \"redhat-marketplace-fd4cr\" (UID: \"3996dfde-0955-4341-84a1-fa0515c99a0e\") " pod="openshift-marketplace/redhat-marketplace-fd4cr" Mar 13 10:00:33 crc kubenswrapper[4841]: I0313 10:00:33.566463 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtqlx\" (UniqueName: \"kubernetes.io/projected/3996dfde-0955-4341-84a1-fa0515c99a0e-kube-api-access-qtqlx\") pod \"redhat-marketplace-fd4cr\" (UID: \"3996dfde-0955-4341-84a1-fa0515c99a0e\") " 
pod="openshift-marketplace/redhat-marketplace-fd4cr" Mar 13 10:00:33 crc kubenswrapper[4841]: I0313 10:00:33.670440 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fd4cr" Mar 13 10:00:34 crc kubenswrapper[4841]: I0313 10:00:34.174485 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fd4cr"] Mar 13 10:00:34 crc kubenswrapper[4841]: I0313 10:00:34.407235 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 10:00:34 crc kubenswrapper[4841]: I0313 10:00:34.407645 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 10:00:35 crc kubenswrapper[4841]: I0313 10:00:35.142034 4841 generic.go:334] "Generic (PLEG): container finished" podID="3996dfde-0955-4341-84a1-fa0515c99a0e" containerID="276b46cb5e0b83434a931012b39028f553b3208a84d3175dd5cb944c2ad13308" exitCode=0 Mar 13 10:00:35 crc kubenswrapper[4841]: I0313 10:00:35.142114 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fd4cr" event={"ID":"3996dfde-0955-4341-84a1-fa0515c99a0e","Type":"ContainerDied","Data":"276b46cb5e0b83434a931012b39028f553b3208a84d3175dd5cb944c2ad13308"} Mar 13 10:00:35 crc kubenswrapper[4841]: I0313 10:00:35.142161 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fd4cr" 
event={"ID":"3996dfde-0955-4341-84a1-fa0515c99a0e","Type":"ContainerStarted","Data":"ffb9ab46b806ec4dfc13b9eb69f64ffa42d68c9f2c49576524b7525376640bb0"} Mar 13 10:00:37 crc kubenswrapper[4841]: I0313 10:00:37.167295 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fd4cr" event={"ID":"3996dfde-0955-4341-84a1-fa0515c99a0e","Type":"ContainerStarted","Data":"620a251bc5e843dfe5907cc2721036b53e2f796c7a26c2d52c2ab6094db7150e"} Mar 13 10:00:39 crc kubenswrapper[4841]: I0313 10:00:39.198598 4841 generic.go:334] "Generic (PLEG): container finished" podID="3996dfde-0955-4341-84a1-fa0515c99a0e" containerID="620a251bc5e843dfe5907cc2721036b53e2f796c7a26c2d52c2ab6094db7150e" exitCode=0 Mar 13 10:00:39 crc kubenswrapper[4841]: I0313 10:00:39.198686 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fd4cr" event={"ID":"3996dfde-0955-4341-84a1-fa0515c99a0e","Type":"ContainerDied","Data":"620a251bc5e843dfe5907cc2721036b53e2f796c7a26c2d52c2ab6094db7150e"} Mar 13 10:00:40 crc kubenswrapper[4841]: I0313 10:00:40.209896 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fd4cr" event={"ID":"3996dfde-0955-4341-84a1-fa0515c99a0e","Type":"ContainerStarted","Data":"6febe72f8da0c070a651c5d5a033a93d78ba9d883a6cb3e4749e756a0197b3c6"} Mar 13 10:00:40 crc kubenswrapper[4841]: I0313 10:00:40.235315 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fd4cr" podStartSLOduration=2.732222875 podStartE2EDuration="7.235295372s" podCreationTimestamp="2026-03-13 10:00:33 +0000 UTC" firstStartedPulling="2026-03-13 10:00:35.144721347 +0000 UTC m=+2917.874621538" lastFinishedPulling="2026-03-13 10:00:39.647793844 +0000 UTC m=+2922.377694035" observedRunningTime="2026-03-13 10:00:40.226516967 +0000 UTC m=+2922.956417158" watchObservedRunningTime="2026-03-13 10:00:40.235295372 +0000 UTC 
m=+2922.965195563" Mar 13 10:00:43 crc kubenswrapper[4841]: I0313 10:00:43.671536 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fd4cr" Mar 13 10:00:43 crc kubenswrapper[4841]: I0313 10:00:43.672192 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fd4cr" Mar 13 10:00:43 crc kubenswrapper[4841]: I0313 10:00:43.724041 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fd4cr" Mar 13 10:00:44 crc kubenswrapper[4841]: I0313 10:00:44.291728 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fd4cr" Mar 13 10:00:44 crc kubenswrapper[4841]: I0313 10:00:44.342842 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fd4cr"] Mar 13 10:00:46 crc kubenswrapper[4841]: I0313 10:00:46.260949 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fd4cr" podUID="3996dfde-0955-4341-84a1-fa0515c99a0e" containerName="registry-server" containerID="cri-o://6febe72f8da0c070a651c5d5a033a93d78ba9d883a6cb3e4749e756a0197b3c6" gracePeriod=2 Mar 13 10:00:46 crc kubenswrapper[4841]: I0313 10:00:46.719894 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fd4cr" Mar 13 10:00:46 crc kubenswrapper[4841]: I0313 10:00:46.814839 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3996dfde-0955-4341-84a1-fa0515c99a0e-utilities\") pod \"3996dfde-0955-4341-84a1-fa0515c99a0e\" (UID: \"3996dfde-0955-4341-84a1-fa0515c99a0e\") " Mar 13 10:00:46 crc kubenswrapper[4841]: I0313 10:00:46.814912 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3996dfde-0955-4341-84a1-fa0515c99a0e-catalog-content\") pod \"3996dfde-0955-4341-84a1-fa0515c99a0e\" (UID: \"3996dfde-0955-4341-84a1-fa0515c99a0e\") " Mar 13 10:00:46 crc kubenswrapper[4841]: I0313 10:00:46.814968 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtqlx\" (UniqueName: \"kubernetes.io/projected/3996dfde-0955-4341-84a1-fa0515c99a0e-kube-api-access-qtqlx\") pod \"3996dfde-0955-4341-84a1-fa0515c99a0e\" (UID: \"3996dfde-0955-4341-84a1-fa0515c99a0e\") " Mar 13 10:00:46 crc kubenswrapper[4841]: I0313 10:00:46.816005 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3996dfde-0955-4341-84a1-fa0515c99a0e-utilities" (OuterVolumeSpecName: "utilities") pod "3996dfde-0955-4341-84a1-fa0515c99a0e" (UID: "3996dfde-0955-4341-84a1-fa0515c99a0e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:00:46 crc kubenswrapper[4841]: I0313 10:00:46.819897 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3996dfde-0955-4341-84a1-fa0515c99a0e-kube-api-access-qtqlx" (OuterVolumeSpecName: "kube-api-access-qtqlx") pod "3996dfde-0955-4341-84a1-fa0515c99a0e" (UID: "3996dfde-0955-4341-84a1-fa0515c99a0e"). InnerVolumeSpecName "kube-api-access-qtqlx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:00:46 crc kubenswrapper[4841]: I0313 10:00:46.844518 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3996dfde-0955-4341-84a1-fa0515c99a0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3996dfde-0955-4341-84a1-fa0515c99a0e" (UID: "3996dfde-0955-4341-84a1-fa0515c99a0e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:00:46 crc kubenswrapper[4841]: I0313 10:00:46.916690 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3996dfde-0955-4341-84a1-fa0515c99a0e-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 10:00:46 crc kubenswrapper[4841]: I0313 10:00:46.916726 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3996dfde-0955-4341-84a1-fa0515c99a0e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 10:00:46 crc kubenswrapper[4841]: I0313 10:00:46.916741 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtqlx\" (UniqueName: \"kubernetes.io/projected/3996dfde-0955-4341-84a1-fa0515c99a0e-kube-api-access-qtqlx\") on node \"crc\" DevicePath \"\"" Mar 13 10:00:47 crc kubenswrapper[4841]: I0313 10:00:47.273192 4841 generic.go:334] "Generic (PLEG): container finished" podID="3996dfde-0955-4341-84a1-fa0515c99a0e" containerID="6febe72f8da0c070a651c5d5a033a93d78ba9d883a6cb3e4749e756a0197b3c6" exitCode=0 Mar 13 10:00:47 crc kubenswrapper[4841]: I0313 10:00:47.273509 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fd4cr" Mar 13 10:00:47 crc kubenswrapper[4841]: I0313 10:00:47.273393 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fd4cr" event={"ID":"3996dfde-0955-4341-84a1-fa0515c99a0e","Type":"ContainerDied","Data":"6febe72f8da0c070a651c5d5a033a93d78ba9d883a6cb3e4749e756a0197b3c6"} Mar 13 10:00:47 crc kubenswrapper[4841]: I0313 10:00:47.273634 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fd4cr" event={"ID":"3996dfde-0955-4341-84a1-fa0515c99a0e","Type":"ContainerDied","Data":"ffb9ab46b806ec4dfc13b9eb69f64ffa42d68c9f2c49576524b7525376640bb0"} Mar 13 10:00:47 crc kubenswrapper[4841]: I0313 10:00:47.273662 4841 scope.go:117] "RemoveContainer" containerID="6febe72f8da0c070a651c5d5a033a93d78ba9d883a6cb3e4749e756a0197b3c6" Mar 13 10:00:47 crc kubenswrapper[4841]: I0313 10:00:47.299493 4841 scope.go:117] "RemoveContainer" containerID="620a251bc5e843dfe5907cc2721036b53e2f796c7a26c2d52c2ab6094db7150e" Mar 13 10:00:47 crc kubenswrapper[4841]: I0313 10:00:47.331078 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fd4cr"] Mar 13 10:00:47 crc kubenswrapper[4841]: I0313 10:00:47.334094 4841 scope.go:117] "RemoveContainer" containerID="276b46cb5e0b83434a931012b39028f553b3208a84d3175dd5cb944c2ad13308" Mar 13 10:00:47 crc kubenswrapper[4841]: I0313 10:00:47.339092 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fd4cr"] Mar 13 10:00:47 crc kubenswrapper[4841]: I0313 10:00:47.389493 4841 scope.go:117] "RemoveContainer" containerID="6febe72f8da0c070a651c5d5a033a93d78ba9d883a6cb3e4749e756a0197b3c6" Mar 13 10:00:47 crc kubenswrapper[4841]: E0313 10:00:47.390110 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6febe72f8da0c070a651c5d5a033a93d78ba9d883a6cb3e4749e756a0197b3c6\": container with ID starting with 6febe72f8da0c070a651c5d5a033a93d78ba9d883a6cb3e4749e756a0197b3c6 not found: ID does not exist" containerID="6febe72f8da0c070a651c5d5a033a93d78ba9d883a6cb3e4749e756a0197b3c6" Mar 13 10:00:47 crc kubenswrapper[4841]: I0313 10:00:47.390164 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6febe72f8da0c070a651c5d5a033a93d78ba9d883a6cb3e4749e756a0197b3c6"} err="failed to get container status \"6febe72f8da0c070a651c5d5a033a93d78ba9d883a6cb3e4749e756a0197b3c6\": rpc error: code = NotFound desc = could not find container \"6febe72f8da0c070a651c5d5a033a93d78ba9d883a6cb3e4749e756a0197b3c6\": container with ID starting with 6febe72f8da0c070a651c5d5a033a93d78ba9d883a6cb3e4749e756a0197b3c6 not found: ID does not exist" Mar 13 10:00:47 crc kubenswrapper[4841]: I0313 10:00:47.390196 4841 scope.go:117] "RemoveContainer" containerID="620a251bc5e843dfe5907cc2721036b53e2f796c7a26c2d52c2ab6094db7150e" Mar 13 10:00:47 crc kubenswrapper[4841]: E0313 10:00:47.390566 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"620a251bc5e843dfe5907cc2721036b53e2f796c7a26c2d52c2ab6094db7150e\": container with ID starting with 620a251bc5e843dfe5907cc2721036b53e2f796c7a26c2d52c2ab6094db7150e not found: ID does not exist" containerID="620a251bc5e843dfe5907cc2721036b53e2f796c7a26c2d52c2ab6094db7150e" Mar 13 10:00:47 crc kubenswrapper[4841]: I0313 10:00:47.390626 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"620a251bc5e843dfe5907cc2721036b53e2f796c7a26c2d52c2ab6094db7150e"} err="failed to get container status \"620a251bc5e843dfe5907cc2721036b53e2f796c7a26c2d52c2ab6094db7150e\": rpc error: code = NotFound desc = could not find container \"620a251bc5e843dfe5907cc2721036b53e2f796c7a26c2d52c2ab6094db7150e\": container with ID 
starting with 620a251bc5e843dfe5907cc2721036b53e2f796c7a26c2d52c2ab6094db7150e not found: ID does not exist" Mar 13 10:00:47 crc kubenswrapper[4841]: I0313 10:00:47.390651 4841 scope.go:117] "RemoveContainer" containerID="276b46cb5e0b83434a931012b39028f553b3208a84d3175dd5cb944c2ad13308" Mar 13 10:00:47 crc kubenswrapper[4841]: E0313 10:00:47.390929 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"276b46cb5e0b83434a931012b39028f553b3208a84d3175dd5cb944c2ad13308\": container with ID starting with 276b46cb5e0b83434a931012b39028f553b3208a84d3175dd5cb944c2ad13308 not found: ID does not exist" containerID="276b46cb5e0b83434a931012b39028f553b3208a84d3175dd5cb944c2ad13308" Mar 13 10:00:47 crc kubenswrapper[4841]: I0313 10:00:47.390953 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"276b46cb5e0b83434a931012b39028f553b3208a84d3175dd5cb944c2ad13308"} err="failed to get container status \"276b46cb5e0b83434a931012b39028f553b3208a84d3175dd5cb944c2ad13308\": rpc error: code = NotFound desc = could not find container \"276b46cb5e0b83434a931012b39028f553b3208a84d3175dd5cb944c2ad13308\": container with ID starting with 276b46cb5e0b83434a931012b39028f553b3208a84d3175dd5cb944c2ad13308 not found: ID does not exist" Mar 13 10:00:48 crc kubenswrapper[4841]: I0313 10:00:48.010632 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3996dfde-0955-4341-84a1-fa0515c99a0e" path="/var/lib/kubelet/pods/3996dfde-0955-4341-84a1-fa0515c99a0e/volumes" Mar 13 10:01:00 crc kubenswrapper[4841]: I0313 10:01:00.159924 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29556601-kkh2k"] Mar 13 10:01:00 crc kubenswrapper[4841]: E0313 10:01:00.161173 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3996dfde-0955-4341-84a1-fa0515c99a0e" containerName="extract-content" Mar 13 10:01:00 crc 
kubenswrapper[4841]: I0313 10:01:00.161192 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3996dfde-0955-4341-84a1-fa0515c99a0e" containerName="extract-content" Mar 13 10:01:00 crc kubenswrapper[4841]: E0313 10:01:00.161247 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3996dfde-0955-4341-84a1-fa0515c99a0e" containerName="registry-server" Mar 13 10:01:00 crc kubenswrapper[4841]: I0313 10:01:00.161256 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3996dfde-0955-4341-84a1-fa0515c99a0e" containerName="registry-server" Mar 13 10:01:00 crc kubenswrapper[4841]: E0313 10:01:00.161286 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3996dfde-0955-4341-84a1-fa0515c99a0e" containerName="extract-utilities" Mar 13 10:01:00 crc kubenswrapper[4841]: I0313 10:01:00.161296 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3996dfde-0955-4341-84a1-fa0515c99a0e" containerName="extract-utilities" Mar 13 10:01:00 crc kubenswrapper[4841]: I0313 10:01:00.161535 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="3996dfde-0955-4341-84a1-fa0515c99a0e" containerName="registry-server" Mar 13 10:01:00 crc kubenswrapper[4841]: I0313 10:01:00.162420 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29556601-kkh2k" Mar 13 10:01:00 crc kubenswrapper[4841]: I0313 10:01:00.173800 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29556601-kkh2k"] Mar 13 10:01:00 crc kubenswrapper[4841]: I0313 10:01:00.193411 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a76cc0-6588-4160-8580-766a47f207e6-combined-ca-bundle\") pod \"keystone-cron-29556601-kkh2k\" (UID: \"d8a76cc0-6588-4160-8580-766a47f207e6\") " pod="openstack/keystone-cron-29556601-kkh2k" Mar 13 10:01:00 crc kubenswrapper[4841]: I0313 10:01:00.193483 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8a76cc0-6588-4160-8580-766a47f207e6-fernet-keys\") pod \"keystone-cron-29556601-kkh2k\" (UID: \"d8a76cc0-6588-4160-8580-766a47f207e6\") " pod="openstack/keystone-cron-29556601-kkh2k" Mar 13 10:01:00 crc kubenswrapper[4841]: I0313 10:01:00.193711 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8a76cc0-6588-4160-8580-766a47f207e6-config-data\") pod \"keystone-cron-29556601-kkh2k\" (UID: \"d8a76cc0-6588-4160-8580-766a47f207e6\") " pod="openstack/keystone-cron-29556601-kkh2k" Mar 13 10:01:00 crc kubenswrapper[4841]: I0313 10:01:00.193869 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhzq9\" (UniqueName: \"kubernetes.io/projected/d8a76cc0-6588-4160-8580-766a47f207e6-kube-api-access-mhzq9\") pod \"keystone-cron-29556601-kkh2k\" (UID: \"d8a76cc0-6588-4160-8580-766a47f207e6\") " pod="openstack/keystone-cron-29556601-kkh2k" Mar 13 10:01:00 crc kubenswrapper[4841]: I0313 10:01:00.296858 4841 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a76cc0-6588-4160-8580-766a47f207e6-combined-ca-bundle\") pod \"keystone-cron-29556601-kkh2k\" (UID: \"d8a76cc0-6588-4160-8580-766a47f207e6\") " pod="openstack/keystone-cron-29556601-kkh2k" Mar 13 10:01:00 crc kubenswrapper[4841]: I0313 10:01:00.296924 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8a76cc0-6588-4160-8580-766a47f207e6-fernet-keys\") pod \"keystone-cron-29556601-kkh2k\" (UID: \"d8a76cc0-6588-4160-8580-766a47f207e6\") " pod="openstack/keystone-cron-29556601-kkh2k" Mar 13 10:01:00 crc kubenswrapper[4841]: I0313 10:01:00.296970 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8a76cc0-6588-4160-8580-766a47f207e6-config-data\") pod \"keystone-cron-29556601-kkh2k\" (UID: \"d8a76cc0-6588-4160-8580-766a47f207e6\") " pod="openstack/keystone-cron-29556601-kkh2k" Mar 13 10:01:00 crc kubenswrapper[4841]: I0313 10:01:00.297033 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhzq9\" (UniqueName: \"kubernetes.io/projected/d8a76cc0-6588-4160-8580-766a47f207e6-kube-api-access-mhzq9\") pod \"keystone-cron-29556601-kkh2k\" (UID: \"d8a76cc0-6588-4160-8580-766a47f207e6\") " pod="openstack/keystone-cron-29556601-kkh2k" Mar 13 10:01:00 crc kubenswrapper[4841]: I0313 10:01:00.303240 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8a76cc0-6588-4160-8580-766a47f207e6-fernet-keys\") pod \"keystone-cron-29556601-kkh2k\" (UID: \"d8a76cc0-6588-4160-8580-766a47f207e6\") " pod="openstack/keystone-cron-29556601-kkh2k" Mar 13 10:01:00 crc kubenswrapper[4841]: I0313 10:01:00.303659 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d8a76cc0-6588-4160-8580-766a47f207e6-config-data\") pod \"keystone-cron-29556601-kkh2k\" (UID: \"d8a76cc0-6588-4160-8580-766a47f207e6\") " pod="openstack/keystone-cron-29556601-kkh2k" Mar 13 10:01:00 crc kubenswrapper[4841]: I0313 10:01:00.304368 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a76cc0-6588-4160-8580-766a47f207e6-combined-ca-bundle\") pod \"keystone-cron-29556601-kkh2k\" (UID: \"d8a76cc0-6588-4160-8580-766a47f207e6\") " pod="openstack/keystone-cron-29556601-kkh2k" Mar 13 10:01:00 crc kubenswrapper[4841]: I0313 10:01:00.313789 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhzq9\" (UniqueName: \"kubernetes.io/projected/d8a76cc0-6588-4160-8580-766a47f207e6-kube-api-access-mhzq9\") pod \"keystone-cron-29556601-kkh2k\" (UID: \"d8a76cc0-6588-4160-8580-766a47f207e6\") " pod="openstack/keystone-cron-29556601-kkh2k" Mar 13 10:01:00 crc kubenswrapper[4841]: I0313 10:01:00.504906 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29556601-kkh2k" Mar 13 10:01:01 crc kubenswrapper[4841]: I0313 10:01:01.920223 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29556601-kkh2k"] Mar 13 10:01:02 crc kubenswrapper[4841]: I0313 10:01:02.439483 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29556601-kkh2k" event={"ID":"d8a76cc0-6588-4160-8580-766a47f207e6","Type":"ContainerStarted","Data":"af0fdcde6615493ef13c4e28169bce03fb510dc4b3e7004304bf4841d27ab2a1"} Mar 13 10:01:02 crc kubenswrapper[4841]: I0313 10:01:02.439879 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29556601-kkh2k" event={"ID":"d8a76cc0-6588-4160-8580-766a47f207e6","Type":"ContainerStarted","Data":"2bc6a65b7618fc70a9d0d48a40be544af8e07c0917fb2dc1c5e3940b1c428bf2"} Mar 13 10:01:02 crc kubenswrapper[4841]: I0313 10:01:02.468866 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29556601-kkh2k" podStartSLOduration=2.468844846 podStartE2EDuration="2.468844846s" podCreationTimestamp="2026-03-13 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:01:02.452921366 +0000 UTC m=+2945.182821577" watchObservedRunningTime="2026-03-13 10:01:02.468844846 +0000 UTC m=+2945.198745057" Mar 13 10:01:04 crc kubenswrapper[4841]: I0313 10:01:04.407139 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 10:01:04 crc kubenswrapper[4841]: I0313 10:01:04.407503 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" 
podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 10:01:04 crc kubenswrapper[4841]: I0313 10:01:04.487008 4841 generic.go:334] "Generic (PLEG): container finished" podID="d8a76cc0-6588-4160-8580-766a47f207e6" containerID="af0fdcde6615493ef13c4e28169bce03fb510dc4b3e7004304bf4841d27ab2a1" exitCode=0 Mar 13 10:01:04 crc kubenswrapper[4841]: I0313 10:01:04.487051 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29556601-kkh2k" event={"ID":"d8a76cc0-6588-4160-8580-766a47f207e6","Type":"ContainerDied","Data":"af0fdcde6615493ef13c4e28169bce03fb510dc4b3e7004304bf4841d27ab2a1"} Mar 13 10:01:05 crc kubenswrapper[4841]: I0313 10:01:05.172516 4841 scope.go:117] "RemoveContainer" containerID="c140fa68774d5707fef40bcf227291d5b8cf288c1244f6933833bc2b04e31f7a" Mar 13 10:01:05 crc kubenswrapper[4841]: I0313 10:01:05.199701 4841 scope.go:117] "RemoveContainer" containerID="06eb8fef5e582cbf6ca81120e42cdac1e2257a0ab9785b210a2ca494d491f9ca" Mar 13 10:01:05 crc kubenswrapper[4841]: I0313 10:01:05.884323 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29556601-kkh2k" Mar 13 10:01:06 crc kubenswrapper[4841]: I0313 10:01:06.017435 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a76cc0-6588-4160-8580-766a47f207e6-combined-ca-bundle\") pod \"d8a76cc0-6588-4160-8580-766a47f207e6\" (UID: \"d8a76cc0-6588-4160-8580-766a47f207e6\") " Mar 13 10:01:06 crc kubenswrapper[4841]: I0313 10:01:06.017525 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8a76cc0-6588-4160-8580-766a47f207e6-config-data\") pod \"d8a76cc0-6588-4160-8580-766a47f207e6\" (UID: \"d8a76cc0-6588-4160-8580-766a47f207e6\") " Mar 13 10:01:06 crc kubenswrapper[4841]: I0313 10:01:06.017567 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8a76cc0-6588-4160-8580-766a47f207e6-fernet-keys\") pod \"d8a76cc0-6588-4160-8580-766a47f207e6\" (UID: \"d8a76cc0-6588-4160-8580-766a47f207e6\") " Mar 13 10:01:06 crc kubenswrapper[4841]: I0313 10:01:06.017873 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhzq9\" (UniqueName: \"kubernetes.io/projected/d8a76cc0-6588-4160-8580-766a47f207e6-kube-api-access-mhzq9\") pod \"d8a76cc0-6588-4160-8580-766a47f207e6\" (UID: \"d8a76cc0-6588-4160-8580-766a47f207e6\") " Mar 13 10:01:06 crc kubenswrapper[4841]: I0313 10:01:06.023493 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8a76cc0-6588-4160-8580-766a47f207e6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d8a76cc0-6588-4160-8580-766a47f207e6" (UID: "d8a76cc0-6588-4160-8580-766a47f207e6"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:01:06 crc kubenswrapper[4841]: I0313 10:01:06.023679 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8a76cc0-6588-4160-8580-766a47f207e6-kube-api-access-mhzq9" (OuterVolumeSpecName: "kube-api-access-mhzq9") pod "d8a76cc0-6588-4160-8580-766a47f207e6" (UID: "d8a76cc0-6588-4160-8580-766a47f207e6"). InnerVolumeSpecName "kube-api-access-mhzq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:01:06 crc kubenswrapper[4841]: I0313 10:01:06.047386 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8a76cc0-6588-4160-8580-766a47f207e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8a76cc0-6588-4160-8580-766a47f207e6" (UID: "d8a76cc0-6588-4160-8580-766a47f207e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:01:06 crc kubenswrapper[4841]: I0313 10:01:06.085202 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8a76cc0-6588-4160-8580-766a47f207e6-config-data" (OuterVolumeSpecName: "config-data") pod "d8a76cc0-6588-4160-8580-766a47f207e6" (UID: "d8a76cc0-6588-4160-8580-766a47f207e6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:01:06 crc kubenswrapper[4841]: I0313 10:01:06.121328 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhzq9\" (UniqueName: \"kubernetes.io/projected/d8a76cc0-6588-4160-8580-766a47f207e6-kube-api-access-mhzq9\") on node \"crc\" DevicePath \"\"" Mar 13 10:01:06 crc kubenswrapper[4841]: I0313 10:01:06.121601 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a76cc0-6588-4160-8580-766a47f207e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 10:01:06 crc kubenswrapper[4841]: I0313 10:01:06.121692 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8a76cc0-6588-4160-8580-766a47f207e6-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 10:01:06 crc kubenswrapper[4841]: I0313 10:01:06.121788 4841 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8a76cc0-6588-4160-8580-766a47f207e6-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 13 10:01:06 crc kubenswrapper[4841]: I0313 10:01:06.505754 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29556601-kkh2k" event={"ID":"d8a76cc0-6588-4160-8580-766a47f207e6","Type":"ContainerDied","Data":"2bc6a65b7618fc70a9d0d48a40be544af8e07c0917fb2dc1c5e3940b1c428bf2"} Mar 13 10:01:06 crc kubenswrapper[4841]: I0313 10:01:06.506089 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bc6a65b7618fc70a9d0d48a40be544af8e07c0917fb2dc1c5e3940b1c428bf2" Mar 13 10:01:06 crc kubenswrapper[4841]: I0313 10:01:06.505812 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29556601-kkh2k" Mar 13 10:01:34 crc kubenswrapper[4841]: I0313 10:01:34.407244 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 10:01:34 crc kubenswrapper[4841]: I0313 10:01:34.407907 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 10:01:34 crc kubenswrapper[4841]: I0313 10:01:34.407970 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 10:01:34 crc kubenswrapper[4841]: I0313 10:01:34.408796 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b"} pod="openshift-machine-config-operator/machine-config-daemon-h227v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 10:01:34 crc kubenswrapper[4841]: I0313 10:01:34.408888 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" containerID="cri-o://e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" gracePeriod=600 Mar 13 10:01:34 crc kubenswrapper[4841]: E0313 10:01:34.554998 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:01:34 crc kubenswrapper[4841]: I0313 10:01:34.781369 4841 generic.go:334] "Generic (PLEG): container finished" podID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" exitCode=0 Mar 13 10:01:34 crc kubenswrapper[4841]: I0313 10:01:34.781417 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerDied","Data":"e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b"} Mar 13 10:01:34 crc kubenswrapper[4841]: I0313 10:01:34.781479 4841 scope.go:117] "RemoveContainer" containerID="356a0566b4669fa620204281729f9d4a5c82961594bf8afbfe26e440c8bc1ad1" Mar 13 10:01:34 crc kubenswrapper[4841]: I0313 10:01:34.782004 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" Mar 13 10:01:34 crc kubenswrapper[4841]: E0313 10:01:34.782371 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:01:45 crc kubenswrapper[4841]: I0313 10:01:45.996504 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" Mar 13 10:01:46 crc 
kubenswrapper[4841]: E0313 10:01:45.997567 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:01:56 crc kubenswrapper[4841]: I0313 10:01:56.995644 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" Mar 13 10:01:56 crc kubenswrapper[4841]: E0313 10:01:56.996763 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:02:00 crc kubenswrapper[4841]: I0313 10:02:00.158490 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556602-n6zgh"] Mar 13 10:02:00 crc kubenswrapper[4841]: E0313 10:02:00.159616 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a76cc0-6588-4160-8580-766a47f207e6" containerName="keystone-cron" Mar 13 10:02:00 crc kubenswrapper[4841]: I0313 10:02:00.159633 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a76cc0-6588-4160-8580-766a47f207e6" containerName="keystone-cron" Mar 13 10:02:00 crc kubenswrapper[4841]: I0313 10:02:00.159866 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8a76cc0-6588-4160-8580-766a47f207e6" containerName="keystone-cron" Mar 13 10:02:00 crc kubenswrapper[4841]: I0313 10:02:00.160752 4841 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556602-n6zgh" Mar 13 10:02:00 crc kubenswrapper[4841]: I0313 10:02:00.162989 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 10:02:00 crc kubenswrapper[4841]: I0313 10:02:00.163237 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 10:02:00 crc kubenswrapper[4841]: I0313 10:02:00.163291 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 10:02:00 crc kubenswrapper[4841]: I0313 10:02:00.169057 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556602-n6zgh"] Mar 13 10:02:00 crc kubenswrapper[4841]: I0313 10:02:00.338886 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxvp8\" (UniqueName: \"kubernetes.io/projected/7bdbe8c3-132b-4800-b0ce-f9ef46c6deee-kube-api-access-hxvp8\") pod \"auto-csr-approver-29556602-n6zgh\" (UID: \"7bdbe8c3-132b-4800-b0ce-f9ef46c6deee\") " pod="openshift-infra/auto-csr-approver-29556602-n6zgh" Mar 13 10:02:00 crc kubenswrapper[4841]: I0313 10:02:00.441090 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxvp8\" (UniqueName: \"kubernetes.io/projected/7bdbe8c3-132b-4800-b0ce-f9ef46c6deee-kube-api-access-hxvp8\") pod \"auto-csr-approver-29556602-n6zgh\" (UID: \"7bdbe8c3-132b-4800-b0ce-f9ef46c6deee\") " pod="openshift-infra/auto-csr-approver-29556602-n6zgh" Mar 13 10:02:00 crc kubenswrapper[4841]: I0313 10:02:00.458940 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxvp8\" (UniqueName: \"kubernetes.io/projected/7bdbe8c3-132b-4800-b0ce-f9ef46c6deee-kube-api-access-hxvp8\") pod \"auto-csr-approver-29556602-n6zgh\" (UID: 
\"7bdbe8c3-132b-4800-b0ce-f9ef46c6deee\") " pod="openshift-infra/auto-csr-approver-29556602-n6zgh" Mar 13 10:02:00 crc kubenswrapper[4841]: I0313 10:02:00.502133 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556602-n6zgh" Mar 13 10:02:00 crc kubenswrapper[4841]: I0313 10:02:00.924726 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556602-n6zgh"] Mar 13 10:02:00 crc kubenswrapper[4841]: I0313 10:02:00.927256 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 10:02:01 crc kubenswrapper[4841]: I0313 10:02:01.043999 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556602-n6zgh" event={"ID":"7bdbe8c3-132b-4800-b0ce-f9ef46c6deee","Type":"ContainerStarted","Data":"348554711a3889f75868ef09c584c240618e59cc9848eaf7a44c1420a1804786"} Mar 13 10:02:03 crc kubenswrapper[4841]: I0313 10:02:03.065996 4841 generic.go:334] "Generic (PLEG): container finished" podID="7bdbe8c3-132b-4800-b0ce-f9ef46c6deee" containerID="b1fb7408425da464e82cd6947cedcefc62f27744b8c3a9c8f1f751e492a82024" exitCode=0 Mar 13 10:02:03 crc kubenswrapper[4841]: I0313 10:02:03.066066 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556602-n6zgh" event={"ID":"7bdbe8c3-132b-4800-b0ce-f9ef46c6deee","Type":"ContainerDied","Data":"b1fb7408425da464e82cd6947cedcefc62f27744b8c3a9c8f1f751e492a82024"} Mar 13 10:02:04 crc kubenswrapper[4841]: I0313 10:02:04.370143 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556602-n6zgh" Mar 13 10:02:04 crc kubenswrapper[4841]: I0313 10:02:04.415362 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxvp8\" (UniqueName: \"kubernetes.io/projected/7bdbe8c3-132b-4800-b0ce-f9ef46c6deee-kube-api-access-hxvp8\") pod \"7bdbe8c3-132b-4800-b0ce-f9ef46c6deee\" (UID: \"7bdbe8c3-132b-4800-b0ce-f9ef46c6deee\") " Mar 13 10:02:04 crc kubenswrapper[4841]: I0313 10:02:04.424642 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bdbe8c3-132b-4800-b0ce-f9ef46c6deee-kube-api-access-hxvp8" (OuterVolumeSpecName: "kube-api-access-hxvp8") pod "7bdbe8c3-132b-4800-b0ce-f9ef46c6deee" (UID: "7bdbe8c3-132b-4800-b0ce-f9ef46c6deee"). InnerVolumeSpecName "kube-api-access-hxvp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:02:04 crc kubenswrapper[4841]: I0313 10:02:04.519589 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxvp8\" (UniqueName: \"kubernetes.io/projected/7bdbe8c3-132b-4800-b0ce-f9ef46c6deee-kube-api-access-hxvp8\") on node \"crc\" DevicePath \"\"" Mar 13 10:02:05 crc kubenswrapper[4841]: I0313 10:02:05.088480 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556602-n6zgh" event={"ID":"7bdbe8c3-132b-4800-b0ce-f9ef46c6deee","Type":"ContainerDied","Data":"348554711a3889f75868ef09c584c240618e59cc9848eaf7a44c1420a1804786"} Mar 13 10:02:05 crc kubenswrapper[4841]: I0313 10:02:05.088525 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="348554711a3889f75868ef09c584c240618e59cc9848eaf7a44c1420a1804786" Mar 13 10:02:05 crc kubenswrapper[4841]: I0313 10:02:05.088612 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556602-n6zgh" Mar 13 10:02:05 crc kubenswrapper[4841]: I0313 10:02:05.431559 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556596-nv6jl"] Mar 13 10:02:05 crc kubenswrapper[4841]: I0313 10:02:05.440479 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556596-nv6jl"] Mar 13 10:02:06 crc kubenswrapper[4841]: I0313 10:02:06.008868 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f926f80-bd81-4cf9-991e-17c01455ee8a" path="/var/lib/kubelet/pods/6f926f80-bd81-4cf9-991e-17c01455ee8a/volumes" Mar 13 10:02:08 crc kubenswrapper[4841]: I0313 10:02:08.997582 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" Mar 13 10:02:08 crc kubenswrapper[4841]: E0313 10:02:08.998111 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:02:19 crc kubenswrapper[4841]: I0313 10:02:19.995238 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" Mar 13 10:02:19 crc kubenswrapper[4841]: E0313 10:02:19.997193 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" 
podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:02:33 crc kubenswrapper[4841]: I0313 10:02:33.995418 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" Mar 13 10:02:33 crc kubenswrapper[4841]: E0313 10:02:33.996408 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:02:36 crc kubenswrapper[4841]: I0313 10:02:36.113209 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-8485bdb9db-mf5lp_17824e5f-18b3-46c0-910a-56e5529e09c3/manager/0.log" Mar 13 10:02:44 crc kubenswrapper[4841]: I0313 10:02:44.994531 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" Mar 13 10:02:44 crc kubenswrapper[4841]: E0313 10:02:44.995335 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:02:49 crc kubenswrapper[4841]: I0313 10:02:49.867330 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6"] Mar 13 10:02:49 crc kubenswrapper[4841]: E0313 10:02:49.868156 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7bdbe8c3-132b-4800-b0ce-f9ef46c6deee" containerName="oc" Mar 13 10:02:49 crc kubenswrapper[4841]: I0313 10:02:49.868171 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bdbe8c3-132b-4800-b0ce-f9ef46c6deee" containerName="oc" Mar 13 10:02:49 crc kubenswrapper[4841]: I0313 10:02:49.868427 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bdbe8c3-132b-4800-b0ce-f9ef46c6deee" containerName="oc" Mar 13 10:02:49 crc kubenswrapper[4841]: I0313 10:02:49.870147 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6" Mar 13 10:02:49 crc kubenswrapper[4841]: I0313 10:02:49.872408 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 13 10:02:49 crc kubenswrapper[4841]: I0313 10:02:49.878762 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6"] Mar 13 10:02:49 crc kubenswrapper[4841]: I0313 10:02:49.899625 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/646b8b76-d9eb-4e25-bbe5-b6d42b9f0961-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6\" (UID: \"646b8b76-d9eb-4e25-bbe5-b6d42b9f0961\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6" Mar 13 10:02:49 crc kubenswrapper[4841]: I0313 10:02:49.900133 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/646b8b76-d9eb-4e25-bbe5-b6d42b9f0961-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6\" (UID: \"646b8b76-d9eb-4e25-bbe5-b6d42b9f0961\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6" 
Mar 13 10:02:49 crc kubenswrapper[4841]: I0313 10:02:49.900307 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbg75\" (UniqueName: \"kubernetes.io/projected/646b8b76-d9eb-4e25-bbe5-b6d42b9f0961-kube-api-access-rbg75\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6\" (UID: \"646b8b76-d9eb-4e25-bbe5-b6d42b9f0961\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6" Mar 13 10:02:50 crc kubenswrapper[4841]: I0313 10:02:50.001156 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbg75\" (UniqueName: \"kubernetes.io/projected/646b8b76-d9eb-4e25-bbe5-b6d42b9f0961-kube-api-access-rbg75\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6\" (UID: \"646b8b76-d9eb-4e25-bbe5-b6d42b9f0961\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6" Mar 13 10:02:50 crc kubenswrapper[4841]: I0313 10:02:50.002053 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/646b8b76-d9eb-4e25-bbe5-b6d42b9f0961-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6\" (UID: \"646b8b76-d9eb-4e25-bbe5-b6d42b9f0961\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6" Mar 13 10:02:50 crc kubenswrapper[4841]: I0313 10:02:50.002088 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/646b8b76-d9eb-4e25-bbe5-b6d42b9f0961-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6\" (UID: \"646b8b76-d9eb-4e25-bbe5-b6d42b9f0961\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6" Mar 13 10:02:50 crc kubenswrapper[4841]: I0313 10:02:50.002765 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/646b8b76-d9eb-4e25-bbe5-b6d42b9f0961-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6\" (UID: \"646b8b76-d9eb-4e25-bbe5-b6d42b9f0961\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6" Mar 13 10:02:50 crc kubenswrapper[4841]: I0313 10:02:50.003289 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/646b8b76-d9eb-4e25-bbe5-b6d42b9f0961-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6\" (UID: \"646b8b76-d9eb-4e25-bbe5-b6d42b9f0961\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6" Mar 13 10:02:50 crc kubenswrapper[4841]: I0313 10:02:50.025984 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbg75\" (UniqueName: \"kubernetes.io/projected/646b8b76-d9eb-4e25-bbe5-b6d42b9f0961-kube-api-access-rbg75\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6\" (UID: \"646b8b76-d9eb-4e25-bbe5-b6d42b9f0961\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6" Mar 13 10:02:50 crc kubenswrapper[4841]: I0313 10:02:50.190187 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6"
Mar 13 10:02:50 crc kubenswrapper[4841]: I0313 10:02:50.680941 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6"]
Mar 13 10:02:51 crc kubenswrapper[4841]: I0313 10:02:51.559824 4841 generic.go:334] "Generic (PLEG): container finished" podID="646b8b76-d9eb-4e25-bbe5-b6d42b9f0961" containerID="c028d71f8e8251d2aa743b46e575435a41dd594c2bbc570eaf11b19c9d634aa7" exitCode=0
Mar 13 10:02:51 crc kubenswrapper[4841]: I0313 10:02:51.559866 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6" event={"ID":"646b8b76-d9eb-4e25-bbe5-b6d42b9f0961","Type":"ContainerDied","Data":"c028d71f8e8251d2aa743b46e575435a41dd594c2bbc570eaf11b19c9d634aa7"}
Mar 13 10:02:51 crc kubenswrapper[4841]: I0313 10:02:51.560408 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6" event={"ID":"646b8b76-d9eb-4e25-bbe5-b6d42b9f0961","Type":"ContainerStarted","Data":"d1ca2495896f144844bacc0d8a2ad964d355e75e2bb144a10db319bde187f316"}
Mar 13 10:02:53 crc kubenswrapper[4841]: I0313 10:02:53.578025 4841 generic.go:334] "Generic (PLEG): container finished" podID="646b8b76-d9eb-4e25-bbe5-b6d42b9f0961" containerID="157fdf2512ebcf2a527a4e96ed88efa1e4e14a8ec33d1cf70afda0d79a8806b2" exitCode=0
Mar 13 10:02:53 crc kubenswrapper[4841]: I0313 10:02:53.578105 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6" event={"ID":"646b8b76-d9eb-4e25-bbe5-b6d42b9f0961","Type":"ContainerDied","Data":"157fdf2512ebcf2a527a4e96ed88efa1e4e14a8ec33d1cf70afda0d79a8806b2"}
Mar 13 10:02:54 crc kubenswrapper[4841]: I0313 10:02:54.589191 4841 generic.go:334] "Generic (PLEG): container finished" podID="646b8b76-d9eb-4e25-bbe5-b6d42b9f0961" containerID="8cfb1e50d25a9e99e8f2f86da803af5dce758d399f85c6912a80e027b5b30d65" exitCode=0
Mar 13 10:02:54 crc kubenswrapper[4841]: I0313 10:02:54.589341 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6" event={"ID":"646b8b76-d9eb-4e25-bbe5-b6d42b9f0961","Type":"ContainerDied","Data":"8cfb1e50d25a9e99e8f2f86da803af5dce758d399f85c6912a80e027b5b30d65"}
Mar 13 10:02:55 crc kubenswrapper[4841]: I0313 10:02:55.957097 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6"
Mar 13 10:02:56 crc kubenswrapper[4841]: I0313 10:02:56.021006 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/646b8b76-d9eb-4e25-bbe5-b6d42b9f0961-bundle\") pod \"646b8b76-d9eb-4e25-bbe5-b6d42b9f0961\" (UID: \"646b8b76-d9eb-4e25-bbe5-b6d42b9f0961\") "
Mar 13 10:02:56 crc kubenswrapper[4841]: I0313 10:02:56.021144 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/646b8b76-d9eb-4e25-bbe5-b6d42b9f0961-util\") pod \"646b8b76-d9eb-4e25-bbe5-b6d42b9f0961\" (UID: \"646b8b76-d9eb-4e25-bbe5-b6d42b9f0961\") "
Mar 13 10:02:56 crc kubenswrapper[4841]: I0313 10:02:56.021486 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbg75\" (UniqueName: \"kubernetes.io/projected/646b8b76-d9eb-4e25-bbe5-b6d42b9f0961-kube-api-access-rbg75\") pod \"646b8b76-d9eb-4e25-bbe5-b6d42b9f0961\" (UID: \"646b8b76-d9eb-4e25-bbe5-b6d42b9f0961\") "
Mar 13 10:02:56 crc kubenswrapper[4841]: I0313 10:02:56.023732 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/646b8b76-d9eb-4e25-bbe5-b6d42b9f0961-bundle" (OuterVolumeSpecName: "bundle") pod "646b8b76-d9eb-4e25-bbe5-b6d42b9f0961" (UID: "646b8b76-d9eb-4e25-bbe5-b6d42b9f0961"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 10:02:56 crc kubenswrapper[4841]: I0313 10:02:56.026737 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/646b8b76-d9eb-4e25-bbe5-b6d42b9f0961-kube-api-access-rbg75" (OuterVolumeSpecName: "kube-api-access-rbg75") pod "646b8b76-d9eb-4e25-bbe5-b6d42b9f0961" (UID: "646b8b76-d9eb-4e25-bbe5-b6d42b9f0961"). InnerVolumeSpecName "kube-api-access-rbg75". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:02:56 crc kubenswrapper[4841]: I0313 10:02:56.036891 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/646b8b76-d9eb-4e25-bbe5-b6d42b9f0961-util" (OuterVolumeSpecName: "util") pod "646b8b76-d9eb-4e25-bbe5-b6d42b9f0961" (UID: "646b8b76-d9eb-4e25-bbe5-b6d42b9f0961"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 10:02:56 crc kubenswrapper[4841]: I0313 10:02:56.124092 4841 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/646b8b76-d9eb-4e25-bbe5-b6d42b9f0961-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 10:02:56 crc kubenswrapper[4841]: I0313 10:02:56.124143 4841 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/646b8b76-d9eb-4e25-bbe5-b6d42b9f0961-util\") on node \"crc\" DevicePath \"\""
Mar 13 10:02:56 crc kubenswrapper[4841]: I0313 10:02:56.124157 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbg75\" (UniqueName: \"kubernetes.io/projected/646b8b76-d9eb-4e25-bbe5-b6d42b9f0961-kube-api-access-rbg75\") on node \"crc\" DevicePath \"\""
Mar 13 10:02:56 crc kubenswrapper[4841]: I0313 10:02:56.608786 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6" event={"ID":"646b8b76-d9eb-4e25-bbe5-b6d42b9f0961","Type":"ContainerDied","Data":"d1ca2495896f144844bacc0d8a2ad964d355e75e2bb144a10db319bde187f316"}
Mar 13 10:02:56 crc kubenswrapper[4841]: I0313 10:02:56.609090 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1ca2495896f144844bacc0d8a2ad964d355e75e2bb144a10db319bde187f316"
Mar 13 10:02:56 crc kubenswrapper[4841]: I0313 10:02:56.608899 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6"
Mar 13 10:02:56 crc kubenswrapper[4841]: I0313 10:02:56.994998 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b"
Mar 13 10:02:56 crc kubenswrapper[4841]: E0313 10:02:56.996541 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2"
Mar 13 10:03:05 crc kubenswrapper[4841]: I0313 10:03:05.358552 4841 scope.go:117] "RemoveContainer" containerID="f972b828b5e15c0256a1e19ac41c80e97d71b89ae99f857d028f4c3b0b316aee"
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.838552 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-z7wl5"]
Mar 13 10:03:07 crc kubenswrapper[4841]: E0313 10:03:07.839398 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="646b8b76-d9eb-4e25-bbe5-b6d42b9f0961" containerName="util"
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.839411 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="646b8b76-d9eb-4e25-bbe5-b6d42b9f0961" containerName="util"
Mar 13 10:03:07 crc kubenswrapper[4841]: E0313 10:03:07.839425 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="646b8b76-d9eb-4e25-bbe5-b6d42b9f0961" containerName="pull"
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.839430 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="646b8b76-d9eb-4e25-bbe5-b6d42b9f0961" containerName="pull"
Mar 13 10:03:07 crc kubenswrapper[4841]: E0313 10:03:07.839440 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="646b8b76-d9eb-4e25-bbe5-b6d42b9f0961" containerName="extract"
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.839446 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="646b8b76-d9eb-4e25-bbe5-b6d42b9f0961" containerName="extract"
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.839634 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="646b8b76-d9eb-4e25-bbe5-b6d42b9f0961" containerName="extract"
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.840320 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-z7wl5"
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.843016 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.843106 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.844583 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-kgzqd"
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.859052 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-z7wl5"]
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.863991 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khcck\" (UniqueName: \"kubernetes.io/projected/08be6515-b41c-481b-ba89-b939e4cfa067-kube-api-access-khcck\") pod \"obo-prometheus-operator-68bc856cb9-z7wl5\" (UID: \"08be6515-b41c-481b-ba89-b939e4cfa067\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-z7wl5"
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.890710 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-646dp"]
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.892768 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-646dp"
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.897805 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.897971 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-29pwz"
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.924059 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-njcw5"]
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.925720 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-njcw5"
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.956353 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-646dp"]
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.966435 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3fde31d7-89e1-4aa5-a848-2b018eae16b1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5586657968-njcw5\" (UID: \"3fde31d7-89e1-4aa5-a848-2b018eae16b1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-njcw5"
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.966570 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3304bfd0-8191-45c7-8c50-f16e137a6de8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5586657968-646dp\" (UID: \"3304bfd0-8191-45c7-8c50-f16e137a6de8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-646dp"
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.966598 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3304bfd0-8191-45c7-8c50-f16e137a6de8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5586657968-646dp\" (UID: \"3304bfd0-8191-45c7-8c50-f16e137a6de8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-646dp"
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.966689 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khcck\" (UniqueName: \"kubernetes.io/projected/08be6515-b41c-481b-ba89-b939e4cfa067-kube-api-access-khcck\") pod \"obo-prometheus-operator-68bc856cb9-z7wl5\" (UID: \"08be6515-b41c-481b-ba89-b939e4cfa067\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-z7wl5"
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.966842 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3fde31d7-89e1-4aa5-a848-2b018eae16b1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5586657968-njcw5\" (UID: \"3fde31d7-89e1-4aa5-a848-2b018eae16b1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-njcw5"
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.968055 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-njcw5"]
Mar 13 10:03:07 crc kubenswrapper[4841]: I0313 10:03:07.993738 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khcck\" (UniqueName: \"kubernetes.io/projected/08be6515-b41c-481b-ba89-b939e4cfa067-kube-api-access-khcck\") pod \"obo-prometheus-operator-68bc856cb9-z7wl5\" (UID: \"08be6515-b41c-481b-ba89-b939e4cfa067\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-z7wl5"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.069839 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3fde31d7-89e1-4aa5-a848-2b018eae16b1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5586657968-njcw5\" (UID: \"3fde31d7-89e1-4aa5-a848-2b018eae16b1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-njcw5"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.071372 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3fde31d7-89e1-4aa5-a848-2b018eae16b1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5586657968-njcw5\" (UID: \"3fde31d7-89e1-4aa5-a848-2b018eae16b1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-njcw5"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.071518 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3304bfd0-8191-45c7-8c50-f16e137a6de8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5586657968-646dp\" (UID: \"3304bfd0-8191-45c7-8c50-f16e137a6de8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-646dp"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.071546 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3304bfd0-8191-45c7-8c50-f16e137a6de8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5586657968-646dp\" (UID: \"3304bfd0-8191-45c7-8c50-f16e137a6de8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-646dp"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.078731 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3fde31d7-89e1-4aa5-a848-2b018eae16b1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5586657968-njcw5\" (UID: \"3fde31d7-89e1-4aa5-a848-2b018eae16b1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-njcw5"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.081834 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3304bfd0-8191-45c7-8c50-f16e137a6de8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5586657968-646dp\" (UID: \"3304bfd0-8191-45c7-8c50-f16e137a6de8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-646dp"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.082189 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3fde31d7-89e1-4aa5-a848-2b018eae16b1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5586657968-njcw5\" (UID: \"3fde31d7-89e1-4aa5-a848-2b018eae16b1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-njcw5"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.086311 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-5vnkd"]
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.087759 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-5vnkd"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.102702 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-pl2fz"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.102820 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.103325 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3304bfd0-8191-45c7-8c50-f16e137a6de8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5586657968-646dp\" (UID: \"3304bfd0-8191-45c7-8c50-f16e137a6de8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-646dp"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.136484 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-5vnkd"]
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.161818 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-z7wl5"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.222771 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-646dp"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.249174 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-njcw5"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.265772 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-r5b4d"]
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.267713 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-r5b4d"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.272139 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-qrrrd"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.275214 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7pnf\" (UniqueName: \"kubernetes.io/projected/64eb3c86-385d-45d5-8dee-df851d8c3a74-kube-api-access-r7pnf\") pod \"observability-operator-59bdc8b94-5vnkd\" (UID: \"64eb3c86-385d-45d5-8dee-df851d8c3a74\") " pod="openshift-operators/observability-operator-59bdc8b94-5vnkd"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.275907 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/64eb3c86-385d-45d5-8dee-df851d8c3a74-observability-operator-tls\") pod \"observability-operator-59bdc8b94-5vnkd\" (UID: \"64eb3c86-385d-45d5-8dee-df851d8c3a74\") " pod="openshift-operators/observability-operator-59bdc8b94-5vnkd"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.304719 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-r5b4d"]
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.385546 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/b49125a7-562a-421b-b5eb-126312e6e85d-openshift-service-ca\") pod \"perses-operator-5bf474d74f-r5b4d\" (UID: \"b49125a7-562a-421b-b5eb-126312e6e85d\") " pod="openshift-operators/perses-operator-5bf474d74f-r5b4d"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.385869 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/64eb3c86-385d-45d5-8dee-df851d8c3a74-observability-operator-tls\") pod \"observability-operator-59bdc8b94-5vnkd\" (UID: \"64eb3c86-385d-45d5-8dee-df851d8c3a74\") " pod="openshift-operators/observability-operator-59bdc8b94-5vnkd"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.385930 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkkcg\" (UniqueName: \"kubernetes.io/projected/b49125a7-562a-421b-b5eb-126312e6e85d-kube-api-access-vkkcg\") pod \"perses-operator-5bf474d74f-r5b4d\" (UID: \"b49125a7-562a-421b-b5eb-126312e6e85d\") " pod="openshift-operators/perses-operator-5bf474d74f-r5b4d"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.385972 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7pnf\" (UniqueName: \"kubernetes.io/projected/64eb3c86-385d-45d5-8dee-df851d8c3a74-kube-api-access-r7pnf\") pod \"observability-operator-59bdc8b94-5vnkd\" (UID: \"64eb3c86-385d-45d5-8dee-df851d8c3a74\") " pod="openshift-operators/observability-operator-59bdc8b94-5vnkd"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.391110 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/64eb3c86-385d-45d5-8dee-df851d8c3a74-observability-operator-tls\") pod \"observability-operator-59bdc8b94-5vnkd\" (UID: \"64eb3c86-385d-45d5-8dee-df851d8c3a74\") " pod="openshift-operators/observability-operator-59bdc8b94-5vnkd"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.415528 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7pnf\" (UniqueName: \"kubernetes.io/projected/64eb3c86-385d-45d5-8dee-df851d8c3a74-kube-api-access-r7pnf\") pod \"observability-operator-59bdc8b94-5vnkd\" (UID: \"64eb3c86-385d-45d5-8dee-df851d8c3a74\") " pod="openshift-operators/observability-operator-59bdc8b94-5vnkd"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.478217 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-5vnkd"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.487986 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkkcg\" (UniqueName: \"kubernetes.io/projected/b49125a7-562a-421b-b5eb-126312e6e85d-kube-api-access-vkkcg\") pod \"perses-operator-5bf474d74f-r5b4d\" (UID: \"b49125a7-562a-421b-b5eb-126312e6e85d\") " pod="openshift-operators/perses-operator-5bf474d74f-r5b4d"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.489010 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/b49125a7-562a-421b-b5eb-126312e6e85d-openshift-service-ca\") pod \"perses-operator-5bf474d74f-r5b4d\" (UID: \"b49125a7-562a-421b-b5eb-126312e6e85d\") " pod="openshift-operators/perses-operator-5bf474d74f-r5b4d"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.489820 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/b49125a7-562a-421b-b5eb-126312e6e85d-openshift-service-ca\") pod \"perses-operator-5bf474d74f-r5b4d\" (UID: \"b49125a7-562a-421b-b5eb-126312e6e85d\") " pod="openshift-operators/perses-operator-5bf474d74f-r5b4d"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.504718 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkkcg\" (UniqueName: \"kubernetes.io/projected/b49125a7-562a-421b-b5eb-126312e6e85d-kube-api-access-vkkcg\") pod \"perses-operator-5bf474d74f-r5b4d\" (UID: \"b49125a7-562a-421b-b5eb-126312e6e85d\") " pod="openshift-operators/perses-operator-5bf474d74f-r5b4d"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.608112 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-r5b4d"
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.861404 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-z7wl5"]
Mar 13 10:03:08 crc kubenswrapper[4841]: W0313 10:03:08.868061 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08be6515_b41c_481b_ba89_b939e4cfa067.slice/crio-fbcf5d21ad5567ba2ea71500452ee5870eeb7177fb784a48a5b08efcc37e3cd1 WatchSource:0}: Error finding container fbcf5d21ad5567ba2ea71500452ee5870eeb7177fb784a48a5b08efcc37e3cd1: Status 404 returned error can't find the container with id fbcf5d21ad5567ba2ea71500452ee5870eeb7177fb784a48a5b08efcc37e3cd1
Mar 13 10:03:08 crc kubenswrapper[4841]: I0313 10:03:08.874981 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-njcw5"]
Mar 13 10:03:09 crc kubenswrapper[4841]: I0313 10:03:09.049317 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-646dp"]
Mar 13 10:03:09 crc kubenswrapper[4841]: I0313 10:03:09.148897 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-5vnkd"]
Mar 13 10:03:09 crc kubenswrapper[4841]: I0313 10:03:09.231367 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-r5b4d"]
Mar 13 10:03:09 crc kubenswrapper[4841]: I0313 10:03:09.763584 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-5vnkd" event={"ID":"64eb3c86-385d-45d5-8dee-df851d8c3a74","Type":"ContainerStarted","Data":"11518436bb0b44c1907b2275aedc27cc1cd94d5fc51a5f5ed3f3432103af1cd6"}
Mar 13 10:03:09 crc kubenswrapper[4841]: I0313 10:03:09.764584 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-z7wl5" event={"ID":"08be6515-b41c-481b-ba89-b939e4cfa067","Type":"ContainerStarted","Data":"fbcf5d21ad5567ba2ea71500452ee5870eeb7177fb784a48a5b08efcc37e3cd1"}
Mar 13 10:03:09 crc kubenswrapper[4841]: I0313 10:03:09.765817 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-r5b4d" event={"ID":"b49125a7-562a-421b-b5eb-126312e6e85d","Type":"ContainerStarted","Data":"49485e55ab157c7093c9de89ecc7d832e1a9aefd90f681f6b86dc49d16a9312e"}
Mar 13 10:03:09 crc kubenswrapper[4841]: I0313 10:03:09.767242 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-646dp" event={"ID":"3304bfd0-8191-45c7-8c50-f16e137a6de8","Type":"ContainerStarted","Data":"c0e149f75c38f30ca54f6ac6f2da55e7d7699527eb41f6f71ae956c948286b99"}
Mar 13 10:03:09 crc kubenswrapper[4841]: I0313 10:03:09.768426 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-njcw5" event={"ID":"3fde31d7-89e1-4aa5-a848-2b018eae16b1","Type":"ContainerStarted","Data":"a28090589f40e7b6259b528b785536f4d53d9619ab3652b7967cf9afd15c8102"}
Mar 13 10:03:11 crc kubenswrapper[4841]: I0313 10:03:11.996284 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b"
Mar 13 10:03:11 crc kubenswrapper[4841]: E0313 10:03:11.997034 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2"
Mar 13 10:03:22 crc kubenswrapper[4841]: I0313 10:03:22.969598 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-5vnkd" event={"ID":"64eb3c86-385d-45d5-8dee-df851d8c3a74","Type":"ContainerStarted","Data":"420b0789142cdc9d03ed59d53f538a18c6239e7887358904d4fe17fbb1894dc2"}
Mar 13 10:03:22 crc kubenswrapper[4841]: I0313 10:03:22.971684 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-5vnkd"
Mar 13 10:03:22 crc kubenswrapper[4841]: I0313 10:03:22.973969 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-z7wl5" event={"ID":"08be6515-b41c-481b-ba89-b939e4cfa067","Type":"ContainerStarted","Data":"5f3d7ea0463036c1c576aea5b8f5bd9218743022ddabf2699bdc71d88980b8ae"}
Mar 13 10:03:22 crc kubenswrapper[4841]: I0313 10:03:22.974033 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-5vnkd"
Mar 13 10:03:22 crc kubenswrapper[4841]: I0313 10:03:22.976834 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-r5b4d" event={"ID":"b49125a7-562a-421b-b5eb-126312e6e85d","Type":"ContainerStarted","Data":"840ff20892b922e755663f08740f82b72c7cb10ede3a22abb532c8db109fee02"}
Mar 13 10:03:22 crc kubenswrapper[4841]: I0313 10:03:22.976891 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-r5b4d"
Mar 13 10:03:22 crc kubenswrapper[4841]: I0313 10:03:22.979872 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-646dp" event={"ID":"3304bfd0-8191-45c7-8c50-f16e137a6de8","Type":"ContainerStarted","Data":"a677af93441a94bbb1b666614c2094658cccbd38a547e4917e86be2843d824c7"}
Mar 13 10:03:22 crc kubenswrapper[4841]: I0313 10:03:22.983665 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-njcw5" event={"ID":"3fde31d7-89e1-4aa5-a848-2b018eae16b1","Type":"ContainerStarted","Data":"0b50af64d2535a353b67b64a7d049352abe5e79c445bc8197af2a30435346b1c"}
Mar 13 10:03:23 crc kubenswrapper[4841]: I0313 10:03:23.010388 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-5vnkd" podStartSLOduration=2.168315984 podStartE2EDuration="15.010368312s" podCreationTimestamp="2026-03-13 10:03:08 +0000 UTC" firstStartedPulling="2026-03-13 10:03:09.150572921 +0000 UTC m=+3071.880473112" lastFinishedPulling="2026-03-13 10:03:21.992625249 +0000 UTC m=+3084.722525440" observedRunningTime="2026-03-13 10:03:22.999132289 +0000 UTC m=+3085.729032480" watchObservedRunningTime="2026-03-13 10:03:23.010368312 +0000 UTC m=+3085.740268503"
Mar 13 10:03:23 crc kubenswrapper[4841]: I0313 10:03:23.039111 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-646dp" podStartSLOduration=3.156822533 podStartE2EDuration="16.039095403s" podCreationTimestamp="2026-03-13 10:03:07 +0000 UTC" firstStartedPulling="2026-03-13 10:03:09.075150225 +0000 UTC m=+3071.805050416" lastFinishedPulling="2026-03-13 10:03:21.957423095 +0000 UTC m=+3084.687323286" observedRunningTime="2026-03-13 10:03:23.029682818 +0000 UTC m=+3085.759583019" watchObservedRunningTime="2026-03-13 10:03:23.039095403 +0000 UTC m=+3085.768995594"
Mar 13 10:03:23 crc kubenswrapper[4841]: I0313 10:03:23.054167 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-z7wl5" podStartSLOduration=2.972532155 podStartE2EDuration="16.054147396s" podCreationTimestamp="2026-03-13 10:03:07 +0000 UTC" firstStartedPulling="2026-03-13 10:03:08.879567631 +0000 UTC m=+3071.609467822" lastFinishedPulling="2026-03-13 10:03:21.961182862 +0000 UTC m=+3084.691083063" observedRunningTime="2026-03-13 10:03:23.046433394 +0000 UTC m=+3085.776333605" watchObservedRunningTime="2026-03-13 10:03:23.054147396 +0000 UTC m=+3085.784047587"
Mar 13 10:03:23 crc kubenswrapper[4841]: I0313 10:03:23.086840 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5586657968-njcw5" podStartSLOduration=2.944904007 podStartE2EDuration="16.0868147s" podCreationTimestamp="2026-03-13 10:03:07 +0000 UTC" firstStartedPulling="2026-03-13 10:03:08.879190489 +0000 UTC m=+3071.609090680" lastFinishedPulling="2026-03-13 10:03:22.021101182 +0000 UTC m=+3084.751001373" observedRunningTime="2026-03-13 10:03:23.076238858 +0000 UTC m=+3085.806139059" watchObservedRunningTime="2026-03-13 10:03:23.0868147 +0000 UTC m=+3085.816714891"
Mar 13 10:03:23 crc kubenswrapper[4841]: I0313 10:03:23.133749 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-r5b4d" podStartSLOduration=2.33998527 podStartE2EDuration="15.133713221s" podCreationTimestamp="2026-03-13 10:03:08 +0000 UTC" firstStartedPulling="2026-03-13 10:03:09.232228133 +0000 UTC m=+3071.962128324" lastFinishedPulling="2026-03-13 10:03:22.025956084 +0000 UTC m=+3084.755856275" observedRunningTime="2026-03-13 10:03:23.105930569 +0000 UTC m=+3085.835830780" watchObservedRunningTime="2026-03-13 10:03:23.133713221 +0000 UTC m=+3085.863613422"
Mar 13 10:03:25 crc kubenswrapper[4841]: I0313 10:03:25.994895 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b"
Mar 13 10:03:25 crc kubenswrapper[4841]: E0313 10:03:25.995317 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2"
Mar 13 10:03:28 crc kubenswrapper[4841]: I0313 10:03:28.611590 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-r5b4d"
Mar 13 10:03:34 crc kubenswrapper[4841]: I0313 10:03:34.822123 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"]
Mar 13 10:03:34 crc kubenswrapper[4841]: I0313 10:03:34.822832 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="c4e8e7c7-61ec-4f17-aa35-66780c71735a" containerName="aodh-api" containerID="cri-o://73835459515e92b6fecc0b2b0b6798f41fedf7faff0780f2db183d4b540ebff6" gracePeriod=30
Mar 13 10:03:34 crc kubenswrapper[4841]: I0313 10:03:34.822912 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="c4e8e7c7-61ec-4f17-aa35-66780c71735a" containerName="aodh-notifier" containerID="cri-o://c5f39dd7085c62ebf12a9ffc76f9361cec35b3449ce3d5620b567c775aaeebe1" gracePeriod=30
Mar 13 10:03:34 crc kubenswrapper[4841]: I0313 10:03:34.822957 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="c4e8e7c7-61ec-4f17-aa35-66780c71735a" containerName="aodh-listener" containerID="cri-o://d1b17ba81a867d2473cf0859557d2418bfbc4051fa4f1d552488545d4fb5eef7" gracePeriod=30
Mar 13 10:03:34 crc kubenswrapper[4841]: I0313 10:03:34.823011 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="c4e8e7c7-61ec-4f17-aa35-66780c71735a" containerName="aodh-evaluator" containerID="cri-o://01148df9a621eb905ab6c92e2b7827e6590aa358e16603c785656acaaee5c402" gracePeriod=30
Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.352908 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.355787 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0"
Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.359047 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config"
Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.363654 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.364017 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config"
Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.364315 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0"
Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.364554 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated"
Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.365246 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-gcpbr"
Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.394625 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0812602d-3596-4cda-b90a-d2f76f67bf52-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"0812602d-3596-4cda-b90a-d2f76f67bf52\") " pod="openstack/alertmanager-metric-storage-0"
Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.394688 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName:
\"kubernetes.io/empty-dir/0812602d-3596-4cda-b90a-d2f76f67bf52-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"0812602d-3596-4cda-b90a-d2f76f67bf52\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.394722 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/0812602d-3596-4cda-b90a-d2f76f67bf52-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"0812602d-3596-4cda-b90a-d2f76f67bf52\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.394768 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0812602d-3596-4cda-b90a-d2f76f67bf52-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"0812602d-3596-4cda-b90a-d2f76f67bf52\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.394804 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0812602d-3596-4cda-b90a-d2f76f67bf52-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"0812602d-3596-4cda-b90a-d2f76f67bf52\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.394943 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx5jj\" (UniqueName: \"kubernetes.io/projected/0812602d-3596-4cda-b90a-d2f76f67bf52-kube-api-access-xx5jj\") pod \"alertmanager-metric-storage-0\" (UID: \"0812602d-3596-4cda-b90a-d2f76f67bf52\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.394979 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0812602d-3596-4cda-b90a-d2f76f67bf52-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"0812602d-3596-4cda-b90a-d2f76f67bf52\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.501154 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0812602d-3596-4cda-b90a-d2f76f67bf52-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"0812602d-3596-4cda-b90a-d2f76f67bf52\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.501223 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/0812602d-3596-4cda-b90a-d2f76f67bf52-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"0812602d-3596-4cda-b90a-d2f76f67bf52\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.501312 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0812602d-3596-4cda-b90a-d2f76f67bf52-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"0812602d-3596-4cda-b90a-d2f76f67bf52\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.501355 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0812602d-3596-4cda-b90a-d2f76f67bf52-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"0812602d-3596-4cda-b90a-d2f76f67bf52\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.501875 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/0812602d-3596-4cda-b90a-d2f76f67bf52-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"0812602d-3596-4cda-b90a-d2f76f67bf52\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.502739 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx5jj\" (UniqueName: \"kubernetes.io/projected/0812602d-3596-4cda-b90a-d2f76f67bf52-kube-api-access-xx5jj\") pod \"alertmanager-metric-storage-0\" (UID: \"0812602d-3596-4cda-b90a-d2f76f67bf52\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.502805 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0812602d-3596-4cda-b90a-d2f76f67bf52-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"0812602d-3596-4cda-b90a-d2f76f67bf52\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.502944 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0812602d-3596-4cda-b90a-d2f76f67bf52-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"0812602d-3596-4cda-b90a-d2f76f67bf52\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.509785 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0812602d-3596-4cda-b90a-d2f76f67bf52-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"0812602d-3596-4cda-b90a-d2f76f67bf52\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.510083 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/secret/0812602d-3596-4cda-b90a-d2f76f67bf52-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"0812602d-3596-4cda-b90a-d2f76f67bf52\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.512311 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0812602d-3596-4cda-b90a-d2f76f67bf52-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"0812602d-3596-4cda-b90a-d2f76f67bf52\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.514252 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0812602d-3596-4cda-b90a-d2f76f67bf52-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"0812602d-3596-4cda-b90a-d2f76f67bf52\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.515962 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0812602d-3596-4cda-b90a-d2f76f67bf52-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"0812602d-3596-4cda-b90a-d2f76f67bf52\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.525123 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx5jj\" (UniqueName: \"kubernetes.io/projected/0812602d-3596-4cda-b90a-d2f76f67bf52-kube-api-access-xx5jj\") pod \"alertmanager-metric-storage-0\" (UID: \"0812602d-3596-4cda-b90a-d2f76f67bf52\") " pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.718890 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.966692 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.972714 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.975164 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.976407 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.976623 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.976899 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.977041 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.977166 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.977404 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-wmghm" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.977535 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 13 10:03:35 crc kubenswrapper[4841]: I0313 10:03:35.991777 4841 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.116212 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/00ae5b61-6360-4fa5-ba8d-83852827f1b7-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.116281 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwjf5\" (UniqueName: \"kubernetes.io/projected/00ae5b61-6360-4fa5-ba8d-83852827f1b7-kube-api-access-kwjf5\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.116383 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/00ae5b61-6360-4fa5-ba8d-83852827f1b7-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.117300 4841 generic.go:334] "Generic (PLEG): container finished" podID="c4e8e7c7-61ec-4f17-aa35-66780c71735a" containerID="d1b17ba81a867d2473cf0859557d2418bfbc4051fa4f1d552488545d4fb5eef7" exitCode=0 Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.117340 4841 generic.go:334] "Generic (PLEG): container finished" podID="c4e8e7c7-61ec-4f17-aa35-66780c71735a" containerID="01148df9a621eb905ab6c92e2b7827e6590aa358e16603c785656acaaee5c402" exitCode=0 Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.117352 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="c4e8e7c7-61ec-4f17-aa35-66780c71735a" containerID="73835459515e92b6fecc0b2b0b6798f41fedf7faff0780f2db183d4b540ebff6" exitCode=0 Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.117380 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c4e8e7c7-61ec-4f17-aa35-66780c71735a","Type":"ContainerDied","Data":"d1b17ba81a867d2473cf0859557d2418bfbc4051fa4f1d552488545d4fb5eef7"} Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.117412 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c4e8e7c7-61ec-4f17-aa35-66780c71735a","Type":"ContainerDied","Data":"01148df9a621eb905ab6c92e2b7827e6590aa358e16603c785656acaaee5c402"} Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.117425 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c4e8e7c7-61ec-4f17-aa35-66780c71735a","Type":"ContainerDied","Data":"73835459515e92b6fecc0b2b0b6798f41fedf7faff0780f2db183d4b540ebff6"} Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.118412 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/00ae5b61-6360-4fa5-ba8d-83852827f1b7-config\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.118521 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/00ae5b61-6360-4fa5-ba8d-83852827f1b7-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.118559 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.118606 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/00ae5b61-6360-4fa5-ba8d-83852827f1b7-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.118631 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/00ae5b61-6360-4fa5-ba8d-83852827f1b7-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.118925 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/00ae5b61-6360-4fa5-ba8d-83852827f1b7-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.119012 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/00ae5b61-6360-4fa5-ba8d-83852827f1b7-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.221378 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/00ae5b61-6360-4fa5-ba8d-83852827f1b7-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.221468 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/00ae5b61-6360-4fa5-ba8d-83852827f1b7-config\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.221500 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/00ae5b61-6360-4fa5-ba8d-83852827f1b7-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.221541 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.221561 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/00ae5b61-6360-4fa5-ba8d-83852827f1b7-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc 
kubenswrapper[4841]: I0313 10:03:36.221579 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/00ae5b61-6360-4fa5-ba8d-83852827f1b7-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.221694 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/00ae5b61-6360-4fa5-ba8d-83852827f1b7-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.221728 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/00ae5b61-6360-4fa5-ba8d-83852827f1b7-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.221804 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/00ae5b61-6360-4fa5-ba8d-83852827f1b7-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.221838 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwjf5\" (UniqueName: \"kubernetes.io/projected/00ae5b61-6360-4fa5-ba8d-83852827f1b7-kube-api-access-kwjf5\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 
10:03:36.223097 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/00ae5b61-6360-4fa5-ba8d-83852827f1b7-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.224726 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.225059 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/00ae5b61-6360-4fa5-ba8d-83852827f1b7-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.225682 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/00ae5b61-6360-4fa5-ba8d-83852827f1b7-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.231627 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/00ae5b61-6360-4fa5-ba8d-83852827f1b7-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " 
pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.232000 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/00ae5b61-6360-4fa5-ba8d-83852827f1b7-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.232090 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/00ae5b61-6360-4fa5-ba8d-83852827f1b7-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.232164 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/00ae5b61-6360-4fa5-ba8d-83852827f1b7-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.235866 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/00ae5b61-6360-4fa5-ba8d-83852827f1b7-config\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.245756 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwjf5\" (UniqueName: \"kubernetes.io/projected/00ae5b61-6360-4fa5-ba8d-83852827f1b7-kube-api-access-kwjf5\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 
10:03:36.275287 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"prometheus-metric-storage-0\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.305778 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.315598 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 13 10:03:36 crc kubenswrapper[4841]: W0313 10:03:36.324896 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0812602d_3596_4cda_b90a_d2f76f67bf52.slice/crio-92c31fd540e6aaec5685422f3bef2340cfd62f5b0b5bb94589922b254d3df183 WatchSource:0}: Error finding container 92c31fd540e6aaec5685422f3bef2340cfd62f5b0b5bb94589922b254d3df183: Status 404 returned error can't find the container with id 92c31fd540e6aaec5685422f3bef2340cfd62f5b0b5bb94589922b254d3df183 Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.868783 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 10:03:36 crc kubenswrapper[4841]: I0313 10:03:36.994900 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" Mar 13 10:03:36 crc kubenswrapper[4841]: E0313 10:03:36.995149 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" 
podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:03:37 crc kubenswrapper[4841]: I0313 10:03:37.126702 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"00ae5b61-6360-4fa5-ba8d-83852827f1b7","Type":"ContainerStarted","Data":"0831f15052730c2307c831aaabe672db1384e1c6b18b2667af1e3adc2c70d415"} Mar 13 10:03:37 crc kubenswrapper[4841]: I0313 10:03:37.128117 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"0812602d-3596-4cda-b90a-d2f76f67bf52","Type":"ContainerStarted","Data":"92c31fd540e6aaec5685422f3bef2340cfd62f5b0b5bb94589922b254d3df183"} Mar 13 10:03:43 crc kubenswrapper[4841]: I0313 10:03:43.222036 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"0812602d-3596-4cda-b90a-d2f76f67bf52","Type":"ContainerStarted","Data":"654c5196cfd0aafdd41c94e70b0cfc86fff83c58a1cf1a1d403a490ccd77efaa"} Mar 13 10:03:43 crc kubenswrapper[4841]: I0313 10:03:43.224133 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"00ae5b61-6360-4fa5-ba8d-83852827f1b7","Type":"ContainerStarted","Data":"59f3aedb13af307f02c75baa12f43d9f719e5114021d11db3c5862c2d556d5d9"} Mar 13 10:03:48 crc kubenswrapper[4841]: I0313 10:03:48.006623 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" Mar 13 10:03:48 crc kubenswrapper[4841]: E0313 10:03:48.007255 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:03:50 crc 
kubenswrapper[4841]: I0313 10:03:50.321014 4841 generic.go:334] "Generic (PLEG): container finished" podID="0812602d-3596-4cda-b90a-d2f76f67bf52" containerID="654c5196cfd0aafdd41c94e70b0cfc86fff83c58a1cf1a1d403a490ccd77efaa" exitCode=0 Mar 13 10:03:50 crc kubenswrapper[4841]: I0313 10:03:50.321106 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"0812602d-3596-4cda-b90a-d2f76f67bf52","Type":"ContainerDied","Data":"654c5196cfd0aafdd41c94e70b0cfc86fff83c58a1cf1a1d403a490ccd77efaa"} Mar 13 10:03:50 crc kubenswrapper[4841]: I0313 10:03:50.328573 4841 generic.go:334] "Generic (PLEG): container finished" podID="00ae5b61-6360-4fa5-ba8d-83852827f1b7" containerID="59f3aedb13af307f02c75baa12f43d9f719e5114021d11db3c5862c2d556d5d9" exitCode=0 Mar 13 10:03:50 crc kubenswrapper[4841]: I0313 10:03:50.328705 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"00ae5b61-6360-4fa5-ba8d-83852827f1b7","Type":"ContainerDied","Data":"59f3aedb13af307f02c75baa12f43d9f719e5114021d11db3c5862c2d556d5d9"} Mar 13 10:03:53 crc kubenswrapper[4841]: I0313 10:03:53.359991 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"0812602d-3596-4cda-b90a-d2f76f67bf52","Type":"ContainerStarted","Data":"6a451db8de7e42205bd8987476cced368deca308dab23f4e11758214b9713225"} Mar 13 10:03:56 crc kubenswrapper[4841]: I0313 10:03:56.390673 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"00ae5b61-6360-4fa5-ba8d-83852827f1b7","Type":"ContainerStarted","Data":"5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934"} Mar 13 10:03:57 crc kubenswrapper[4841]: I0313 10:03:57.401353 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" 
event={"ID":"0812602d-3596-4cda-b90a-d2f76f67bf52","Type":"ContainerStarted","Data":"01fe1d96f6a04bbb3edf71eaa8edac8399344af861bb9c0be7226f64e5e2a104"} Mar 13 10:03:57 crc kubenswrapper[4841]: I0313 10:03:57.401580 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:57 crc kubenswrapper[4841]: I0313 10:03:57.404365 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Mar 13 10:03:57 crc kubenswrapper[4841]: I0313 10:03:57.436549 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=6.208902204 podStartE2EDuration="22.436528167s" podCreationTimestamp="2026-03-13 10:03:35 +0000 UTC" firstStartedPulling="2026-03-13 10:03:36.338679552 +0000 UTC m=+3099.068579763" lastFinishedPulling="2026-03-13 10:03:52.566305535 +0000 UTC m=+3115.296205726" observedRunningTime="2026-03-13 10:03:57.429783975 +0000 UTC m=+3120.159684176" watchObservedRunningTime="2026-03-13 10:03:57.436528167 +0000 UTC m=+3120.166428358" Mar 13 10:03:58 crc kubenswrapper[4841]: I0313 10:03:58.995125 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" Mar 13 10:03:58 crc kubenswrapper[4841]: E0313 10:03:58.995689 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:04:00 crc kubenswrapper[4841]: I0313 10:04:00.170676 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556604-8sknz"] Mar 13 10:04:00 
crc kubenswrapper[4841]: I0313 10:04:00.173356 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556604-8sknz" Mar 13 10:04:00 crc kubenswrapper[4841]: I0313 10:04:00.175916 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 10:04:00 crc kubenswrapper[4841]: I0313 10:04:00.175980 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 10:04:00 crc kubenswrapper[4841]: I0313 10:04:00.176696 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 10:04:00 crc kubenswrapper[4841]: I0313 10:04:00.186353 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556604-8sknz"] Mar 13 10:04:00 crc kubenswrapper[4841]: I0313 10:04:00.292197 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjz69\" (UniqueName: \"kubernetes.io/projected/f84e5588-63df-41c9-98c7-00d3fc4db098-kube-api-access-xjz69\") pod \"auto-csr-approver-29556604-8sknz\" (UID: \"f84e5588-63df-41c9-98c7-00d3fc4db098\") " pod="openshift-infra/auto-csr-approver-29556604-8sknz" Mar 13 10:04:00 crc kubenswrapper[4841]: I0313 10:04:00.395253 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjz69\" (UniqueName: \"kubernetes.io/projected/f84e5588-63df-41c9-98c7-00d3fc4db098-kube-api-access-xjz69\") pod \"auto-csr-approver-29556604-8sknz\" (UID: \"f84e5588-63df-41c9-98c7-00d3fc4db098\") " pod="openshift-infra/auto-csr-approver-29556604-8sknz" Mar 13 10:04:00 crc kubenswrapper[4841]: I0313 10:04:00.419968 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjz69\" (UniqueName: \"kubernetes.io/projected/f84e5588-63df-41c9-98c7-00d3fc4db098-kube-api-access-xjz69\") 
pod \"auto-csr-approver-29556604-8sknz\" (UID: \"f84e5588-63df-41c9-98c7-00d3fc4db098\") " pod="openshift-infra/auto-csr-approver-29556604-8sknz" Mar 13 10:04:00 crc kubenswrapper[4841]: I0313 10:04:00.439861 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"00ae5b61-6360-4fa5-ba8d-83852827f1b7","Type":"ContainerStarted","Data":"e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6"} Mar 13 10:04:00 crc kubenswrapper[4841]: I0313 10:04:00.497537 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556604-8sknz" Mar 13 10:04:00 crc kubenswrapper[4841]: I0313 10:04:00.958174 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556604-8sknz"] Mar 13 10:04:00 crc kubenswrapper[4841]: W0313 10:04:00.962490 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf84e5588_63df_41c9_98c7_00d3fc4db098.slice/crio-d1a43b299bb8388b551a0a7ab8edc1e0002b6aa18d866f8e08d2bd37895f598c WatchSource:0}: Error finding container d1a43b299bb8388b551a0a7ab8edc1e0002b6aa18d866f8e08d2bd37895f598c: Status 404 returned error can't find the container with id d1a43b299bb8388b551a0a7ab8edc1e0002b6aa18d866f8e08d2bd37895f598c Mar 13 10:04:01 crc kubenswrapper[4841]: I0313 10:04:01.450216 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556604-8sknz" event={"ID":"f84e5588-63df-41c9-98c7-00d3fc4db098","Type":"ContainerStarted","Data":"d1a43b299bb8388b551a0a7ab8edc1e0002b6aa18d866f8e08d2bd37895f598c"} Mar 13 10:04:03 crc kubenswrapper[4841]: I0313 10:04:03.471466 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"00ae5b61-6360-4fa5-ba8d-83852827f1b7","Type":"ContainerStarted","Data":"051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe"} Mar 13 10:04:03 crc kubenswrapper[4841]: I0313 10:04:03.474627 4841 generic.go:334] "Generic (PLEG): container finished" podID="f84e5588-63df-41c9-98c7-00d3fc4db098" containerID="83eca72557fd86e5bdc54519f8860e4988c13a432b4be4c841422700a869c39e" exitCode=0 Mar 13 10:04:03 crc kubenswrapper[4841]: I0313 10:04:03.474678 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556604-8sknz" event={"ID":"f84e5588-63df-41c9-98c7-00d3fc4db098","Type":"ContainerDied","Data":"83eca72557fd86e5bdc54519f8860e4988c13a432b4be4c841422700a869c39e"} Mar 13 10:04:03 crc kubenswrapper[4841]: I0313 10:04:03.523035 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=4.052283326 podStartE2EDuration="29.522992013s" podCreationTimestamp="2026-03-13 10:03:34 +0000 UTC" firstStartedPulling="2026-03-13 10:03:36.875032776 +0000 UTC m=+3099.604932967" lastFinishedPulling="2026-03-13 10:04:02.345741463 +0000 UTC m=+3125.075641654" observedRunningTime="2026-03-13 10:04:03.508572871 +0000 UTC m=+3126.238473062" watchObservedRunningTime="2026-03-13 10:04:03.522992013 +0000 UTC m=+3126.252892204" Mar 13 10:04:04 crc kubenswrapper[4841]: I0313 10:04:04.977389 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556604-8sknz" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.085386 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjz69\" (UniqueName: \"kubernetes.io/projected/f84e5588-63df-41c9-98c7-00d3fc4db098-kube-api-access-xjz69\") pod \"f84e5588-63df-41c9-98c7-00d3fc4db098\" (UID: \"f84e5588-63df-41c9-98c7-00d3fc4db098\") " Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.093395 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f84e5588-63df-41c9-98c7-00d3fc4db098-kube-api-access-xjz69" (OuterVolumeSpecName: "kube-api-access-xjz69") pod "f84e5588-63df-41c9-98c7-00d3fc4db098" (UID: "f84e5588-63df-41c9-98c7-00d3fc4db098"). InnerVolumeSpecName "kube-api-access-xjz69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.191885 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjz69\" (UniqueName: \"kubernetes.io/projected/f84e5588-63df-41c9-98c7-00d3fc4db098-kube-api-access-xjz69\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.222409 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.398029 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqzxd\" (UniqueName: \"kubernetes.io/projected/c4e8e7c7-61ec-4f17-aa35-66780c71735a-kube-api-access-vqzxd\") pod \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.398216 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-combined-ca-bundle\") pod \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.398264 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-internal-tls-certs\") pod \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.398407 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-public-tls-certs\") pod \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.398469 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-config-data\") pod \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.398507 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-scripts\") pod \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\" (UID: \"c4e8e7c7-61ec-4f17-aa35-66780c71735a\") " Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.404422 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4e8e7c7-61ec-4f17-aa35-66780c71735a-kube-api-access-vqzxd" (OuterVolumeSpecName: "kube-api-access-vqzxd") pod "c4e8e7c7-61ec-4f17-aa35-66780c71735a" (UID: "c4e8e7c7-61ec-4f17-aa35-66780c71735a"). InnerVolumeSpecName "kube-api-access-vqzxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.406400 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-scripts" (OuterVolumeSpecName: "scripts") pod "c4e8e7c7-61ec-4f17-aa35-66780c71735a" (UID: "c4e8e7c7-61ec-4f17-aa35-66780c71735a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.457363 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c4e8e7c7-61ec-4f17-aa35-66780c71735a" (UID: "c4e8e7c7-61ec-4f17-aa35-66780c71735a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.460052 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c4e8e7c7-61ec-4f17-aa35-66780c71735a" (UID: "c4e8e7c7-61ec-4f17-aa35-66780c71735a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.495551 4841 generic.go:334] "Generic (PLEG): container finished" podID="c4e8e7c7-61ec-4f17-aa35-66780c71735a" containerID="c5f39dd7085c62ebf12a9ffc76f9361cec35b3449ce3d5620b567c775aaeebe1" exitCode=137 Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.495659 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c4e8e7c7-61ec-4f17-aa35-66780c71735a","Type":"ContainerDied","Data":"c5f39dd7085c62ebf12a9ffc76f9361cec35b3449ce3d5620b567c775aaeebe1"} Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.495706 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c4e8e7c7-61ec-4f17-aa35-66780c71735a","Type":"ContainerDied","Data":"6cf0270f2ff8e6ced4f65ed944ebcda21ed221dcc4a4c284f4cd1ec27901f13c"} Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.495722 4841 scope.go:117] "RemoveContainer" containerID="d1b17ba81a867d2473cf0859557d2418bfbc4051fa4f1d552488545d4fb5eef7" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.495671 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.501572 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556604-8sknz" event={"ID":"f84e5588-63df-41c9-98c7-00d3fc4db098","Type":"ContainerDied","Data":"d1a43b299bb8388b551a0a7ab8edc1e0002b6aa18d866f8e08d2bd37895f598c"} Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.501600 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1a43b299bb8388b551a0a7ab8edc1e0002b6aa18d866f8e08d2bd37895f598c" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.501652 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556604-8sknz" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.502713 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.502751 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.502762 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.502773 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqzxd\" (UniqueName: \"kubernetes.io/projected/c4e8e7c7-61ec-4f17-aa35-66780c71735a-kube-api-access-vqzxd\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.506437 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-config-data" (OuterVolumeSpecName: "config-data") pod "c4e8e7c7-61ec-4f17-aa35-66780c71735a" (UID: "c4e8e7c7-61ec-4f17-aa35-66780c71735a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.510561 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4e8e7c7-61ec-4f17-aa35-66780c71735a" (UID: "c4e8e7c7-61ec-4f17-aa35-66780c71735a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.521123 4841 scope.go:117] "RemoveContainer" containerID="c5f39dd7085c62ebf12a9ffc76f9361cec35b3449ce3d5620b567c775aaeebe1" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.546660 4841 scope.go:117] "RemoveContainer" containerID="01148df9a621eb905ab6c92e2b7827e6590aa358e16603c785656acaaee5c402" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.577926 4841 scope.go:117] "RemoveContainer" containerID="73835459515e92b6fecc0b2b0b6798f41fedf7faff0780f2db183d4b540ebff6" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.604546 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.604581 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e8e7c7-61ec-4f17-aa35-66780c71735a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.605077 4841 scope.go:117] "RemoveContainer" containerID="d1b17ba81a867d2473cf0859557d2418bfbc4051fa4f1d552488545d4fb5eef7" Mar 13 10:04:05 crc kubenswrapper[4841]: E0313 10:04:05.605492 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1b17ba81a867d2473cf0859557d2418bfbc4051fa4f1d552488545d4fb5eef7\": container with ID starting with d1b17ba81a867d2473cf0859557d2418bfbc4051fa4f1d552488545d4fb5eef7 not found: ID does not exist" containerID="d1b17ba81a867d2473cf0859557d2418bfbc4051fa4f1d552488545d4fb5eef7" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.605525 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1b17ba81a867d2473cf0859557d2418bfbc4051fa4f1d552488545d4fb5eef7"} 
err="failed to get container status \"d1b17ba81a867d2473cf0859557d2418bfbc4051fa4f1d552488545d4fb5eef7\": rpc error: code = NotFound desc = could not find container \"d1b17ba81a867d2473cf0859557d2418bfbc4051fa4f1d552488545d4fb5eef7\": container with ID starting with d1b17ba81a867d2473cf0859557d2418bfbc4051fa4f1d552488545d4fb5eef7 not found: ID does not exist" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.605548 4841 scope.go:117] "RemoveContainer" containerID="c5f39dd7085c62ebf12a9ffc76f9361cec35b3449ce3d5620b567c775aaeebe1" Mar 13 10:04:05 crc kubenswrapper[4841]: E0313 10:04:05.605910 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5f39dd7085c62ebf12a9ffc76f9361cec35b3449ce3d5620b567c775aaeebe1\": container with ID starting with c5f39dd7085c62ebf12a9ffc76f9361cec35b3449ce3d5620b567c775aaeebe1 not found: ID does not exist" containerID="c5f39dd7085c62ebf12a9ffc76f9361cec35b3449ce3d5620b567c775aaeebe1" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.605931 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5f39dd7085c62ebf12a9ffc76f9361cec35b3449ce3d5620b567c775aaeebe1"} err="failed to get container status \"c5f39dd7085c62ebf12a9ffc76f9361cec35b3449ce3d5620b567c775aaeebe1\": rpc error: code = NotFound desc = could not find container \"c5f39dd7085c62ebf12a9ffc76f9361cec35b3449ce3d5620b567c775aaeebe1\": container with ID starting with c5f39dd7085c62ebf12a9ffc76f9361cec35b3449ce3d5620b567c775aaeebe1 not found: ID does not exist" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.605961 4841 scope.go:117] "RemoveContainer" containerID="01148df9a621eb905ab6c92e2b7827e6590aa358e16603c785656acaaee5c402" Mar 13 10:04:05 crc kubenswrapper[4841]: E0313 10:04:05.606154 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"01148df9a621eb905ab6c92e2b7827e6590aa358e16603c785656acaaee5c402\": container with ID starting with 01148df9a621eb905ab6c92e2b7827e6590aa358e16603c785656acaaee5c402 not found: ID does not exist" containerID="01148df9a621eb905ab6c92e2b7827e6590aa358e16603c785656acaaee5c402" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.606188 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01148df9a621eb905ab6c92e2b7827e6590aa358e16603c785656acaaee5c402"} err="failed to get container status \"01148df9a621eb905ab6c92e2b7827e6590aa358e16603c785656acaaee5c402\": rpc error: code = NotFound desc = could not find container \"01148df9a621eb905ab6c92e2b7827e6590aa358e16603c785656acaaee5c402\": container with ID starting with 01148df9a621eb905ab6c92e2b7827e6590aa358e16603c785656acaaee5c402 not found: ID does not exist" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.606200 4841 scope.go:117] "RemoveContainer" containerID="73835459515e92b6fecc0b2b0b6798f41fedf7faff0780f2db183d4b540ebff6" Mar 13 10:04:05 crc kubenswrapper[4841]: E0313 10:04:05.606399 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73835459515e92b6fecc0b2b0b6798f41fedf7faff0780f2db183d4b540ebff6\": container with ID starting with 73835459515e92b6fecc0b2b0b6798f41fedf7faff0780f2db183d4b540ebff6 not found: ID does not exist" containerID="73835459515e92b6fecc0b2b0b6798f41fedf7faff0780f2db183d4b540ebff6" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.606441 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73835459515e92b6fecc0b2b0b6798f41fedf7faff0780f2db183d4b540ebff6"} err="failed to get container status \"73835459515e92b6fecc0b2b0b6798f41fedf7faff0780f2db183d4b540ebff6\": rpc error: code = NotFound desc = could not find container \"73835459515e92b6fecc0b2b0b6798f41fedf7faff0780f2db183d4b540ebff6\": container with ID 
starting with 73835459515e92b6fecc0b2b0b6798f41fedf7faff0780f2db183d4b540ebff6 not found: ID does not exist" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.863716 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.875423 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.885278 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 13 10:04:05 crc kubenswrapper[4841]: E0313 10:04:05.885697 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4e8e7c7-61ec-4f17-aa35-66780c71735a" containerName="aodh-listener" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.885719 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4e8e7c7-61ec-4f17-aa35-66780c71735a" containerName="aodh-listener" Mar 13 10:04:05 crc kubenswrapper[4841]: E0313 10:04:05.885732 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4e8e7c7-61ec-4f17-aa35-66780c71735a" containerName="aodh-api" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.885739 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4e8e7c7-61ec-4f17-aa35-66780c71735a" containerName="aodh-api" Mar 13 10:04:05 crc kubenswrapper[4841]: E0313 10:04:05.885756 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f84e5588-63df-41c9-98c7-00d3fc4db098" containerName="oc" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.885761 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84e5588-63df-41c9-98c7-00d3fc4db098" containerName="oc" Mar 13 10:04:05 crc kubenswrapper[4841]: E0313 10:04:05.885775 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4e8e7c7-61ec-4f17-aa35-66780c71735a" containerName="aodh-notifier" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.885781 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c4e8e7c7-61ec-4f17-aa35-66780c71735a" containerName="aodh-notifier" Mar 13 10:04:05 crc kubenswrapper[4841]: E0313 10:04:05.885797 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4e8e7c7-61ec-4f17-aa35-66780c71735a" containerName="aodh-evaluator" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.885803 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4e8e7c7-61ec-4f17-aa35-66780c71735a" containerName="aodh-evaluator" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.885979 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4e8e7c7-61ec-4f17-aa35-66780c71735a" containerName="aodh-listener" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.885994 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f84e5588-63df-41c9-98c7-00d3fc4db098" containerName="oc" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.886013 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4e8e7c7-61ec-4f17-aa35-66780c71735a" containerName="aodh-evaluator" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.886027 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4e8e7c7-61ec-4f17-aa35-66780c71735a" containerName="aodh-notifier" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.886041 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4e8e7c7-61ec-4f17-aa35-66780c71735a" containerName="aodh-api" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.888011 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.890487 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.890626 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.890651 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.890666 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.892485 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-7tx6j" Mar 13 10:04:05 crc kubenswrapper[4841]: I0313 10:04:05.909691 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.013440 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-config-data\") pod \"aodh-0\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " pod="openstack/aodh-0" Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.013518 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-internal-tls-certs\") pod \"aodh-0\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " pod="openstack/aodh-0" Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.013607 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-scripts\") pod \"aodh-0\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " pod="openstack/aodh-0" Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.013634 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-public-tls-certs\") pod \"aodh-0\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " pod="openstack/aodh-0" Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.013667 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " pod="openstack/aodh-0" Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.013690 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mbc7\" (UniqueName: \"kubernetes.io/projected/d98b9428-afa5-438b-b9c2-608c7d4dcca4-kube-api-access-2mbc7\") pod \"aodh-0\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " pod="openstack/aodh-0" Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.018033 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4e8e7c7-61ec-4f17-aa35-66780c71735a" path="/var/lib/kubelet/pods/c4e8e7c7-61ec-4f17-aa35-66780c71735a/volumes" Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.046295 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556598-r26f7"] Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.053484 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556598-r26f7"] Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.115056 4841 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-scripts\") pod \"aodh-0\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " pod="openstack/aodh-0" Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.115157 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-public-tls-certs\") pod \"aodh-0\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " pod="openstack/aodh-0" Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.115254 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " pod="openstack/aodh-0" Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.115342 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mbc7\" (UniqueName: \"kubernetes.io/projected/d98b9428-afa5-438b-b9c2-608c7d4dcca4-kube-api-access-2mbc7\") pod \"aodh-0\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " pod="openstack/aodh-0" Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.115460 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-config-data\") pod \"aodh-0\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " pod="openstack/aodh-0" Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.115550 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-internal-tls-certs\") pod \"aodh-0\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " pod="openstack/aodh-0" Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 
10:04:06.121948 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-internal-tls-certs\") pod \"aodh-0\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " pod="openstack/aodh-0" Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.121954 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-public-tls-certs\") pod \"aodh-0\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " pod="openstack/aodh-0" Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.125865 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-scripts\") pod \"aodh-0\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " pod="openstack/aodh-0" Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.134338 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-config-data\") pod \"aodh-0\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " pod="openstack/aodh-0" Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.137253 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mbc7\" (UniqueName: \"kubernetes.io/projected/d98b9428-afa5-438b-b9c2-608c7d4dcca4-kube-api-access-2mbc7\") pod \"aodh-0\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " pod="openstack/aodh-0" Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.141389 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " pod="openstack/aodh-0" Mar 13 10:04:06 crc 
kubenswrapper[4841]: I0313 10:04:06.220840 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.306132 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.306280 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.308980 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:06 crc kubenswrapper[4841]: I0313 10:04:06.514830 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:07 crc kubenswrapper[4841]: I0313 10:04:07.133934 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 13 10:04:07 crc kubenswrapper[4841]: W0313 10:04:07.136226 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd98b9428_afa5_438b_b9c2_608c7d4dcca4.slice/crio-32dc919efb3c00c53844e945d0e9becef0cf56126b507b1db5e64b856270d054 WatchSource:0}: Error finding container 32dc919efb3c00c53844e945d0e9becef0cf56126b507b1db5e64b856270d054: Status 404 returned error can't find the container with id 32dc919efb3c00c53844e945d0e9becef0cf56126b507b1db5e64b856270d054 Mar 13 10:04:07 crc kubenswrapper[4841]: I0313 10:04:07.526073 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d98b9428-afa5-438b-b9c2-608c7d4dcca4","Type":"ContainerStarted","Data":"32dc919efb3c00c53844e945d0e9becef0cf56126b507b1db5e64b856270d054"} Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.046151 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c91b4ea0-e020-4088-ad70-aa8b6d6a7c15" path="/var/lib/kubelet/pods/c91b4ea0-e020-4088-ad70-aa8b6d6a7c15/volumes" Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.228545 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.228820 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="aef80b2d-e277-4d2c-8b3a-a80c19f46e36" containerName="openstackclient" containerID="cri-o://f2a271a45dbbddce4fc52202f72ade12d6c81716de77b7d079e05c30e483fad2" gracePeriod=2 Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.232383 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="aef80b2d-e277-4d2c-8b3a-a80c19f46e36" podUID="2d802466-0b65-4820-865c-8ae969af527f" Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.259508 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.269645 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 13 10:04:08 crc kubenswrapper[4841]: E0313 10:04:08.270171 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aef80b2d-e277-4d2c-8b3a-a80c19f46e36" containerName="openstackclient" Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.270194 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="aef80b2d-e277-4d2c-8b3a-a80c19f46e36" containerName="openstackclient" Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.270499 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="aef80b2d-e277-4d2c-8b3a-a80c19f46e36" containerName="openstackclient" Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.271384 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.280397 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.365585 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2d802466-0b65-4820-865c-8ae969af527f-openstack-config\") pod \"openstackclient\" (UID: \"2d802466-0b65-4820-865c-8ae969af527f\") " pod="openstack/openstackclient" Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.365958 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d802466-0b65-4820-865c-8ae969af527f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2d802466-0b65-4820-865c-8ae969af527f\") " pod="openstack/openstackclient" Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.366087 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbljq\" (UniqueName: \"kubernetes.io/projected/2d802466-0b65-4820-865c-8ae969af527f-kube-api-access-qbljq\") pod \"openstackclient\" (UID: \"2d802466-0b65-4820-865c-8ae969af527f\") " pod="openstack/openstackclient" Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.366184 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2d802466-0b65-4820-865c-8ae969af527f-openstack-config-secret\") pod \"openstackclient\" (UID: \"2d802466-0b65-4820-865c-8ae969af527f\") " pod="openstack/openstackclient" Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.468021 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2d802466-0b65-4820-865c-8ae969af527f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2d802466-0b65-4820-865c-8ae969af527f\") " pod="openstack/openstackclient" Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.468532 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbljq\" (UniqueName: \"kubernetes.io/projected/2d802466-0b65-4820-865c-8ae969af527f-kube-api-access-qbljq\") pod \"openstackclient\" (UID: \"2d802466-0b65-4820-865c-8ae969af527f\") " pod="openstack/openstackclient" Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.468649 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2d802466-0b65-4820-865c-8ae969af527f-openstack-config-secret\") pod \"openstackclient\" (UID: \"2d802466-0b65-4820-865c-8ae969af527f\") " pod="openstack/openstackclient" Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.468807 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2d802466-0b65-4820-865c-8ae969af527f-openstack-config\") pod \"openstackclient\" (UID: \"2d802466-0b65-4820-865c-8ae969af527f\") " pod="openstack/openstackclient" Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.469751 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2d802466-0b65-4820-865c-8ae969af527f-openstack-config\") pod \"openstackclient\" (UID: \"2d802466-0b65-4820-865c-8ae969af527f\") " pod="openstack/openstackclient" Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.480110 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d802466-0b65-4820-865c-8ae969af527f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2d802466-0b65-4820-865c-8ae969af527f\") " 
pod="openstack/openstackclient" Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.485021 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2d802466-0b65-4820-865c-8ae969af527f-openstack-config-secret\") pod \"openstackclient\" (UID: \"2d802466-0b65-4820-865c-8ae969af527f\") " pod="openstack/openstackclient" Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.485403 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbljq\" (UniqueName: \"kubernetes.io/projected/2d802466-0b65-4820-865c-8ae969af527f-kube-api-access-qbljq\") pod \"openstackclient\" (UID: \"2d802466-0b65-4820-865c-8ae969af527f\") " pod="openstack/openstackclient" Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.537830 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d98b9428-afa5-438b-b9c2-608c7d4dcca4","Type":"ContainerStarted","Data":"53c80d8663f36a290529544eb10126c339d7eadbfe2c4ea18315da622a7604e0"} Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.574955 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 13 10:04:08 crc kubenswrapper[4841]: I0313 10:04:08.607867 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 10:04:09 crc kubenswrapper[4841]: I0313 10:04:09.178838 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 10:04:09 crc kubenswrapper[4841]: W0313 10:04:09.182328 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d802466_0b65_4820_865c_8ae969af527f.slice/crio-51fb0bee5b4a77ff6b5e792efb2ade35786cf42c3ffa742418ffbbf37a4e85c6 WatchSource:0}: Error finding container 51fb0bee5b4a77ff6b5e792efb2ade35786cf42c3ffa742418ffbbf37a4e85c6: Status 404 returned error can't find the container with id 51fb0bee5b4a77ff6b5e792efb2ade35786cf42c3ffa742418ffbbf37a4e85c6 Mar 13 10:04:09 crc kubenswrapper[4841]: I0313 10:04:09.426904 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 10:04:09 crc kubenswrapper[4841]: I0313 10:04:09.427179 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="00ae5b61-6360-4fa5-ba8d-83852827f1b7" containerName="prometheus" containerID="cri-o://5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934" gracePeriod=600 Mar 13 10:04:09 crc kubenswrapper[4841]: I0313 10:04:09.427228 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="00ae5b61-6360-4fa5-ba8d-83852827f1b7" containerName="thanos-sidecar" containerID="cri-o://051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe" gracePeriod=600 Mar 13 10:04:09 crc kubenswrapper[4841]: I0313 10:04:09.427297 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="00ae5b61-6360-4fa5-ba8d-83852827f1b7" containerName="config-reloader" containerID="cri-o://e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6" gracePeriod=600 Mar 
13 10:04:09 crc kubenswrapper[4841]: I0313 10:04:09.551794 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d98b9428-afa5-438b-b9c2-608c7d4dcca4","Type":"ContainerStarted","Data":"99994bf9a38291c8448d77e977265295d9c822b8fe6e815ce4db373ba0bc581d"} Mar 13 10:04:09 crc kubenswrapper[4841]: I0313 10:04:09.554706 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2d802466-0b65-4820-865c-8ae969af527f","Type":"ContainerStarted","Data":"53893bc4a3a3b1f03460fd7147f702e60758ec1b507fc4ea03a10ac594200764"} Mar 13 10:04:09 crc kubenswrapper[4841]: I0313 10:04:09.554748 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2d802466-0b65-4820-865c-8ae969af527f","Type":"ContainerStarted","Data":"51fb0bee5b4a77ff6b5e792efb2ade35786cf42c3ffa742418ffbbf37a4e85c6"} Mar 13 10:04:09 crc kubenswrapper[4841]: I0313 10:04:09.578777 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.5787571969999998 podStartE2EDuration="1.578757197s" podCreationTimestamp="2026-03-13 10:04:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:04:09.574750012 +0000 UTC m=+3132.304650213" watchObservedRunningTime="2026-03-13 10:04:09.578757197 +0000 UTC m=+3132.308657388" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.378518 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.507260 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/00ae5b61-6360-4fa5-ba8d-83852827f1b7-web-config\") pod \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.507352 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/00ae5b61-6360-4fa5-ba8d-83852827f1b7-prometheus-metric-storage-rulefiles-2\") pod \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.507413 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/00ae5b61-6360-4fa5-ba8d-83852827f1b7-config\") pod \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.507450 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.507478 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/00ae5b61-6360-4fa5-ba8d-83852827f1b7-prometheus-metric-storage-rulefiles-1\") pod \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.507545 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/00ae5b61-6360-4fa5-ba8d-83852827f1b7-config-out\") pod \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.507589 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/00ae5b61-6360-4fa5-ba8d-83852827f1b7-prometheus-metric-storage-rulefiles-0\") pod \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.507768 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/00ae5b61-6360-4fa5-ba8d-83852827f1b7-thanos-prometheus-http-client-file\") pod \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.507797 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwjf5\" (UniqueName: \"kubernetes.io/projected/00ae5b61-6360-4fa5-ba8d-83852827f1b7-kube-api-access-kwjf5\") pod \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.507854 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/00ae5b61-6360-4fa5-ba8d-83852827f1b7-tls-assets\") pod \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\" (UID: \"00ae5b61-6360-4fa5-ba8d-83852827f1b7\") " Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.508392 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00ae5b61-6360-4fa5-ba8d-83852827f1b7-prometheus-metric-storage-rulefiles-2" 
(OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "00ae5b61-6360-4fa5-ba8d-83852827f1b7" (UID: "00ae5b61-6360-4fa5-ba8d-83852827f1b7"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.508946 4841 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/00ae5b61-6360-4fa5-ba8d-83852827f1b7-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.511540 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00ae5b61-6360-4fa5-ba8d-83852827f1b7-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "00ae5b61-6360-4fa5-ba8d-83852827f1b7" (UID: "00ae5b61-6360-4fa5-ba8d-83852827f1b7"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.513635 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00ae5b61-6360-4fa5-ba8d-83852827f1b7-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "00ae5b61-6360-4fa5-ba8d-83852827f1b7" (UID: "00ae5b61-6360-4fa5-ba8d-83852827f1b7"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.538511 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ae5b61-6360-4fa5-ba8d-83852827f1b7-config" (OuterVolumeSpecName: "config") pod "00ae5b61-6360-4fa5-ba8d-83852827f1b7" (UID: "00ae5b61-6360-4fa5-ba8d-83852827f1b7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.539089 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ae5b61-6360-4fa5-ba8d-83852827f1b7-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "00ae5b61-6360-4fa5-ba8d-83852827f1b7" (UID: "00ae5b61-6360-4fa5-ba8d-83852827f1b7"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.539210 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ae5b61-6360-4fa5-ba8d-83852827f1b7-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "00ae5b61-6360-4fa5-ba8d-83852827f1b7" (UID: "00ae5b61-6360-4fa5-ba8d-83852827f1b7"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.541900 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "00ae5b61-6360-4fa5-ba8d-83852827f1b7" (UID: "00ae5b61-6360-4fa5-ba8d-83852827f1b7"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.542240 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ae5b61-6360-4fa5-ba8d-83852827f1b7-kube-api-access-kwjf5" (OuterVolumeSpecName: "kube-api-access-kwjf5") pod "00ae5b61-6360-4fa5-ba8d-83852827f1b7" (UID: "00ae5b61-6360-4fa5-ba8d-83852827f1b7"). InnerVolumeSpecName "kube-api-access-kwjf5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.545562 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00ae5b61-6360-4fa5-ba8d-83852827f1b7-config-out" (OuterVolumeSpecName: "config-out") pod "00ae5b61-6360-4fa5-ba8d-83852827f1b7" (UID: "00ae5b61-6360-4fa5-ba8d-83852827f1b7"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.553863 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ae5b61-6360-4fa5-ba8d-83852827f1b7-web-config" (OuterVolumeSpecName: "web-config") pod "00ae5b61-6360-4fa5-ba8d-83852827f1b7" (UID: "00ae5b61-6360-4fa5-ba8d-83852827f1b7"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.570746 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d98b9428-afa5-438b-b9c2-608c7d4dcca4","Type":"ContainerStarted","Data":"2cfcbe1246641bfa9491eb4362860b4466ee0b103a642e616208dd8a092e1afe"} Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.573842 4841 generic.go:334] "Generic (PLEG): container finished" podID="00ae5b61-6360-4fa5-ba8d-83852827f1b7" containerID="051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe" exitCode=0 Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.573879 4841 generic.go:334] "Generic (PLEG): container finished" podID="00ae5b61-6360-4fa5-ba8d-83852827f1b7" containerID="e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6" exitCode=0 Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.573889 4841 generic.go:334] "Generic (PLEG): container finished" podID="00ae5b61-6360-4fa5-ba8d-83852827f1b7" containerID="5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934" exitCode=0 Mar 13 10:04:10 crc 
kubenswrapper[4841]: I0313 10:04:10.573934 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"00ae5b61-6360-4fa5-ba8d-83852827f1b7","Type":"ContainerDied","Data":"051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe"} Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.573963 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"00ae5b61-6360-4fa5-ba8d-83852827f1b7","Type":"ContainerDied","Data":"e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6"} Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.573974 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"00ae5b61-6360-4fa5-ba8d-83852827f1b7","Type":"ContainerDied","Data":"5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934"} Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.573986 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"00ae5b61-6360-4fa5-ba8d-83852827f1b7","Type":"ContainerDied","Data":"0831f15052730c2307c831aaabe672db1384e1c6b18b2667af1e3adc2c70d415"} Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.574004 4841 scope.go:117] "RemoveContainer" containerID="051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.574159 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.579743 4841 generic.go:334] "Generic (PLEG): container finished" podID="aef80b2d-e277-4d2c-8b3a-a80c19f46e36" containerID="f2a271a45dbbddce4fc52202f72ade12d6c81716de77b7d079e05c30e483fad2" exitCode=137 Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.579873 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cc2a5787675aa54af58b82ccac10fb8e185121b30c810615b19beb58d1b1b81" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.611407 4841 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/00ae5b61-6360-4fa5-ba8d-83852827f1b7-web-config\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.611445 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/00ae5b61-6360-4fa5-ba8d-83852827f1b7-config\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.611490 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.611510 4841 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/00ae5b61-6360-4fa5-ba8d-83852827f1b7-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.611525 4841 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/00ae5b61-6360-4fa5-ba8d-83852827f1b7-config-out\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.611539 4841 reconciler_common.go:293] "Volume detached for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/00ae5b61-6360-4fa5-ba8d-83852827f1b7-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.611552 4841 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/00ae5b61-6360-4fa5-ba8d-83852827f1b7-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.611566 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwjf5\" (UniqueName: \"kubernetes.io/projected/00ae5b61-6360-4fa5-ba8d-83852827f1b7-kube-api-access-kwjf5\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.611577 4841 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/00ae5b61-6360-4fa5-ba8d-83852827f1b7-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.632639 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.657616 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.711410 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.713735 4841 scope.go:117] "RemoveContainer" containerID="e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.714387 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.742657 4841 scope.go:117] "RemoveContainer" containerID="5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.747227 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.762158 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 10:04:10 crc kubenswrapper[4841]: E0313 10:04:10.762676 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ae5b61-6360-4fa5-ba8d-83852827f1b7" containerName="prometheus" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.762701 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ae5b61-6360-4fa5-ba8d-83852827f1b7" containerName="prometheus" Mar 13 10:04:10 crc kubenswrapper[4841]: E0313 10:04:10.762720 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ae5b61-6360-4fa5-ba8d-83852827f1b7" containerName="init-config-reloader" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.762728 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ae5b61-6360-4fa5-ba8d-83852827f1b7" containerName="init-config-reloader" Mar 13 10:04:10 crc kubenswrapper[4841]: E0313 
10:04:10.762744 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ae5b61-6360-4fa5-ba8d-83852827f1b7" containerName="config-reloader" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.762750 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ae5b61-6360-4fa5-ba8d-83852827f1b7" containerName="config-reloader" Mar 13 10:04:10 crc kubenswrapper[4841]: E0313 10:04:10.762759 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ae5b61-6360-4fa5-ba8d-83852827f1b7" containerName="thanos-sidecar" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.762765 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ae5b61-6360-4fa5-ba8d-83852827f1b7" containerName="thanos-sidecar" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.762972 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ae5b61-6360-4fa5-ba8d-83852827f1b7" containerName="config-reloader" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.762996 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ae5b61-6360-4fa5-ba8d-83852827f1b7" containerName="thanos-sidecar" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.763009 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ae5b61-6360-4fa5-ba8d-83852827f1b7" containerName="prometheus" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.764891 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.768201 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.768463 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-wmghm" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.768593 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.768785 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.768972 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.769238 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.769595 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.769655 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.777093 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.781018 4841 scope.go:117] "RemoveContainer" containerID="59f3aedb13af307f02c75baa12f43d9f719e5114021d11db3c5862c2d556d5d9" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.790754 4841 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.816633 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-openstack-config\") pod \"aef80b2d-e277-4d2c-8b3a-a80c19f46e36\" (UID: \"aef80b2d-e277-4d2c-8b3a-a80c19f46e36\") " Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.816759 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-openstack-config-secret\") pod \"aef80b2d-e277-4d2c-8b3a-a80c19f46e36\" (UID: \"aef80b2d-e277-4d2c-8b3a-a80c19f46e36\") " Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.816915 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-combined-ca-bundle\") pod \"aef80b2d-e277-4d2c-8b3a-a80c19f46e36\" (UID: \"aef80b2d-e277-4d2c-8b3a-a80c19f46e36\") " Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.817022 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s662g\" (UniqueName: \"kubernetes.io/projected/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-kube-api-access-s662g\") pod \"aef80b2d-e277-4d2c-8b3a-a80c19f46e36\" (UID: \"aef80b2d-e277-4d2c-8b3a-a80c19f46e36\") " Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.823297 4841 scope.go:117] "RemoveContainer" containerID="051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe" Mar 13 10:04:10 crc kubenswrapper[4841]: E0313 10:04:10.823749 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe\": container with ID starting 
with 051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe not found: ID does not exist" containerID="051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.823784 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe"} err="failed to get container status \"051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe\": rpc error: code = NotFound desc = could not find container \"051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe\": container with ID starting with 051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe not found: ID does not exist" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.823782 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-kube-api-access-s662g" (OuterVolumeSpecName: "kube-api-access-s662g") pod "aef80b2d-e277-4d2c-8b3a-a80c19f46e36" (UID: "aef80b2d-e277-4d2c-8b3a-a80c19f46e36"). InnerVolumeSpecName "kube-api-access-s662g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.823806 4841 scope.go:117] "RemoveContainer" containerID="e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6" Mar 13 10:04:10 crc kubenswrapper[4841]: E0313 10:04:10.824240 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6\": container with ID starting with e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6 not found: ID does not exist" containerID="e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.824285 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6"} err="failed to get container status \"e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6\": rpc error: code = NotFound desc = could not find container \"e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6\": container with ID starting with e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6 not found: ID does not exist" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.824310 4841 scope.go:117] "RemoveContainer" containerID="5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934" Mar 13 10:04:10 crc kubenswrapper[4841]: E0313 10:04:10.824697 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934\": container with ID starting with 5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934 not found: ID does not exist" containerID="5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.824899 
4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934"} err="failed to get container status \"5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934\": rpc error: code = NotFound desc = could not find container \"5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934\": container with ID starting with 5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934 not found: ID does not exist" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.825094 4841 scope.go:117] "RemoveContainer" containerID="59f3aedb13af307f02c75baa12f43d9f719e5114021d11db3c5862c2d556d5d9" Mar 13 10:04:10 crc kubenswrapper[4841]: E0313 10:04:10.828464 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59f3aedb13af307f02c75baa12f43d9f719e5114021d11db3c5862c2d556d5d9\": container with ID starting with 59f3aedb13af307f02c75baa12f43d9f719e5114021d11db3c5862c2d556d5d9 not found: ID does not exist" containerID="59f3aedb13af307f02c75baa12f43d9f719e5114021d11db3c5862c2d556d5d9" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.828516 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59f3aedb13af307f02c75baa12f43d9f719e5114021d11db3c5862c2d556d5d9"} err="failed to get container status \"59f3aedb13af307f02c75baa12f43d9f719e5114021d11db3c5862c2d556d5d9\": rpc error: code = NotFound desc = could not find container \"59f3aedb13af307f02c75baa12f43d9f719e5114021d11db3c5862c2d556d5d9\": container with ID starting with 59f3aedb13af307f02c75baa12f43d9f719e5114021d11db3c5862c2d556d5d9 not found: ID does not exist" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.828750 4841 scope.go:117] "RemoveContainer" containerID="051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 
10:04:10.830863 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe"} err="failed to get container status \"051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe\": rpc error: code = NotFound desc = could not find container \"051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe\": container with ID starting with 051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe not found: ID does not exist" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.831085 4841 scope.go:117] "RemoveContainer" containerID="e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.833044 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6"} err="failed to get container status \"e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6\": rpc error: code = NotFound desc = could not find container \"e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6\": container with ID starting with e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6 not found: ID does not exist" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.833079 4841 scope.go:117] "RemoveContainer" containerID="5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.834180 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934"} err="failed to get container status \"5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934\": rpc error: code = NotFound desc = could not find container \"5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934\": container with ID starting with 
5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934 not found: ID does not exist" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.834205 4841 scope.go:117] "RemoveContainer" containerID="59f3aedb13af307f02c75baa12f43d9f719e5114021d11db3c5862c2d556d5d9" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.835646 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59f3aedb13af307f02c75baa12f43d9f719e5114021d11db3c5862c2d556d5d9"} err="failed to get container status \"59f3aedb13af307f02c75baa12f43d9f719e5114021d11db3c5862c2d556d5d9\": rpc error: code = NotFound desc = could not find container \"59f3aedb13af307f02c75baa12f43d9f719e5114021d11db3c5862c2d556d5d9\": container with ID starting with 59f3aedb13af307f02c75baa12f43d9f719e5114021d11db3c5862c2d556d5d9 not found: ID does not exist" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.835748 4841 scope.go:117] "RemoveContainer" containerID="051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.837808 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe"} err="failed to get container status \"051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe\": rpc error: code = NotFound desc = could not find container \"051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe\": container with ID starting with 051fccfe69560ad04f01e8497e640b1b31dcf1b00fed7b46d03ad3a5196bc1fe not found: ID does not exist" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.837943 4841 scope.go:117] "RemoveContainer" containerID="e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.838818 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6"} err="failed to get container status \"e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6\": rpc error: code = NotFound desc = could not find container \"e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6\": container with ID starting with e4a4ea55cc9d03a59a79114a1fd932d5d0d6d4ffaeb8b27096d0e053dcba8fa6 not found: ID does not exist" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.838843 4841 scope.go:117] "RemoveContainer" containerID="5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.839753 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934"} err="failed to get container status \"5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934\": rpc error: code = NotFound desc = could not find container \"5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934\": container with ID starting with 5d1e358cc6a99e7c2dd5947ea95d4670a3c964612db108913a680a56c74a2934 not found: ID does not exist" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.839858 4841 scope.go:117] "RemoveContainer" containerID="59f3aedb13af307f02c75baa12f43d9f719e5114021d11db3c5862c2d556d5d9" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.840881 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59f3aedb13af307f02c75baa12f43d9f719e5114021d11db3c5862c2d556d5d9"} err="failed to get container status \"59f3aedb13af307f02c75baa12f43d9f719e5114021d11db3c5862c2d556d5d9\": rpc error: code = NotFound desc = could not find container \"59f3aedb13af307f02c75baa12f43d9f719e5114021d11db3c5862c2d556d5d9\": container with ID starting with 59f3aedb13af307f02c75baa12f43d9f719e5114021d11db3c5862c2d556d5d9 not found: ID does not 
exist" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.844304 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aef80b2d-e277-4d2c-8b3a-a80c19f46e36" (UID: "aef80b2d-e277-4d2c-8b3a-a80c19f46e36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.859773 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "aef80b2d-e277-4d2c-8b3a-a80c19f46e36" (UID: "aef80b2d-e277-4d2c-8b3a-a80c19f46e36"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.875673 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "aef80b2d-e277-4d2c-8b3a-a80c19f46e36" (UID: "aef80b2d-e277-4d2c-8b3a-a80c19f46e36"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.920119 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.920763 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.920911 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjds5\" (UniqueName: \"kubernetes.io/projected/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-kube-api-access-cjds5\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.921231 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.921493 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.921604 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.921759 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.921911 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.922094 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:10 crc kubenswrapper[4841]: 
I0313 10:04:10.922215 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-config\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.922545 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.922682 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.922870 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.923036 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:10 crc 
kubenswrapper[4841]: I0313 10:04:10.923117 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.923829 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:10 crc kubenswrapper[4841]: I0313 10:04:10.924317 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s662g\" (UniqueName: \"kubernetes.io/projected/aef80b2d-e277-4d2c-8b3a-a80c19f46e36-kube-api-access-s662g\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.025577 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.025628 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.025676 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjds5\" (UniqueName: \"kubernetes.io/projected/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-kube-api-access-cjds5\") pod 
\"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.025723 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.025743 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.025759 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.025776 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.025808 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.025834 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.025849 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-config\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.025864 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.025888 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.025927 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: 
\"kubernetes.io/configmap/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.026720 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.028084 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.028192 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.028338 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.031000 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.031233 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.031299 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.031824 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.032794 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " 
pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.033505 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.033925 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-config\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.050621 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjds5\" (UniqueName: \"kubernetes.io/projected/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-kube-api-access-cjds5\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.057307 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.077188 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"prometheus-metric-storage-0\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 
10:04:11.251481 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.599730 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d98b9428-afa5-438b-b9c2-608c7d4dcca4","Type":"ContainerStarted","Data":"88e601bc311651f426d1247b884652536ce191126b2ed6e0e0a476004c4af5a3"} Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.599800 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="d98b9428-afa5-438b-b9c2-608c7d4dcca4" containerName="aodh-api" containerID="cri-o://53c80d8663f36a290529544eb10126c339d7eadbfe2c4ea18315da622a7604e0" gracePeriod=30 Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.599853 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="d98b9428-afa5-438b-b9c2-608c7d4dcca4" containerName="aodh-evaluator" containerID="cri-o://99994bf9a38291c8448d77e977265295d9c822b8fe6e815ce4db373ba0bc581d" gracePeriod=30 Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.599813 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="d98b9428-afa5-438b-b9c2-608c7d4dcca4" containerName="aodh-listener" containerID="cri-o://88e601bc311651f426d1247b884652536ce191126b2ed6e0e0a476004c4af5a3" gracePeriod=30 Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.599872 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="d98b9428-afa5-438b-b9c2-608c7d4dcca4" containerName="aodh-notifier" containerID="cri-o://2cfcbe1246641bfa9491eb4362860b4466ee0b103a642e616208dd8a092e1afe" gracePeriod=30 Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.601314 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.606375 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.642252 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.93470649 podStartE2EDuration="6.642229689s" podCreationTimestamp="2026-03-13 10:04:05 +0000 UTC" firstStartedPulling="2026-03-13 10:04:07.138323494 +0000 UTC m=+3129.868223685" lastFinishedPulling="2026-03-13 10:04:10.845846693 +0000 UTC m=+3133.575746884" observedRunningTime="2026-03-13 10:04:11.625623887 +0000 UTC m=+3134.355524078" watchObservedRunningTime="2026-03-13 10:04:11.642229689 +0000 UTC m=+3134.372129880" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.703783 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="aef80b2d-e277-4d2c-8b3a-a80c19f46e36" podUID="2d802466-0b65-4820-865c-8ae969af527f" Mar 13 10:04:11 crc kubenswrapper[4841]: I0313 10:04:11.994889 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" Mar 13 10:04:11 crc kubenswrapper[4841]: E0313 10:04:11.995231 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:04:12 crc kubenswrapper[4841]: I0313 10:04:12.021165 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00ae5b61-6360-4fa5-ba8d-83852827f1b7" 
path="/var/lib/kubelet/pods/00ae5b61-6360-4fa5-ba8d-83852827f1b7/volumes" Mar 13 10:04:12 crc kubenswrapper[4841]: I0313 10:04:12.022338 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aef80b2d-e277-4d2c-8b3a-a80c19f46e36" path="/var/lib/kubelet/pods/aef80b2d-e277-4d2c-8b3a-a80c19f46e36/volumes" Mar 13 10:04:12 crc kubenswrapper[4841]: I0313 10:04:12.615604 4841 generic.go:334] "Generic (PLEG): container finished" podID="d98b9428-afa5-438b-b9c2-608c7d4dcca4" containerID="99994bf9a38291c8448d77e977265295d9c822b8fe6e815ce4db373ba0bc581d" exitCode=0 Mar 13 10:04:12 crc kubenswrapper[4841]: I0313 10:04:12.616119 4841 generic.go:334] "Generic (PLEG): container finished" podID="d98b9428-afa5-438b-b9c2-608c7d4dcca4" containerID="53c80d8663f36a290529544eb10126c339d7eadbfe2c4ea18315da622a7604e0" exitCode=0 Mar 13 10:04:12 crc kubenswrapper[4841]: I0313 10:04:12.615684 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d98b9428-afa5-438b-b9c2-608c7d4dcca4","Type":"ContainerDied","Data":"99994bf9a38291c8448d77e977265295d9c822b8fe6e815ce4db373ba0bc581d"} Mar 13 10:04:12 crc kubenswrapper[4841]: I0313 10:04:12.616249 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d98b9428-afa5-438b-b9c2-608c7d4dcca4","Type":"ContainerDied","Data":"53c80d8663f36a290529544eb10126c339d7eadbfe2c4ea18315da622a7604e0"} Mar 13 10:04:12 crc kubenswrapper[4841]: I0313 10:04:12.617832 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5","Type":"ContainerStarted","Data":"9a8845bc25f69d4175454d8338887858d9c630340bda90e917f34444a9da0875"} Mar 13 10:04:16 crc kubenswrapper[4841]: I0313 10:04:16.668892 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5","Type":"ContainerStarted","Data":"5b1abe2ca9a307dafd568db9ee399521aa6137e35592dd045de8d3eb2746a0bb"} Mar 13 10:04:21 crc kubenswrapper[4841]: I0313 10:04:21.731932 4841 generic.go:334] "Generic (PLEG): container finished" podID="c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" containerID="5b1abe2ca9a307dafd568db9ee399521aa6137e35592dd045de8d3eb2746a0bb" exitCode=0 Mar 13 10:04:21 crc kubenswrapper[4841]: I0313 10:04:21.732078 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5","Type":"ContainerDied","Data":"5b1abe2ca9a307dafd568db9ee399521aa6137e35592dd045de8d3eb2746a0bb"} Mar 13 10:04:22 crc kubenswrapper[4841]: I0313 10:04:22.747556 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5","Type":"ContainerStarted","Data":"49a840bbe67cd08405f94616ea0b0458821f3b3d6da4eb13dd693d86c5e24bfe"} Mar 13 10:04:24 crc kubenswrapper[4841]: I0313 10:04:24.995058 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" Mar 13 10:04:24 crc kubenswrapper[4841]: E0313 10:04:24.995622 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:04:25 crc kubenswrapper[4841]: I0313 10:04:25.778081 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5","Type":"ContainerStarted","Data":"c9a8b5921f638188a33e5927bff51c3362770cb145423906ff0e3bd197297b1b"} Mar 13 10:04:25 crc kubenswrapper[4841]: I0313 10:04:25.778642 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5","Type":"ContainerStarted","Data":"b6f6c0f29d9b49a810b072854d10d435273861075cef4e18db0a9a3225ee0a85"} Mar 13 10:04:25 crc kubenswrapper[4841]: I0313 10:04:25.809603 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=15.809572197 podStartE2EDuration="15.809572197s" podCreationTimestamp="2026-03-13 10:04:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:04:25.802876326 +0000 UTC m=+3148.532776517" watchObservedRunningTime="2026-03-13 10:04:25.809572197 +0000 UTC m=+3148.539472388" Mar 13 10:04:26 crc kubenswrapper[4841]: I0313 10:04:26.251699 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:26 crc kubenswrapper[4841]: I0313 10:04:26.251779 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:26 crc kubenswrapper[4841]: I0313 10:04:26.257836 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:26 crc kubenswrapper[4841]: I0313 10:04:26.791522 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 13 10:04:39 crc kubenswrapper[4841]: I0313 10:04:39.994973 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" Mar 13 10:04:39 crc kubenswrapper[4841]: E0313 10:04:39.995958 
4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:04:41 crc kubenswrapper[4841]: I0313 10:04:41.948292 4841 generic.go:334] "Generic (PLEG): container finished" podID="d98b9428-afa5-438b-b9c2-608c7d4dcca4" containerID="88e601bc311651f426d1247b884652536ce191126b2ed6e0e0a476004c4af5a3" exitCode=137 Mar 13 10:04:41 crc kubenswrapper[4841]: I0313 10:04:41.948667 4841 generic.go:334] "Generic (PLEG): container finished" podID="d98b9428-afa5-438b-b9c2-608c7d4dcca4" containerID="2cfcbe1246641bfa9491eb4362860b4466ee0b103a642e616208dd8a092e1afe" exitCode=137 Mar 13 10:04:41 crc kubenswrapper[4841]: I0313 10:04:41.948385 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d98b9428-afa5-438b-b9c2-608c7d4dcca4","Type":"ContainerDied","Data":"88e601bc311651f426d1247b884652536ce191126b2ed6e0e0a476004c4af5a3"} Mar 13 10:04:41 crc kubenswrapper[4841]: I0313 10:04:41.948715 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d98b9428-afa5-438b-b9c2-608c7d4dcca4","Type":"ContainerDied","Data":"2cfcbe1246641bfa9491eb4362860b4466ee0b103a642e616208dd8a092e1afe"} Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.113406 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.303029 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-scripts\") pod \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.303376 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-combined-ca-bundle\") pod \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.303538 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-internal-tls-certs\") pod \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.303638 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-public-tls-certs\") pod \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.303668 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-config-data\") pod \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.303705 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mbc7\" (UniqueName: 
\"kubernetes.io/projected/d98b9428-afa5-438b-b9c2-608c7d4dcca4-kube-api-access-2mbc7\") pod \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\" (UID: \"d98b9428-afa5-438b-b9c2-608c7d4dcca4\") " Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.309213 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-scripts" (OuterVolumeSpecName: "scripts") pod "d98b9428-afa5-438b-b9c2-608c7d4dcca4" (UID: "d98b9428-afa5-438b-b9c2-608c7d4dcca4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.314063 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d98b9428-afa5-438b-b9c2-608c7d4dcca4-kube-api-access-2mbc7" (OuterVolumeSpecName: "kube-api-access-2mbc7") pod "d98b9428-afa5-438b-b9c2-608c7d4dcca4" (UID: "d98b9428-afa5-438b-b9c2-608c7d4dcca4"). InnerVolumeSpecName "kube-api-access-2mbc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.361568 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d98b9428-afa5-438b-b9c2-608c7d4dcca4" (UID: "d98b9428-afa5-438b-b9c2-608c7d4dcca4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.379427 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d98b9428-afa5-438b-b9c2-608c7d4dcca4" (UID: "d98b9428-afa5-438b-b9c2-608c7d4dcca4"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.405624 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.405668 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mbc7\" (UniqueName: \"kubernetes.io/projected/d98b9428-afa5-438b-b9c2-608c7d4dcca4-kube-api-access-2mbc7\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.405681 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.405695 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.432074 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d98b9428-afa5-438b-b9c2-608c7d4dcca4" (UID: "d98b9428-afa5-438b-b9c2-608c7d4dcca4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.432857 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-config-data" (OuterVolumeSpecName: "config-data") pod "d98b9428-afa5-438b-b9c2-608c7d4dcca4" (UID: "d98b9428-afa5-438b-b9c2-608c7d4dcca4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.507248 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.507602 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d98b9428-afa5-438b-b9c2-608c7d4dcca4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.964244 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d98b9428-afa5-438b-b9c2-608c7d4dcca4","Type":"ContainerDied","Data":"32dc919efb3c00c53844e945d0e9becef0cf56126b507b1db5e64b856270d054"} Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.964672 4841 scope.go:117] "RemoveContainer" containerID="88e601bc311651f426d1247b884652536ce191126b2ed6e0e0a476004c4af5a3" Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.964511 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 13 10:04:42 crc kubenswrapper[4841]: I0313 10:04:42.989045 4841 scope.go:117] "RemoveContainer" containerID="2cfcbe1246641bfa9491eb4362860b4466ee0b103a642e616208dd8a092e1afe" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.014136 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.041527 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.042940 4841 scope.go:117] "RemoveContainer" containerID="99994bf9a38291c8448d77e977265295d9c822b8fe6e815ce4db373ba0bc581d" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.051604 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 13 10:04:43 crc kubenswrapper[4841]: E0313 10:04:43.052222 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98b9428-afa5-438b-b9c2-608c7d4dcca4" containerName="aodh-api" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.052320 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98b9428-afa5-438b-b9c2-608c7d4dcca4" containerName="aodh-api" Mar 13 10:04:43 crc kubenswrapper[4841]: E0313 10:04:43.052394 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98b9428-afa5-438b-b9c2-608c7d4dcca4" containerName="aodh-evaluator" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.052461 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98b9428-afa5-438b-b9c2-608c7d4dcca4" containerName="aodh-evaluator" Mar 13 10:04:43 crc kubenswrapper[4841]: E0313 10:04:43.052518 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98b9428-afa5-438b-b9c2-608c7d4dcca4" containerName="aodh-listener" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.052572 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98b9428-afa5-438b-b9c2-608c7d4dcca4" containerName="aodh-listener" 
Mar 13 10:04:43 crc kubenswrapper[4841]: E0313 10:04:43.052638 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98b9428-afa5-438b-b9c2-608c7d4dcca4" containerName="aodh-notifier" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.052715 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98b9428-afa5-438b-b9c2-608c7d4dcca4" containerName="aodh-notifier" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.052975 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d98b9428-afa5-438b-b9c2-608c7d4dcca4" containerName="aodh-notifier" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.053052 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d98b9428-afa5-438b-b9c2-608c7d4dcca4" containerName="aodh-api" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.053112 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d98b9428-afa5-438b-b9c2-608c7d4dcca4" containerName="aodh-listener" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.053178 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d98b9428-afa5-438b-b9c2-608c7d4dcca4" containerName="aodh-evaluator" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.055333 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.060174 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.060232 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.060555 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-7tx6j" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.061006 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.061010 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.066064 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.098685 4841 scope.go:117] "RemoveContainer" containerID="53c80d8663f36a290529544eb10126c339d7eadbfe2c4ea18315da622a7604e0" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.228294 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-config-data\") pod \"aodh-0\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " pod="openstack/aodh-0" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.228370 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-scripts\") pod \"aodh-0\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " pod="openstack/aodh-0" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.228407 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8cp4\" (UniqueName: \"kubernetes.io/projected/7d543631-bf8a-46c4-8808-ed0e3c563189-kube-api-access-f8cp4\") pod \"aodh-0\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " pod="openstack/aodh-0" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.228517 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-combined-ca-bundle\") pod \"aodh-0\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " pod="openstack/aodh-0" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.228637 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-public-tls-certs\") pod \"aodh-0\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " pod="openstack/aodh-0" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.228711 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-internal-tls-certs\") pod \"aodh-0\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " pod="openstack/aodh-0" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.330758 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-public-tls-certs\") pod \"aodh-0\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " pod="openstack/aodh-0" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.330827 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-internal-tls-certs\") pod 
\"aodh-0\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " pod="openstack/aodh-0" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.330917 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-config-data\") pod \"aodh-0\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " pod="openstack/aodh-0" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.330937 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-scripts\") pod \"aodh-0\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " pod="openstack/aodh-0" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.330954 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8cp4\" (UniqueName: \"kubernetes.io/projected/7d543631-bf8a-46c4-8808-ed0e3c563189-kube-api-access-f8cp4\") pod \"aodh-0\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " pod="openstack/aodh-0" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.330983 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-combined-ca-bundle\") pod \"aodh-0\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " pod="openstack/aodh-0" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.337505 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-scripts\") pod \"aodh-0\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " pod="openstack/aodh-0" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.337965 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-config-data\") pod \"aodh-0\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " pod="openstack/aodh-0" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.338713 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-combined-ca-bundle\") pod \"aodh-0\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " pod="openstack/aodh-0" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.338733 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-internal-tls-certs\") pod \"aodh-0\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " pod="openstack/aodh-0" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.339077 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-public-tls-certs\") pod \"aodh-0\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " pod="openstack/aodh-0" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.350862 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8cp4\" (UniqueName: \"kubernetes.io/projected/7d543631-bf8a-46c4-8808-ed0e3c563189-kube-api-access-f8cp4\") pod \"aodh-0\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " pod="openstack/aodh-0" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.395085 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.849984 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 13 10:04:43 crc kubenswrapper[4841]: W0313 10:04:43.869324 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d543631_bf8a_46c4_8808_ed0e3c563189.slice/crio-da942202f4e46bd5870253f50606ee0870a07a92c0eb051f9de39e8043d42aeb WatchSource:0}: Error finding container da942202f4e46bd5870253f50606ee0870a07a92c0eb051f9de39e8043d42aeb: Status 404 returned error can't find the container with id da942202f4e46bd5870253f50606ee0870a07a92c0eb051f9de39e8043d42aeb Mar 13 10:04:43 crc kubenswrapper[4841]: I0313 10:04:43.977578 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7d543631-bf8a-46c4-8808-ed0e3c563189","Type":"ContainerStarted","Data":"da942202f4e46bd5870253f50606ee0870a07a92c0eb051f9de39e8043d42aeb"} Mar 13 10:04:44 crc kubenswrapper[4841]: I0313 10:04:44.009493 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d98b9428-afa5-438b-b9c2-608c7d4dcca4" path="/var/lib/kubelet/pods/d98b9428-afa5-438b-b9c2-608c7d4dcca4/volumes" Mar 13 10:04:44 crc kubenswrapper[4841]: I0313 10:04:44.987350 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7d543631-bf8a-46c4-8808-ed0e3c563189","Type":"ContainerStarted","Data":"36ad1ad9997573d72b2f9127d3165d7e2b06e4a6c4331c8d606d1b320fab6f13"} Mar 13 10:04:46 crc kubenswrapper[4841]: I0313 10:04:46.007723 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7d543631-bf8a-46c4-8808-ed0e3c563189","Type":"ContainerStarted","Data":"3af4ec93db84e46d7004fd103f200655792e0f4840a06679d756f0a4c974cb1f"} Mar 13 10:04:46 crc kubenswrapper[4841]: I0313 10:04:46.008054 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/aodh-0" event={"ID":"7d543631-bf8a-46c4-8808-ed0e3c563189","Type":"ContainerStarted","Data":"3b2338cfdfe4d9e5129e8ada951e6926e25db525ffc008da8d01e0d424403d06"} Mar 13 10:04:47 crc kubenswrapper[4841]: I0313 10:04:47.011675 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7d543631-bf8a-46c4-8808-ed0e3c563189","Type":"ContainerStarted","Data":"962a89540a477c8177f10d4e2937b469979eb748031b076446c1ab4689c0b767"} Mar 13 10:04:47 crc kubenswrapper[4841]: I0313 10:04:47.059939 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.6263223199999999 podStartE2EDuration="4.059908749s" podCreationTimestamp="2026-03-13 10:04:43 +0000 UTC" firstStartedPulling="2026-03-13 10:04:43.876780586 +0000 UTC m=+3166.606680777" lastFinishedPulling="2026-03-13 10:04:46.310367015 +0000 UTC m=+3169.040267206" observedRunningTime="2026-03-13 10:04:47.042086119 +0000 UTC m=+3169.771986310" watchObservedRunningTime="2026-03-13 10:04:47.059908749 +0000 UTC m=+3169.789808950" Mar 13 10:04:51 crc kubenswrapper[4841]: I0313 10:04:51.994559 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" Mar 13 10:04:51 crc kubenswrapper[4841]: E0313 10:04:51.995253 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:05:05 crc kubenswrapper[4841]: I0313 10:05:05.482181 4841 scope.go:117] "RemoveContainer" containerID="f2a271a45dbbddce4fc52202f72ade12d6c81716de77b7d079e05c30e483fad2" Mar 13 10:05:05 crc kubenswrapper[4841]: I0313 10:05:05.503761 4841 
scope.go:117] "RemoveContainer" containerID="be07db9d729e7d306a78aac9d35d087678f75114132b0f88a705b1aa5d17a28a" Mar 13 10:05:05 crc kubenswrapper[4841]: I0313 10:05:05.995171 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" Mar 13 10:05:05 crc kubenswrapper[4841]: E0313 10:05:05.995645 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:05:16 crc kubenswrapper[4841]: I0313 10:05:16.996056 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" Mar 13 10:05:16 crc kubenswrapper[4841]: E0313 10:05:16.996926 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:05:30 crc kubenswrapper[4841]: I0313 10:05:30.995513 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" Mar 13 10:05:30 crc kubenswrapper[4841]: E0313 10:05:30.996682 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:05:44 crc kubenswrapper[4841]: I0313 10:05:44.995284 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" Mar 13 10:05:44 crc kubenswrapper[4841]: E0313 10:05:44.996005 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:05:56 crc kubenswrapper[4841]: I0313 10:05:56.995920 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" Mar 13 10:05:56 crc kubenswrapper[4841]: E0313 10:05:56.997113 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:06:00 crc kubenswrapper[4841]: I0313 10:06:00.154848 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556606-q6bf4"] Mar 13 10:06:00 crc kubenswrapper[4841]: I0313 10:06:00.157135 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556606-q6bf4" Mar 13 10:06:00 crc kubenswrapper[4841]: I0313 10:06:00.159771 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 10:06:00 crc kubenswrapper[4841]: I0313 10:06:00.159907 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 10:06:00 crc kubenswrapper[4841]: I0313 10:06:00.159912 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 10:06:00 crc kubenswrapper[4841]: I0313 10:06:00.170691 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556606-q6bf4"] Mar 13 10:06:00 crc kubenswrapper[4841]: I0313 10:06:00.253437 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6zc5\" (UniqueName: \"kubernetes.io/projected/310c034a-4453-4712-b8c6-cd6ab0ff5ab7-kube-api-access-c6zc5\") pod \"auto-csr-approver-29556606-q6bf4\" (UID: \"310c034a-4453-4712-b8c6-cd6ab0ff5ab7\") " pod="openshift-infra/auto-csr-approver-29556606-q6bf4" Mar 13 10:06:00 crc kubenswrapper[4841]: I0313 10:06:00.358505 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6zc5\" (UniqueName: \"kubernetes.io/projected/310c034a-4453-4712-b8c6-cd6ab0ff5ab7-kube-api-access-c6zc5\") pod \"auto-csr-approver-29556606-q6bf4\" (UID: \"310c034a-4453-4712-b8c6-cd6ab0ff5ab7\") " pod="openshift-infra/auto-csr-approver-29556606-q6bf4" Mar 13 10:06:00 crc kubenswrapper[4841]: I0313 10:06:00.378244 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6zc5\" (UniqueName: \"kubernetes.io/projected/310c034a-4453-4712-b8c6-cd6ab0ff5ab7-kube-api-access-c6zc5\") pod \"auto-csr-approver-29556606-q6bf4\" (UID: \"310c034a-4453-4712-b8c6-cd6ab0ff5ab7\") " 
pod="openshift-infra/auto-csr-approver-29556606-q6bf4" Mar 13 10:06:00 crc kubenswrapper[4841]: I0313 10:06:00.476128 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556606-q6bf4" Mar 13 10:06:00 crc kubenswrapper[4841]: I0313 10:06:00.956781 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556606-q6bf4"] Mar 13 10:06:01 crc kubenswrapper[4841]: I0313 10:06:01.755241 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556606-q6bf4" event={"ID":"310c034a-4453-4712-b8c6-cd6ab0ff5ab7","Type":"ContainerStarted","Data":"7250ad5230b61df3412b1d24433e8df560c0bca033d1f99d9be3f28fbba00a88"} Mar 13 10:06:02 crc kubenswrapper[4841]: I0313 10:06:02.788055 4841 generic.go:334] "Generic (PLEG): container finished" podID="310c034a-4453-4712-b8c6-cd6ab0ff5ab7" containerID="8f9f584ed48aa030cfb191ada88a9a3d3295ab13ffd06c3fa9c956721d9a7699" exitCode=0 Mar 13 10:06:02 crc kubenswrapper[4841]: I0313 10:06:02.788695 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556606-q6bf4" event={"ID":"310c034a-4453-4712-b8c6-cd6ab0ff5ab7","Type":"ContainerDied","Data":"8f9f584ed48aa030cfb191ada88a9a3d3295ab13ffd06c3fa9c956721d9a7699"} Mar 13 10:06:04 crc kubenswrapper[4841]: I0313 10:06:04.137064 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556606-q6bf4" Mar 13 10:06:04 crc kubenswrapper[4841]: I0313 10:06:04.234475 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6zc5\" (UniqueName: \"kubernetes.io/projected/310c034a-4453-4712-b8c6-cd6ab0ff5ab7-kube-api-access-c6zc5\") pod \"310c034a-4453-4712-b8c6-cd6ab0ff5ab7\" (UID: \"310c034a-4453-4712-b8c6-cd6ab0ff5ab7\") " Mar 13 10:06:04 crc kubenswrapper[4841]: I0313 10:06:04.240427 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/310c034a-4453-4712-b8c6-cd6ab0ff5ab7-kube-api-access-c6zc5" (OuterVolumeSpecName: "kube-api-access-c6zc5") pod "310c034a-4453-4712-b8c6-cd6ab0ff5ab7" (UID: "310c034a-4453-4712-b8c6-cd6ab0ff5ab7"). InnerVolumeSpecName "kube-api-access-c6zc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:06:04 crc kubenswrapper[4841]: I0313 10:06:04.337490 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6zc5\" (UniqueName: \"kubernetes.io/projected/310c034a-4453-4712-b8c6-cd6ab0ff5ab7-kube-api-access-c6zc5\") on node \"crc\" DevicePath \"\"" Mar 13 10:06:04 crc kubenswrapper[4841]: I0313 10:06:04.813376 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556606-q6bf4" event={"ID":"310c034a-4453-4712-b8c6-cd6ab0ff5ab7","Type":"ContainerDied","Data":"7250ad5230b61df3412b1d24433e8df560c0bca033d1f99d9be3f28fbba00a88"} Mar 13 10:06:04 crc kubenswrapper[4841]: I0313 10:06:04.814008 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7250ad5230b61df3412b1d24433e8df560c0bca033d1f99d9be3f28fbba00a88" Mar 13 10:06:04 crc kubenswrapper[4841]: I0313 10:06:04.813424 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556606-q6bf4" Mar 13 10:06:04 crc kubenswrapper[4841]: E0313 10:06:04.928952 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod310c034a_4453_4712_b8c6_cd6ab0ff5ab7.slice\": RecentStats: unable to find data in memory cache]" Mar 13 10:06:05 crc kubenswrapper[4841]: I0313 10:06:05.199916 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556600-mjvph"] Mar 13 10:06:05 crc kubenswrapper[4841]: I0313 10:06:05.209516 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556600-mjvph"] Mar 13 10:06:06 crc kubenswrapper[4841]: I0313 10:06:06.007448 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dff89fe5-b7c2-4581-8840-c12adc8826bd" path="/var/lib/kubelet/pods/dff89fe5-b7c2-4581-8840-c12adc8826bd/volumes" Mar 13 10:06:11 crc kubenswrapper[4841]: I0313 10:06:11.995042 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" Mar 13 10:06:11 crc kubenswrapper[4841]: E0313 10:06:11.995899 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:06:26 crc kubenswrapper[4841]: I0313 10:06:26.013257 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" Mar 13 10:06:26 crc kubenswrapper[4841]: E0313 10:06:26.014301 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:06:32 crc kubenswrapper[4841]: I0313 10:06:32.697708 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-8485bdb9db-mf5lp_17824e5f-18b3-46c0-910a-56e5529e09c3/manager/0.log" Mar 13 10:06:34 crc kubenswrapper[4841]: I0313 10:06:34.558393 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 10:06:34 crc kubenswrapper[4841]: I0313 10:06:34.559314 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" containerName="prometheus" containerID="cri-o://49a840bbe67cd08405f94616ea0b0458821f3b3d6da4eb13dd693d86c5e24bfe" gracePeriod=600 Mar 13 10:06:34 crc kubenswrapper[4841]: I0313 10:06:34.559440 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" containerName="thanos-sidecar" containerID="cri-o://c9a8b5921f638188a33e5927bff51c3362770cb145423906ff0e3bd197297b1b" gracePeriod=600 Mar 13 10:06:34 crc kubenswrapper[4841]: I0313 10:06:34.559527 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" containerName="config-reloader" containerID="cri-o://b6f6c0f29d9b49a810b072854d10d435273861075cef4e18db0a9a3225ee0a85" gracePeriod=600 Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.106737 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" containerID="c9a8b5921f638188a33e5927bff51c3362770cb145423906ff0e3bd197297b1b" exitCode=0 Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.107386 4841 generic.go:334] "Generic (PLEG): container finished" podID="c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" containerID="b6f6c0f29d9b49a810b072854d10d435273861075cef4e18db0a9a3225ee0a85" exitCode=0 Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.107483 4841 generic.go:334] "Generic (PLEG): container finished" podID="c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" containerID="49a840bbe67cd08405f94616ea0b0458821f3b3d6da4eb13dd693d86c5e24bfe" exitCode=0 Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.107507 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5","Type":"ContainerDied","Data":"c9a8b5921f638188a33e5927bff51c3362770cb145423906ff0e3bd197297b1b"} Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.107664 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5","Type":"ContainerDied","Data":"b6f6c0f29d9b49a810b072854d10d435273861075cef4e18db0a9a3225ee0a85"} Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.107753 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5","Type":"ContainerDied","Data":"49a840bbe67cd08405f94616ea0b0458821f3b3d6da4eb13dd693d86c5e24bfe"} Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.454419 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.619250 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.619321 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-config-out\") pod \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.619376 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-prometheus-metric-storage-rulefiles-0\") pod \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.619450 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-prometheus-metric-storage-rulefiles-1\") pod \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.619565 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-tls-assets\") pod \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\" (UID: 
\"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.619671 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.619931 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" (UID: "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.619948 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" (UID: "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.620130 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-secret-combined-ca-bundle\") pod \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.620192 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjds5\" (UniqueName: \"kubernetes.io/projected/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-kube-api-access-cjds5\") pod \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.620243 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-thanos-prometheus-http-client-file\") pod \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.620370 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-config\") pod \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.620396 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-prometheus-metric-storage-rulefiles-2\") pod \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.620453 4841 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-web-config\") pod \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.620489 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\" (UID: \"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5\") " Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.621293 4841 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.621378 4841 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.625810 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" (UID: "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.625889 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" (UID: "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.626027 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" (UID: "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.626173 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-config-out" (OuterVolumeSpecName: "config-out") pod "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" (UID: "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.626592 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-config" (OuterVolumeSpecName: "config") pod "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" (UID: "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.626613 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" (UID: "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.627181 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" (UID: "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.628776 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-kube-api-access-cjds5" (OuterVolumeSpecName: "kube-api-access-cjds5") pod "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" (UID: "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5"). InnerVolumeSpecName "kube-api-access-cjds5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.629148 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" (UID: "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.633676 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" (UID: "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.710507 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-web-config" (OuterVolumeSpecName: "web-config") pod "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" (UID: "c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.722822 4841 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.723026 4841 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.723106 4841 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-config-out\") on node \"crc\" DevicePath \"\"" Mar 13 10:06:35 crc 
kubenswrapper[4841]: I0313 10:06:35.723179 4841 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.723289 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.723413 4841 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.723492 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjds5\" (UniqueName: \"kubernetes.io/projected/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-kube-api-access-cjds5\") on node \"crc\" DevicePath \"\"" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.723560 4841 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.723616 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-config\") on node \"crc\" DevicePath \"\"" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.723696 4841 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.723772 4841 
reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5-web-config\") on node \"crc\" DevicePath \"\"" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.742091 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 13 10:06:35 crc kubenswrapper[4841]: I0313 10:06:35.825476 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.120615 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5","Type":"ContainerDied","Data":"9a8845bc25f69d4175454d8338887858d9c630340bda90e917f34444a9da0875"} Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.120977 4841 scope.go:117] "RemoveContainer" containerID="c9a8b5921f638188a33e5927bff51c3362770cb145423906ff0e3bd197297b1b" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.121147 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.150434 4841 scope.go:117] "RemoveContainer" containerID="b6f6c0f29d9b49a810b072854d10d435273861075cef4e18db0a9a3225ee0a85" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.152470 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.169553 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.172467 4841 scope.go:117] "RemoveContainer" containerID="49a840bbe67cd08405f94616ea0b0458821f3b3d6da4eb13dd693d86c5e24bfe" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.226836 4841 scope.go:117] "RemoveContainer" containerID="5b1abe2ca9a307dafd568db9ee399521aa6137e35592dd045de8d3eb2746a0bb" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.978749 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 10:06:36 crc kubenswrapper[4841]: E0313 10:06:36.979202 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="310c034a-4453-4712-b8c6-cd6ab0ff5ab7" containerName="oc" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.979217 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="310c034a-4453-4712-b8c6-cd6ab0ff5ab7" containerName="oc" Mar 13 10:06:36 crc kubenswrapper[4841]: E0313 10:06:36.979238 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" containerName="init-config-reloader" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.979247 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" containerName="init-config-reloader" Mar 13 10:06:36 crc kubenswrapper[4841]: E0313 10:06:36.979279 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" containerName="prometheus" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.979288 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" containerName="prometheus" Mar 13 10:06:36 crc kubenswrapper[4841]: E0313 10:06:36.979316 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" containerName="config-reloader" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.979324 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" containerName="config-reloader" Mar 13 10:06:36 crc kubenswrapper[4841]: E0313 10:06:36.979339 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" containerName="thanos-sidecar" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.979345 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" containerName="thanos-sidecar" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.979611 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" containerName="thanos-sidecar" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.979623 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" containerName="prometheus" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.979657 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="310c034a-4453-4712-b8c6-cd6ab0ff5ab7" containerName="oc" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.979673 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" containerName="config-reloader" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.981907 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.983760 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-wmghm" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.984305 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.984305 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.985158 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.985340 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.985722 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.985990 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.987695 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.991118 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 13 10:06:36 crc kubenswrapper[4841]: I0313 10:06:36.992393 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.150114 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.150180 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4bb62be2-5fc7-4365-994e-cefee90fa78b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.150233 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.150278 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.150329 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4bb62be2-5fc7-4365-994e-cefee90fa78b-config-out\") pod \"prometheus-metric-storage-0\" (UID: 
\"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.150372 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.150428 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.150488 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.150527 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.150558 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.150602 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.150627 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-config\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.150657 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd47x\" (UniqueName: \"kubernetes.io/projected/4bb62be2-5fc7-4365-994e-cefee90fa78b-kube-api-access-cd47x\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.254029 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: 
\"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.254327 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.256406 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.256478 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.256564 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.256608 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-config\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.256665 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd47x\" (UniqueName: \"kubernetes.io/projected/4bb62be2-5fc7-4365-994e-cefee90fa78b-kube-api-access-cd47x\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.256730 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.256763 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.256819 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4bb62be2-5fc7-4365-994e-cefee90fa78b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.256908 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.256960 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.257044 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4bb62be2-5fc7-4365-994e-cefee90fa78b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.257105 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.257129 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 
10:06:37.257557 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.257832 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.276310 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.278302 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4bb62be2-5fc7-4365-994e-cefee90fa78b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.280104 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod 
\"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.282571 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd47x\" (UniqueName: \"kubernetes.io/projected/4bb62be2-5fc7-4365-994e-cefee90fa78b-kube-api-access-cd47x\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.290432 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-config\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.290471 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.291544 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.293193 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-web-config\") pod \"prometheus-metric-storage-0\" (UID: 
\"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.305066 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4bb62be2-5fc7-4365-994e-cefee90fa78b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.350988 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 10:06:37 crc kubenswrapper[4841]: I0313 10:06:37.886422 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 10:06:38 crc kubenswrapper[4841]: I0313 10:06:38.010616 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5" path="/var/lib/kubelet/pods/c45ed6af-1f3e-4cc9-b0e3-522ca2e2f6f5/volumes" Mar 13 10:06:38 crc kubenswrapper[4841]: I0313 10:06:38.139119 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4bb62be2-5fc7-4365-994e-cefee90fa78b","Type":"ContainerStarted","Data":"295214e7d8b983a391027ee1814c2242fed9730f3a6e4f358b3042d25bb39604"} Mar 13 10:06:39 crc kubenswrapper[4841]: I0313 10:06:39.994696 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b" Mar 13 10:06:41 crc kubenswrapper[4841]: I0313 10:06:41.166894 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerStarted","Data":"345fb0919d4b00a63b8564b52ef5b612a0955f49c27fe5b9f30c8993e805394a"} Mar 13 10:06:42 crc kubenswrapper[4841]: I0313 10:06:42.176600 4841 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4bb62be2-5fc7-4365-994e-cefee90fa78b","Type":"ContainerStarted","Data":"b746948972008c493c8635c3770b3b5d652a4235ce4e2137314ccd746d6095c5"} Mar 13 10:06:49 crc kubenswrapper[4841]: I0313 10:06:49.245257 4841 generic.go:334] "Generic (PLEG): container finished" podID="4bb62be2-5fc7-4365-994e-cefee90fa78b" containerID="b746948972008c493c8635c3770b3b5d652a4235ce4e2137314ccd746d6095c5" exitCode=0 Mar 13 10:06:49 crc kubenswrapper[4841]: I0313 10:06:49.245334 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4bb62be2-5fc7-4365-994e-cefee90fa78b","Type":"ContainerDied","Data":"b746948972008c493c8635c3770b3b5d652a4235ce4e2137314ccd746d6095c5"} Mar 13 10:06:50 crc kubenswrapper[4841]: I0313 10:06:50.256860 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4bb62be2-5fc7-4365-994e-cefee90fa78b","Type":"ContainerStarted","Data":"1b5b6627f7b5f9de27b6cec1d1ebcd390db0c381a412633050da7e589eb933eb"} Mar 13 10:06:54 crc kubenswrapper[4841]: I0313 10:06:54.299156 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4bb62be2-5fc7-4365-994e-cefee90fa78b","Type":"ContainerStarted","Data":"15879c33b73fb5f09b91bb21a796081b5983bb80aea32f63895794657d1c9c8b"} Mar 13 10:06:54 crc kubenswrapper[4841]: I0313 10:06:54.299728 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4bb62be2-5fc7-4365-994e-cefee90fa78b","Type":"ContainerStarted","Data":"adbf6eed83415549c2a34785e20af0bb6a61957d22afa76e5b12e37d9cd388f1"} Mar 13 10:06:54 crc kubenswrapper[4841]: I0313 10:06:54.324413 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.324386781 podStartE2EDuration="18.324386781s" podCreationTimestamp="2026-03-13 
10:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:06:54.323769831 +0000 UTC m=+3297.053670072" watchObservedRunningTime="2026-03-13 10:06:54.324386781 +0000 UTC m=+3297.054287002" Mar 13 10:06:57 crc kubenswrapper[4841]: I0313 10:06:57.351727 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 13 10:07:03 crc kubenswrapper[4841]: I0313 10:07:03.054512 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bxqd7"] Mar 13 10:07:03 crc kubenswrapper[4841]: I0313 10:07:03.057083 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bxqd7" Mar 13 10:07:03 crc kubenswrapper[4841]: I0313 10:07:03.073015 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bxqd7"] Mar 13 10:07:03 crc kubenswrapper[4841]: I0313 10:07:03.197579 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af9961df-42f4-476e-bea3-ab77215f269c-utilities\") pod \"community-operators-bxqd7\" (UID: \"af9961df-42f4-476e-bea3-ab77215f269c\") " pod="openshift-marketplace/community-operators-bxqd7" Mar 13 10:07:03 crc kubenswrapper[4841]: I0313 10:07:03.197812 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvqxf\" (UniqueName: \"kubernetes.io/projected/af9961df-42f4-476e-bea3-ab77215f269c-kube-api-access-fvqxf\") pod \"community-operators-bxqd7\" (UID: \"af9961df-42f4-476e-bea3-ab77215f269c\") " pod="openshift-marketplace/community-operators-bxqd7" Mar 13 10:07:03 crc kubenswrapper[4841]: I0313 10:07:03.197874 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9961df-42f4-476e-bea3-ab77215f269c-catalog-content\") pod \"community-operators-bxqd7\" (UID: \"af9961df-42f4-476e-bea3-ab77215f269c\") " pod="openshift-marketplace/community-operators-bxqd7" Mar 13 10:07:03 crc kubenswrapper[4841]: I0313 10:07:03.299595 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af9961df-42f4-476e-bea3-ab77215f269c-utilities\") pod \"community-operators-bxqd7\" (UID: \"af9961df-42f4-476e-bea3-ab77215f269c\") " pod="openshift-marketplace/community-operators-bxqd7" Mar 13 10:07:03 crc kubenswrapper[4841]: I0313 10:07:03.299696 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvqxf\" (UniqueName: \"kubernetes.io/projected/af9961df-42f4-476e-bea3-ab77215f269c-kube-api-access-fvqxf\") pod \"community-operators-bxqd7\" (UID: \"af9961df-42f4-476e-bea3-ab77215f269c\") " pod="openshift-marketplace/community-operators-bxqd7" Mar 13 10:07:03 crc kubenswrapper[4841]: I0313 10:07:03.299719 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9961df-42f4-476e-bea3-ab77215f269c-catalog-content\") pod \"community-operators-bxqd7\" (UID: \"af9961df-42f4-476e-bea3-ab77215f269c\") " pod="openshift-marketplace/community-operators-bxqd7" Mar 13 10:07:03 crc kubenswrapper[4841]: I0313 10:07:03.300360 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9961df-42f4-476e-bea3-ab77215f269c-catalog-content\") pod \"community-operators-bxqd7\" (UID: \"af9961df-42f4-476e-bea3-ab77215f269c\") " pod="openshift-marketplace/community-operators-bxqd7" Mar 13 10:07:03 crc kubenswrapper[4841]: I0313 10:07:03.300485 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/af9961df-42f4-476e-bea3-ab77215f269c-utilities\") pod \"community-operators-bxqd7\" (UID: \"af9961df-42f4-476e-bea3-ab77215f269c\") " pod="openshift-marketplace/community-operators-bxqd7" Mar 13 10:07:03 crc kubenswrapper[4841]: I0313 10:07:03.320809 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvqxf\" (UniqueName: \"kubernetes.io/projected/af9961df-42f4-476e-bea3-ab77215f269c-kube-api-access-fvqxf\") pod \"community-operators-bxqd7\" (UID: \"af9961df-42f4-476e-bea3-ab77215f269c\") " pod="openshift-marketplace/community-operators-bxqd7" Mar 13 10:07:03 crc kubenswrapper[4841]: I0313 10:07:03.401144 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bxqd7" Mar 13 10:07:03 crc kubenswrapper[4841]: I0313 10:07:03.954856 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bxqd7"] Mar 13 10:07:04 crc kubenswrapper[4841]: I0313 10:07:04.052757 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxqd7" event={"ID":"af9961df-42f4-476e-bea3-ab77215f269c","Type":"ContainerStarted","Data":"a96ba126c9a9a48ed4efe8b31334e82a2e908b10ba4702400c89f07c5c49df1c"} Mar 13 10:07:05 crc kubenswrapper[4841]: I0313 10:07:05.064547 4841 generic.go:334] "Generic (PLEG): container finished" podID="af9961df-42f4-476e-bea3-ab77215f269c" containerID="c3be6d8e553fc4085bb986dd385da0875e7c438fc3367e0bc27768e229688795" exitCode=0 Mar 13 10:07:05 crc kubenswrapper[4841]: I0313 10:07:05.064617 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxqd7" event={"ID":"af9961df-42f4-476e-bea3-ab77215f269c","Type":"ContainerDied","Data":"c3be6d8e553fc4085bb986dd385da0875e7c438fc3367e0bc27768e229688795"} Mar 13 10:07:05 crc kubenswrapper[4841]: I0313 10:07:05.070421 4841 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Mar 13 10:07:05 crc kubenswrapper[4841]: I0313 10:07:05.646178 4841 scope.go:117] "RemoveContainer" containerID="058d372d8e7682703b4bfb70c0ddb88de134c1d740462ca826ab1f9e6fbbc1ef" Mar 13 10:07:06 crc kubenswrapper[4841]: I0313 10:07:06.078146 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxqd7" event={"ID":"af9961df-42f4-476e-bea3-ab77215f269c","Type":"ContainerStarted","Data":"deb445e9a1522b8445ffe2a78658ed164d3a4935c68b5eb93e89eb2328c0bd52"} Mar 13 10:07:07 crc kubenswrapper[4841]: I0313 10:07:07.351606 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 13 10:07:07 crc kubenswrapper[4841]: I0313 10:07:07.357822 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 13 10:07:08 crc kubenswrapper[4841]: I0313 10:07:08.096998 4841 generic.go:334] "Generic (PLEG): container finished" podID="af9961df-42f4-476e-bea3-ab77215f269c" containerID="deb445e9a1522b8445ffe2a78658ed164d3a4935c68b5eb93e89eb2328c0bd52" exitCode=0 Mar 13 10:07:08 crc kubenswrapper[4841]: I0313 10:07:08.097078 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxqd7" event={"ID":"af9961df-42f4-476e-bea3-ab77215f269c","Type":"ContainerDied","Data":"deb445e9a1522b8445ffe2a78658ed164d3a4935c68b5eb93e89eb2328c0bd52"} Mar 13 10:07:08 crc kubenswrapper[4841]: I0313 10:07:08.103971 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 13 10:07:10 crc kubenswrapper[4841]: I0313 10:07:10.118822 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxqd7" event={"ID":"af9961df-42f4-476e-bea3-ab77215f269c","Type":"ContainerStarted","Data":"126d60a49aa568232dac2726e8c3edc334bda4780ceb9f299716f37d39f3fcbe"} 
Mar 13 10:07:10 crc kubenswrapper[4841]: I0313 10:07:10.142546 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bxqd7" podStartSLOduration=3.139631011 podStartE2EDuration="7.142523395s" podCreationTimestamp="2026-03-13 10:07:03 +0000 UTC" firstStartedPulling="2026-03-13 10:07:05.070132428 +0000 UTC m=+3307.800032619" lastFinishedPulling="2026-03-13 10:07:09.073024812 +0000 UTC m=+3311.802925003" observedRunningTime="2026-03-13 10:07:10.135715312 +0000 UTC m=+3312.865615513" watchObservedRunningTime="2026-03-13 10:07:10.142523395 +0000 UTC m=+3312.872423586"
Mar 13 10:07:13 crc kubenswrapper[4841]: I0313 10:07:13.401886 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bxqd7"
Mar 13 10:07:13 crc kubenswrapper[4841]: I0313 10:07:13.402675 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bxqd7"
Mar 13 10:07:13 crc kubenswrapper[4841]: I0313 10:07:13.452212 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bxqd7"
Mar 13 10:07:14 crc kubenswrapper[4841]: I0313 10:07:14.204473 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bxqd7"
Mar 13 10:07:14 crc kubenswrapper[4841]: I0313 10:07:14.253449 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bxqd7"]
Mar 13 10:07:16 crc kubenswrapper[4841]: I0313 10:07:16.182097 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bxqd7" podUID="af9961df-42f4-476e-bea3-ab77215f269c" containerName="registry-server" containerID="cri-o://126d60a49aa568232dac2726e8c3edc334bda4780ceb9f299716f37d39f3fcbe" gracePeriod=2
Mar 13 10:07:16 crc kubenswrapper[4841]: I0313 10:07:16.634627 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bxqd7"
Mar 13 10:07:16 crc kubenswrapper[4841]: I0313 10:07:16.806410 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af9961df-42f4-476e-bea3-ab77215f269c-utilities\") pod \"af9961df-42f4-476e-bea3-ab77215f269c\" (UID: \"af9961df-42f4-476e-bea3-ab77215f269c\") "
Mar 13 10:07:16 crc kubenswrapper[4841]: I0313 10:07:16.806531 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9961df-42f4-476e-bea3-ab77215f269c-catalog-content\") pod \"af9961df-42f4-476e-bea3-ab77215f269c\" (UID: \"af9961df-42f4-476e-bea3-ab77215f269c\") "
Mar 13 10:07:16 crc kubenswrapper[4841]: I0313 10:07:16.806757 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvqxf\" (UniqueName: \"kubernetes.io/projected/af9961df-42f4-476e-bea3-ab77215f269c-kube-api-access-fvqxf\") pod \"af9961df-42f4-476e-bea3-ab77215f269c\" (UID: \"af9961df-42f4-476e-bea3-ab77215f269c\") "
Mar 13 10:07:16 crc kubenswrapper[4841]: I0313 10:07:16.808713 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af9961df-42f4-476e-bea3-ab77215f269c-utilities" (OuterVolumeSpecName: "utilities") pod "af9961df-42f4-476e-bea3-ab77215f269c" (UID: "af9961df-42f4-476e-bea3-ab77215f269c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 10:07:16 crc kubenswrapper[4841]: I0313 10:07:16.814727 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af9961df-42f4-476e-bea3-ab77215f269c-kube-api-access-fvqxf" (OuterVolumeSpecName: "kube-api-access-fvqxf") pod "af9961df-42f4-476e-bea3-ab77215f269c" (UID: "af9961df-42f4-476e-bea3-ab77215f269c"). InnerVolumeSpecName "kube-api-access-fvqxf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:07:16 crc kubenswrapper[4841]: I0313 10:07:16.863273 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af9961df-42f4-476e-bea3-ab77215f269c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af9961df-42f4-476e-bea3-ab77215f269c" (UID: "af9961df-42f4-476e-bea3-ab77215f269c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 10:07:16 crc kubenswrapper[4841]: I0313 10:07:16.909520 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvqxf\" (UniqueName: \"kubernetes.io/projected/af9961df-42f4-476e-bea3-ab77215f269c-kube-api-access-fvqxf\") on node \"crc\" DevicePath \"\""
Mar 13 10:07:16 crc kubenswrapper[4841]: I0313 10:07:16.909551 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af9961df-42f4-476e-bea3-ab77215f269c-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 10:07:16 crc kubenswrapper[4841]: I0313 10:07:16.909562 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9961df-42f4-476e-bea3-ab77215f269c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.101522 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xkjq9"]
Mar 13 10:07:17 crc kubenswrapper[4841]: E0313 10:07:17.104545 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af9961df-42f4-476e-bea3-ab77215f269c" containerName="extract-content"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.104580 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="af9961df-42f4-476e-bea3-ab77215f269c" containerName="extract-content"
Mar 13 10:07:17 crc kubenswrapper[4841]: E0313 10:07:17.104612 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af9961df-42f4-476e-bea3-ab77215f269c" containerName="registry-server"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.104620 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="af9961df-42f4-476e-bea3-ab77215f269c" containerName="registry-server"
Mar 13 10:07:17 crc kubenswrapper[4841]: E0313 10:07:17.104635 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af9961df-42f4-476e-bea3-ab77215f269c" containerName="extract-utilities"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.104641 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="af9961df-42f4-476e-bea3-ab77215f269c" containerName="extract-utilities"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.104884 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="af9961df-42f4-476e-bea3-ab77215f269c" containerName="registry-server"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.106884 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xkjq9"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.115574 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xkjq9"]
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.154748 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/420d154d-4803-42af-b870-73876862577b-catalog-content\") pod \"redhat-operators-xkjq9\" (UID: \"420d154d-4803-42af-b870-73876862577b\") " pod="openshift-marketplace/redhat-operators-xkjq9"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.155182 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/420d154d-4803-42af-b870-73876862577b-utilities\") pod \"redhat-operators-xkjq9\" (UID: \"420d154d-4803-42af-b870-73876862577b\") " pod="openshift-marketplace/redhat-operators-xkjq9"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.155234 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsndx\" (UniqueName: \"kubernetes.io/projected/420d154d-4803-42af-b870-73876862577b-kube-api-access-gsndx\") pod \"redhat-operators-xkjq9\" (UID: \"420d154d-4803-42af-b870-73876862577b\") " pod="openshift-marketplace/redhat-operators-xkjq9"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.194322 4841 generic.go:334] "Generic (PLEG): container finished" podID="af9961df-42f4-476e-bea3-ab77215f269c" containerID="126d60a49aa568232dac2726e8c3edc334bda4780ceb9f299716f37d39f3fcbe" exitCode=0
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.194377 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxqd7" event={"ID":"af9961df-42f4-476e-bea3-ab77215f269c","Type":"ContainerDied","Data":"126d60a49aa568232dac2726e8c3edc334bda4780ceb9f299716f37d39f3fcbe"}
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.194397 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bxqd7"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.194412 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxqd7" event={"ID":"af9961df-42f4-476e-bea3-ab77215f269c","Type":"ContainerDied","Data":"a96ba126c9a9a48ed4efe8b31334e82a2e908b10ba4702400c89f07c5c49df1c"}
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.194431 4841 scope.go:117] "RemoveContainer" containerID="126d60a49aa568232dac2726e8c3edc334bda4780ceb9f299716f37d39f3fcbe"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.230929 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bxqd7"]
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.231096 4841 scope.go:117] "RemoveContainer" containerID="deb445e9a1522b8445ffe2a78658ed164d3a4935c68b5eb93e89eb2328c0bd52"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.244145 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bxqd7"]
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.257810 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/420d154d-4803-42af-b870-73876862577b-catalog-content\") pod \"redhat-operators-xkjq9\" (UID: \"420d154d-4803-42af-b870-73876862577b\") " pod="openshift-marketplace/redhat-operators-xkjq9"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.258035 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/420d154d-4803-42af-b870-73876862577b-utilities\") pod \"redhat-operators-xkjq9\" (UID: \"420d154d-4803-42af-b870-73876862577b\") " pod="openshift-marketplace/redhat-operators-xkjq9"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.258087 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsndx\" (UniqueName: \"kubernetes.io/projected/420d154d-4803-42af-b870-73876862577b-kube-api-access-gsndx\") pod \"redhat-operators-xkjq9\" (UID: \"420d154d-4803-42af-b870-73876862577b\") " pod="openshift-marketplace/redhat-operators-xkjq9"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.259277 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/420d154d-4803-42af-b870-73876862577b-catalog-content\") pod \"redhat-operators-xkjq9\" (UID: \"420d154d-4803-42af-b870-73876862577b\") " pod="openshift-marketplace/redhat-operators-xkjq9"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.259364 4841 scope.go:117] "RemoveContainer" containerID="c3be6d8e553fc4085bb986dd385da0875e7c438fc3367e0bc27768e229688795"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.259711 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/420d154d-4803-42af-b870-73876862577b-utilities\") pod \"redhat-operators-xkjq9\" (UID: \"420d154d-4803-42af-b870-73876862577b\") " pod="openshift-marketplace/redhat-operators-xkjq9"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.276203 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsndx\" (UniqueName: \"kubernetes.io/projected/420d154d-4803-42af-b870-73876862577b-kube-api-access-gsndx\") pod \"redhat-operators-xkjq9\" (UID: \"420d154d-4803-42af-b870-73876862577b\") " pod="openshift-marketplace/redhat-operators-xkjq9"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.298888 4841 scope.go:117] "RemoveContainer" containerID="126d60a49aa568232dac2726e8c3edc334bda4780ceb9f299716f37d39f3fcbe"
Mar 13 10:07:17 crc kubenswrapper[4841]: E0313 10:07:17.299391 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"126d60a49aa568232dac2726e8c3edc334bda4780ceb9f299716f37d39f3fcbe\": container with ID starting with 126d60a49aa568232dac2726e8c3edc334bda4780ceb9f299716f37d39f3fcbe not found: ID does not exist" containerID="126d60a49aa568232dac2726e8c3edc334bda4780ceb9f299716f37d39f3fcbe"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.299420 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"126d60a49aa568232dac2726e8c3edc334bda4780ceb9f299716f37d39f3fcbe"} err="failed to get container status \"126d60a49aa568232dac2726e8c3edc334bda4780ceb9f299716f37d39f3fcbe\": rpc error: code = NotFound desc = could not find container \"126d60a49aa568232dac2726e8c3edc334bda4780ceb9f299716f37d39f3fcbe\": container with ID starting with 126d60a49aa568232dac2726e8c3edc334bda4780ceb9f299716f37d39f3fcbe not found: ID does not exist"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.299445 4841 scope.go:117] "RemoveContainer" containerID="deb445e9a1522b8445ffe2a78658ed164d3a4935c68b5eb93e89eb2328c0bd52"
Mar 13 10:07:17 crc kubenswrapper[4841]: E0313 10:07:17.300022 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deb445e9a1522b8445ffe2a78658ed164d3a4935c68b5eb93e89eb2328c0bd52\": container with ID starting with deb445e9a1522b8445ffe2a78658ed164d3a4935c68b5eb93e89eb2328c0bd52 not found: ID does not exist" containerID="deb445e9a1522b8445ffe2a78658ed164d3a4935c68b5eb93e89eb2328c0bd52"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.300051 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deb445e9a1522b8445ffe2a78658ed164d3a4935c68b5eb93e89eb2328c0bd52"} err="failed to get container status \"deb445e9a1522b8445ffe2a78658ed164d3a4935c68b5eb93e89eb2328c0bd52\": rpc error: code = NotFound desc = could not find container \"deb445e9a1522b8445ffe2a78658ed164d3a4935c68b5eb93e89eb2328c0bd52\": container with ID starting with deb445e9a1522b8445ffe2a78658ed164d3a4935c68b5eb93e89eb2328c0bd52 not found: ID does not exist"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.300065 4841 scope.go:117] "RemoveContainer" containerID="c3be6d8e553fc4085bb986dd385da0875e7c438fc3367e0bc27768e229688795"
Mar 13 10:07:17 crc kubenswrapper[4841]: E0313 10:07:17.300421 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3be6d8e553fc4085bb986dd385da0875e7c438fc3367e0bc27768e229688795\": container with ID starting with c3be6d8e553fc4085bb986dd385da0875e7c438fc3367e0bc27768e229688795 not found: ID does not exist" containerID="c3be6d8e553fc4085bb986dd385da0875e7c438fc3367e0bc27768e229688795"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.300460 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3be6d8e553fc4085bb986dd385da0875e7c438fc3367e0bc27768e229688795"} err="failed to get container status \"c3be6d8e553fc4085bb986dd385da0875e7c438fc3367e0bc27768e229688795\": rpc error: code = NotFound desc = could not find container \"c3be6d8e553fc4085bb986dd385da0875e7c438fc3367e0bc27768e229688795\": container with ID starting with c3be6d8e553fc4085bb986dd385da0875e7c438fc3367e0bc27768e229688795 not found: ID does not exist"
Mar 13 10:07:17 crc kubenswrapper[4841]: I0313 10:07:17.482708 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xkjq9"
Mar 13 10:07:18 crc kubenswrapper[4841]: I0313 10:07:18.008093 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af9961df-42f4-476e-bea3-ab77215f269c" path="/var/lib/kubelet/pods/af9961df-42f4-476e-bea3-ab77215f269c/volumes"
Mar 13 10:07:18 crc kubenswrapper[4841]: I0313 10:07:18.009646 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xkjq9"]
Mar 13 10:07:18 crc kubenswrapper[4841]: I0313 10:07:18.222396 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkjq9" event={"ID":"420d154d-4803-42af-b870-73876862577b","Type":"ContainerStarted","Data":"49bf3b076f67c0103bd28bc9b7ca93e58377b2e1f87c4481dbb549e9045c69a8"}
Mar 13 10:07:19 crc kubenswrapper[4841]: I0313 10:07:19.239422 4841 generic.go:334] "Generic (PLEG): container finished" podID="420d154d-4803-42af-b870-73876862577b" containerID="d928db89c9c711f175fed25cc97aa9290582c13b10947a785ec3b0e975c4de90" exitCode=0
Mar 13 10:07:19 crc kubenswrapper[4841]: I0313 10:07:19.239538 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkjq9" event={"ID":"420d154d-4803-42af-b870-73876862577b","Type":"ContainerDied","Data":"d928db89c9c711f175fed25cc97aa9290582c13b10947a785ec3b0e975c4de90"}
Mar 13 10:07:20 crc kubenswrapper[4841]: I0313 10:07:20.250134 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkjq9" event={"ID":"420d154d-4803-42af-b870-73876862577b","Type":"ContainerStarted","Data":"bc0652b78bcaae433cd87f384fc8e695edd1fe059dcf6290c18eabca767a6804"}
Mar 13 10:07:25 crc kubenswrapper[4841]: I0313 10:07:25.307530 4841 generic.go:334] "Generic (PLEG): container finished" podID="420d154d-4803-42af-b870-73876862577b" containerID="bc0652b78bcaae433cd87f384fc8e695edd1fe059dcf6290c18eabca767a6804" exitCode=0
Mar 13 10:07:25 crc kubenswrapper[4841]: I0313 10:07:25.307597 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkjq9" event={"ID":"420d154d-4803-42af-b870-73876862577b","Type":"ContainerDied","Data":"bc0652b78bcaae433cd87f384fc8e695edd1fe059dcf6290c18eabca767a6804"}
Mar 13 10:07:26 crc kubenswrapper[4841]: I0313 10:07:26.319297 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkjq9" event={"ID":"420d154d-4803-42af-b870-73876862577b","Type":"ContainerStarted","Data":"dce53ad244b4d57a2a4267f3ff64d3bfe6da32dda58c531da363ab9d88d79faa"}
Mar 13 10:07:26 crc kubenswrapper[4841]: I0313 10:07:26.337296 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xkjq9" podStartSLOduration=2.859928212 podStartE2EDuration="9.337252208s" podCreationTimestamp="2026-03-13 10:07:17 +0000 UTC" firstStartedPulling="2026-03-13 10:07:19.242201891 +0000 UTC m=+3321.972102102" lastFinishedPulling="2026-03-13 10:07:25.719525897 +0000 UTC m=+3328.449426098" observedRunningTime="2026-03-13 10:07:26.334666016 +0000 UTC m=+3329.064566207" watchObservedRunningTime="2026-03-13 10:07:26.337252208 +0000 UTC m=+3329.067152399"
Mar 13 10:07:27 crc kubenswrapper[4841]: I0313 10:07:27.482871 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xkjq9"
Mar 13 10:07:27 crc kubenswrapper[4841]: I0313 10:07:27.483223 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xkjq9"
Mar 13 10:07:28 crc kubenswrapper[4841]: I0313 10:07:28.525590 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xkjq9" podUID="420d154d-4803-42af-b870-73876862577b" containerName="registry-server" probeResult="failure" output=<
Mar 13 10:07:28 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s
Mar 13 10:07:28 crc kubenswrapper[4841]: >
Mar 13 10:07:37 crc kubenswrapper[4841]: I0313 10:07:37.542824 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xkjq9"
Mar 13 10:07:37 crc kubenswrapper[4841]: I0313 10:07:37.598102 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xkjq9"
Mar 13 10:07:37 crc kubenswrapper[4841]: I0313 10:07:37.783772 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xkjq9"]
Mar 13 10:07:39 crc kubenswrapper[4841]: I0313 10:07:39.446365 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xkjq9" podUID="420d154d-4803-42af-b870-73876862577b" containerName="registry-server" containerID="cri-o://dce53ad244b4d57a2a4267f3ff64d3bfe6da32dda58c531da363ab9d88d79faa" gracePeriod=2
Mar 13 10:07:39 crc kubenswrapper[4841]: I0313 10:07:39.929646 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xkjq9"
Mar 13 10:07:39 crc kubenswrapper[4841]: I0313 10:07:39.964243 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsndx\" (UniqueName: \"kubernetes.io/projected/420d154d-4803-42af-b870-73876862577b-kube-api-access-gsndx\") pod \"420d154d-4803-42af-b870-73876862577b\" (UID: \"420d154d-4803-42af-b870-73876862577b\") "
Mar 13 10:07:39 crc kubenswrapper[4841]: I0313 10:07:39.964747 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/420d154d-4803-42af-b870-73876862577b-utilities\") pod \"420d154d-4803-42af-b870-73876862577b\" (UID: \"420d154d-4803-42af-b870-73876862577b\") "
Mar 13 10:07:39 crc kubenswrapper[4841]: I0313 10:07:39.965063 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/420d154d-4803-42af-b870-73876862577b-catalog-content\") pod \"420d154d-4803-42af-b870-73876862577b\" (UID: \"420d154d-4803-42af-b870-73876862577b\") "
Mar 13 10:07:39 crc kubenswrapper[4841]: I0313 10:07:39.965762 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/420d154d-4803-42af-b870-73876862577b-utilities" (OuterVolumeSpecName: "utilities") pod "420d154d-4803-42af-b870-73876862577b" (UID: "420d154d-4803-42af-b870-73876862577b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 10:07:39 crc kubenswrapper[4841]: I0313 10:07:39.970593 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/420d154d-4803-42af-b870-73876862577b-kube-api-access-gsndx" (OuterVolumeSpecName: "kube-api-access-gsndx") pod "420d154d-4803-42af-b870-73876862577b" (UID: "420d154d-4803-42af-b870-73876862577b"). InnerVolumeSpecName "kube-api-access-gsndx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:07:40 crc kubenswrapper[4841]: I0313 10:07:40.067541 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsndx\" (UniqueName: \"kubernetes.io/projected/420d154d-4803-42af-b870-73876862577b-kube-api-access-gsndx\") on node \"crc\" DevicePath \"\""
Mar 13 10:07:40 crc kubenswrapper[4841]: I0313 10:07:40.067575 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/420d154d-4803-42af-b870-73876862577b-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 10:07:40 crc kubenswrapper[4841]: I0313 10:07:40.107006 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/420d154d-4803-42af-b870-73876862577b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "420d154d-4803-42af-b870-73876862577b" (UID: "420d154d-4803-42af-b870-73876862577b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 10:07:40 crc kubenswrapper[4841]: I0313 10:07:40.169711 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/420d154d-4803-42af-b870-73876862577b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 10:07:40 crc kubenswrapper[4841]: I0313 10:07:40.459423 4841 generic.go:334] "Generic (PLEG): container finished" podID="420d154d-4803-42af-b870-73876862577b" containerID="dce53ad244b4d57a2a4267f3ff64d3bfe6da32dda58c531da363ab9d88d79faa" exitCode=0
Mar 13 10:07:40 crc kubenswrapper[4841]: I0313 10:07:40.459504 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkjq9" event={"ID":"420d154d-4803-42af-b870-73876862577b","Type":"ContainerDied","Data":"dce53ad244b4d57a2a4267f3ff64d3bfe6da32dda58c531da363ab9d88d79faa"}
Mar 13 10:07:40 crc kubenswrapper[4841]: I0313 10:07:40.459616 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkjq9" event={"ID":"420d154d-4803-42af-b870-73876862577b","Type":"ContainerDied","Data":"49bf3b076f67c0103bd28bc9b7ca93e58377b2e1f87c4481dbb549e9045c69a8"}
Mar 13 10:07:40 crc kubenswrapper[4841]: I0313 10:07:40.459652 4841 scope.go:117] "RemoveContainer" containerID="dce53ad244b4d57a2a4267f3ff64d3bfe6da32dda58c531da363ab9d88d79faa"
Mar 13 10:07:40 crc kubenswrapper[4841]: I0313 10:07:40.460421 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xkjq9"
Mar 13 10:07:40 crc kubenswrapper[4841]: I0313 10:07:40.490069 4841 scope.go:117] "RemoveContainer" containerID="bc0652b78bcaae433cd87f384fc8e695edd1fe059dcf6290c18eabca767a6804"
Mar 13 10:07:40 crc kubenswrapper[4841]: I0313 10:07:40.509335 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xkjq9"]
Mar 13 10:07:40 crc kubenswrapper[4841]: I0313 10:07:40.521924 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xkjq9"]
Mar 13 10:07:40 crc kubenswrapper[4841]: I0313 10:07:40.524495 4841 scope.go:117] "RemoveContainer" containerID="d928db89c9c711f175fed25cc97aa9290582c13b10947a785ec3b0e975c4de90"
Mar 13 10:07:40 crc kubenswrapper[4841]: I0313 10:07:40.561077 4841 scope.go:117] "RemoveContainer" containerID="dce53ad244b4d57a2a4267f3ff64d3bfe6da32dda58c531da363ab9d88d79faa"
Mar 13 10:07:40 crc kubenswrapper[4841]: E0313 10:07:40.561706 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dce53ad244b4d57a2a4267f3ff64d3bfe6da32dda58c531da363ab9d88d79faa\": container with ID starting with dce53ad244b4d57a2a4267f3ff64d3bfe6da32dda58c531da363ab9d88d79faa not found: ID does not exist" containerID="dce53ad244b4d57a2a4267f3ff64d3bfe6da32dda58c531da363ab9d88d79faa"
Mar 13 10:07:40 crc kubenswrapper[4841]: I0313 10:07:40.561756 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dce53ad244b4d57a2a4267f3ff64d3bfe6da32dda58c531da363ab9d88d79faa"} err="failed to get container status \"dce53ad244b4d57a2a4267f3ff64d3bfe6da32dda58c531da363ab9d88d79faa\": rpc error: code = NotFound desc = could not find container \"dce53ad244b4d57a2a4267f3ff64d3bfe6da32dda58c531da363ab9d88d79faa\": container with ID starting with dce53ad244b4d57a2a4267f3ff64d3bfe6da32dda58c531da363ab9d88d79faa not found: ID does not exist"
Mar 13 10:07:40 crc kubenswrapper[4841]: I0313 10:07:40.561787 4841 scope.go:117] "RemoveContainer" containerID="bc0652b78bcaae433cd87f384fc8e695edd1fe059dcf6290c18eabca767a6804"
Mar 13 10:07:40 crc kubenswrapper[4841]: E0313 10:07:40.562085 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc0652b78bcaae433cd87f384fc8e695edd1fe059dcf6290c18eabca767a6804\": container with ID starting with bc0652b78bcaae433cd87f384fc8e695edd1fe059dcf6290c18eabca767a6804 not found: ID does not exist" containerID="bc0652b78bcaae433cd87f384fc8e695edd1fe059dcf6290c18eabca767a6804"
Mar 13 10:07:40 crc kubenswrapper[4841]: I0313 10:07:40.562127 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc0652b78bcaae433cd87f384fc8e695edd1fe059dcf6290c18eabca767a6804"} err="failed to get container status \"bc0652b78bcaae433cd87f384fc8e695edd1fe059dcf6290c18eabca767a6804\": rpc error: code = NotFound desc = could not find container \"bc0652b78bcaae433cd87f384fc8e695edd1fe059dcf6290c18eabca767a6804\": container with ID starting with bc0652b78bcaae433cd87f384fc8e695edd1fe059dcf6290c18eabca767a6804 not found: ID does not exist"
Mar 13 10:07:40 crc kubenswrapper[4841]: I0313 10:07:40.562156 4841 scope.go:117] "RemoveContainer" containerID="d928db89c9c711f175fed25cc97aa9290582c13b10947a785ec3b0e975c4de90"
Mar 13 10:07:40 crc kubenswrapper[4841]: E0313 10:07:40.562644 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d928db89c9c711f175fed25cc97aa9290582c13b10947a785ec3b0e975c4de90\": container with ID starting with d928db89c9c711f175fed25cc97aa9290582c13b10947a785ec3b0e975c4de90 not found: ID does not exist" containerID="d928db89c9c711f175fed25cc97aa9290582c13b10947a785ec3b0e975c4de90"
Mar 13 10:07:40 crc kubenswrapper[4841]: I0313 10:07:40.562677 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d928db89c9c711f175fed25cc97aa9290582c13b10947a785ec3b0e975c4de90"} err="failed to get container status \"d928db89c9c711f175fed25cc97aa9290582c13b10947a785ec3b0e975c4de90\": rpc error: code = NotFound desc = could not find container \"d928db89c9c711f175fed25cc97aa9290582c13b10947a785ec3b0e975c4de90\": container with ID starting with d928db89c9c711f175fed25cc97aa9290582c13b10947a785ec3b0e975c4de90 not found: ID does not exist"
Mar 13 10:07:42 crc kubenswrapper[4841]: I0313 10:07:42.006010 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="420d154d-4803-42af-b870-73876862577b" path="/var/lib/kubelet/pods/420d154d-4803-42af-b870-73876862577b/volumes"
Mar 13 10:08:00 crc kubenswrapper[4841]: I0313 10:08:00.242578 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556608-rws5k"]
Mar 13 10:08:00 crc kubenswrapper[4841]: E0313 10:08:00.243701 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="420d154d-4803-42af-b870-73876862577b" containerName="registry-server"
Mar 13 10:08:00 crc kubenswrapper[4841]: I0313 10:08:00.243722 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="420d154d-4803-42af-b870-73876862577b" containerName="registry-server"
Mar 13 10:08:00 crc kubenswrapper[4841]: E0313 10:08:00.243734 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="420d154d-4803-42af-b870-73876862577b" containerName="extract-content"
Mar 13 10:08:00 crc kubenswrapper[4841]: I0313 10:08:00.243740 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="420d154d-4803-42af-b870-73876862577b" containerName="extract-content"
Mar 13 10:08:00 crc kubenswrapper[4841]: E0313 10:08:00.243750 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="420d154d-4803-42af-b870-73876862577b" containerName="extract-utilities"
Mar 13 10:08:00 crc kubenswrapper[4841]: I0313 10:08:00.243757 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="420d154d-4803-42af-b870-73876862577b" containerName="extract-utilities"
Mar 13 10:08:00 crc kubenswrapper[4841]: I0313 10:08:00.244123 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="420d154d-4803-42af-b870-73876862577b" containerName="registry-server"
Mar 13 10:08:00 crc kubenswrapper[4841]: I0313 10:08:00.244791 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556608-rws5k"
Mar 13 10:08:00 crc kubenswrapper[4841]: I0313 10:08:00.247754 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 10:08:00 crc kubenswrapper[4841]: I0313 10:08:00.247809 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7"
Mar 13 10:08:00 crc kubenswrapper[4841]: I0313 10:08:00.247877 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 10:08:00 crc kubenswrapper[4841]: I0313 10:08:00.255210 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556608-rws5k"]
Mar 13 10:08:00 crc kubenswrapper[4841]: I0313 10:08:00.386360 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff7rl\" (UniqueName: \"kubernetes.io/projected/7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c-kube-api-access-ff7rl\") pod \"auto-csr-approver-29556608-rws5k\" (UID: \"7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c\") " pod="openshift-infra/auto-csr-approver-29556608-rws5k"
Mar 13 10:08:00 crc kubenswrapper[4841]: I0313 10:08:00.488822 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff7rl\" (UniqueName: \"kubernetes.io/projected/7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c-kube-api-access-ff7rl\") pod \"auto-csr-approver-29556608-rws5k\" (UID: \"7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c\") " pod="openshift-infra/auto-csr-approver-29556608-rws5k"
Mar 13 10:08:00 crc kubenswrapper[4841]: I0313 10:08:00.511947 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff7rl\" (UniqueName: \"kubernetes.io/projected/7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c-kube-api-access-ff7rl\") pod \"auto-csr-approver-29556608-rws5k\" (UID: \"7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c\") " pod="openshift-infra/auto-csr-approver-29556608-rws5k"
Mar 13 10:08:00 crc kubenswrapper[4841]: I0313 10:08:00.567568 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556608-rws5k"
Mar 13 10:08:01 crc kubenswrapper[4841]: I0313 10:08:01.014995 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556608-rws5k"]
Mar 13 10:08:01 crc kubenswrapper[4841]: I0313 10:08:01.700115 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556608-rws5k" event={"ID":"7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c","Type":"ContainerStarted","Data":"006b56de3f8e0f98268f8a1443693247bc37c558dae8e738a5781b21e3d0b9f4"}
Mar 13 10:08:02 crc kubenswrapper[4841]: I0313 10:08:02.709252 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556608-rws5k" event={"ID":"7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c","Type":"ContainerStarted","Data":"d8a5d5c4f5531bd892325b3e9e4b7df0ac10af9f8c84faa6cd98c52191b75425"}
Mar 13 10:08:02 crc kubenswrapper[4841]: I0313 10:08:02.732566 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556608-rws5k" podStartSLOduration=1.507191568 podStartE2EDuration="2.732547112s" podCreationTimestamp="2026-03-13 10:08:00 +0000 UTC" firstStartedPulling="2026-03-13 10:08:01.024963208 +0000 UTC m=+3363.754863399" lastFinishedPulling="2026-03-13 10:08:02.250318752 +0000 UTC m=+3364.980218943" observedRunningTime="2026-03-13 10:08:02.731409927 +0000 UTC m=+3365.461310118" watchObservedRunningTime="2026-03-13 10:08:02.732547112 +0000 UTC m=+3365.462447303"
Mar 13 10:08:03 crc kubenswrapper[4841]: I0313 10:08:03.722853 4841 generic.go:334] "Generic (PLEG): container finished" podID="7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c" containerID="d8a5d5c4f5531bd892325b3e9e4b7df0ac10af9f8c84faa6cd98c52191b75425" exitCode=0
Mar 13 10:08:03 crc kubenswrapper[4841]: I0313 10:08:03.722942 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556608-rws5k" event={"ID":"7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c","Type":"ContainerDied","Data":"d8a5d5c4f5531bd892325b3e9e4b7df0ac10af9f8c84faa6cd98c52191b75425"}
Mar 13 10:08:05 crc kubenswrapper[4841]: I0313 10:08:05.113146 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556608-rws5k"
Mar 13 10:08:05 crc kubenswrapper[4841]: I0313 10:08:05.280385 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff7rl\" (UniqueName: \"kubernetes.io/projected/7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c-kube-api-access-ff7rl\") pod \"7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c\" (UID: \"7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c\") "
Mar 13 10:08:05 crc kubenswrapper[4841]: I0313 10:08:05.292585 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c-kube-api-access-ff7rl" (OuterVolumeSpecName: "kube-api-access-ff7rl") pod "7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c" (UID: "7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c"). InnerVolumeSpecName "kube-api-access-ff7rl".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:08:05 crc kubenswrapper[4841]: I0313 10:08:05.383573 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff7rl\" (UniqueName: \"kubernetes.io/projected/7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c-kube-api-access-ff7rl\") on node \"crc\" DevicePath \"\"" Mar 13 10:08:05 crc kubenswrapper[4841]: I0313 10:08:05.743373 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556608-rws5k" event={"ID":"7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c","Type":"ContainerDied","Data":"006b56de3f8e0f98268f8a1443693247bc37c558dae8e738a5781b21e3d0b9f4"} Mar 13 10:08:05 crc kubenswrapper[4841]: I0313 10:08:05.743421 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="006b56de3f8e0f98268f8a1443693247bc37c558dae8e738a5781b21e3d0b9f4" Mar 13 10:08:05 crc kubenswrapper[4841]: I0313 10:08:05.743437 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556608-rws5k" Mar 13 10:08:05 crc kubenswrapper[4841]: I0313 10:08:05.813562 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556602-n6zgh"] Mar 13 10:08:05 crc kubenswrapper[4841]: I0313 10:08:05.829566 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556602-n6zgh"] Mar 13 10:08:06 crc kubenswrapper[4841]: I0313 10:08:06.015818 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bdbe8c3-132b-4800-b0ce-f9ef46c6deee" path="/var/lib/kubelet/pods/7bdbe8c3-132b-4800-b0ce-f9ef46c6deee/volumes" Mar 13 10:08:34 crc kubenswrapper[4841]: I0313 10:08:34.524237 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-8485bdb9db-mf5lp_17824e5f-18b3-46c0-910a-56e5529e09c3/manager/0.log" Mar 13 10:08:35 crc kubenswrapper[4841]: I0313 10:08:35.740787 4841 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 13 10:08:35 crc kubenswrapper[4841]: I0313 10:08:35.741117 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="7d543631-bf8a-46c4-8808-ed0e3c563189" containerName="aodh-api" containerID="cri-o://36ad1ad9997573d72b2f9127d3165d7e2b06e4a6c4331c8d606d1b320fab6f13" gracePeriod=30 Mar 13 10:08:35 crc kubenswrapper[4841]: I0313 10:08:35.741250 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="7d543631-bf8a-46c4-8808-ed0e3c563189" containerName="aodh-listener" containerID="cri-o://962a89540a477c8177f10d4e2937b469979eb748031b076446c1ab4689c0b767" gracePeriod=30 Mar 13 10:08:35 crc kubenswrapper[4841]: I0313 10:08:35.741318 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="7d543631-bf8a-46c4-8808-ed0e3c563189" containerName="aodh-notifier" containerID="cri-o://3af4ec93db84e46d7004fd103f200655792e0f4840a06679d756f0a4c974cb1f" gracePeriod=30 Mar 13 10:08:35 crc kubenswrapper[4841]: I0313 10:08:35.741351 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="7d543631-bf8a-46c4-8808-ed0e3c563189" containerName="aodh-evaluator" containerID="cri-o://3b2338cfdfe4d9e5129e8ada951e6926e25db525ffc008da8d01e0d424403d06" gracePeriod=30 Mar 13 10:08:36 crc kubenswrapper[4841]: I0313 10:08:36.043990 4841 generic.go:334] "Generic (PLEG): container finished" podID="7d543631-bf8a-46c4-8808-ed0e3c563189" containerID="36ad1ad9997573d72b2f9127d3165d7e2b06e4a6c4331c8d606d1b320fab6f13" exitCode=0 Mar 13 10:08:36 crc kubenswrapper[4841]: I0313 10:08:36.044079 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7d543631-bf8a-46c4-8808-ed0e3c563189","Type":"ContainerDied","Data":"36ad1ad9997573d72b2f9127d3165d7e2b06e4a6c4331c8d606d1b320fab6f13"} Mar 13 10:08:37 crc 
kubenswrapper[4841]: I0313 10:08:37.058241 4841 generic.go:334] "Generic (PLEG): container finished" podID="7d543631-bf8a-46c4-8808-ed0e3c563189" containerID="3b2338cfdfe4d9e5129e8ada951e6926e25db525ffc008da8d01e0d424403d06" exitCode=0 Mar 13 10:08:37 crc kubenswrapper[4841]: I0313 10:08:37.058312 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7d543631-bf8a-46c4-8808-ed0e3c563189","Type":"ContainerDied","Data":"3b2338cfdfe4d9e5129e8ada951e6926e25db525ffc008da8d01e0d424403d06"} Mar 13 10:08:39 crc kubenswrapper[4841]: I0313 10:08:39.084391 4841 generic.go:334] "Generic (PLEG): container finished" podID="7d543631-bf8a-46c4-8808-ed0e3c563189" containerID="962a89540a477c8177f10d4e2937b469979eb748031b076446c1ab4689c0b767" exitCode=0 Mar 13 10:08:39 crc kubenswrapper[4841]: I0313 10:08:39.084443 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7d543631-bf8a-46c4-8808-ed0e3c563189","Type":"ContainerDied","Data":"962a89540a477c8177f10d4e2937b469979eb748031b076446c1ab4689c0b767"} Mar 13 10:08:40 crc kubenswrapper[4841]: I0313 10:08:40.042706 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-7ae0-account-create-update-qtssk"] Mar 13 10:08:40 crc kubenswrapper[4841]: I0313 10:08:40.053238 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-tkl8g"] Mar 13 10:08:40 crc kubenswrapper[4841]: I0313 10:08:40.065138 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-tkl8g"] Mar 13 10:08:40 crc kubenswrapper[4841]: I0313 10:08:40.076136 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-7ae0-account-create-update-qtssk"] Mar 13 10:08:42 crc kubenswrapper[4841]: I0313 10:08:42.005933 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f" path="/var/lib/kubelet/pods/5cc0e5cc-7ba7-4aa7-baa6-d2d9d6b4342f/volumes" Mar 13 10:08:42 crc 
kubenswrapper[4841]: I0313 10:08:42.007509 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd7e92b5-60a2-4b89-94c5-97003da1aefe" path="/var/lib/kubelet/pods/cd7e92b5-60a2-4b89-94c5-97003da1aefe/volumes" Mar 13 10:08:51 crc kubenswrapper[4841]: I0313 10:08:51.035283 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-fpfgj"] Mar 13 10:08:51 crc kubenswrapper[4841]: I0313 10:08:51.046540 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-fpfgj"] Mar 13 10:08:52 crc kubenswrapper[4841]: I0313 10:08:52.004244 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e882fdd9-3b5a-4835-9a63-239f15ce9ea1" path="/var/lib/kubelet/pods/e882fdd9-3b5a-4835-9a63-239f15ce9ea1/volumes" Mar 13 10:09:04 crc kubenswrapper[4841]: I0313 10:09:04.407250 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 10:09:04 crc kubenswrapper[4841]: I0313 10:09:04.407863 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 10:09:05 crc kubenswrapper[4841]: I0313 10:09:05.884444 4841 scope.go:117] "RemoveContainer" containerID="08e94ccf8739c2711f5a40db95804705fd6bd2b3b9c7f0516f80f7dc287bdaec" Mar 13 10:09:05 crc kubenswrapper[4841]: I0313 10:09:05.950751 4841 scope.go:117] "RemoveContainer" containerID="c2166581ac140f285937a0942049fbac994c9823d0c6587762c03e29320709a0" Mar 13 10:09:05 crc kubenswrapper[4841]: I0313 10:09:05.990861 4841 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/certified-operators-w8mtm"] Mar 13 10:09:05 crc kubenswrapper[4841]: E0313 10:09:05.991372 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c" containerName="oc" Mar 13 10:09:05 crc kubenswrapper[4841]: I0313 10:09:05.991389 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c" containerName="oc" Mar 13 10:09:05 crc kubenswrapper[4841]: I0313 10:09:05.991675 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c" containerName="oc" Mar 13 10:09:05 crc kubenswrapper[4841]: I0313 10:09:05.993310 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w8mtm" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.055947 4841 scope.go:117] "RemoveContainer" containerID="b1fb7408425da464e82cd6947cedcefc62f27744b8c3a9c8f1f751e492a82024" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.074366 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w8mtm"] Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.089969 4841 generic.go:334] "Generic (PLEG): container finished" podID="7d543631-bf8a-46c4-8808-ed0e3c563189" containerID="3af4ec93db84e46d7004fd103f200655792e0f4840a06679d756f0a4c974cb1f" exitCode=137 Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.090091 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7d543631-bf8a-46c4-8808-ed0e3c563189","Type":"ContainerDied","Data":"3af4ec93db84e46d7004fd103f200655792e0f4840a06679d756f0a4c974cb1f"} Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.137462 4841 scope.go:117] "RemoveContainer" containerID="ae8ba7502a6814dfa6ebe4b4581589e13e4de42a7a2f6eb987cbe547da6f614f" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.140636 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f25e9ca9-3a26-4c68-84d7-e91d07e5c478-utilities\") pod \"certified-operators-w8mtm\" (UID: \"f25e9ca9-3a26-4c68-84d7-e91d07e5c478\") " pod="openshift-marketplace/certified-operators-w8mtm" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.140684 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f25e9ca9-3a26-4c68-84d7-e91d07e5c478-catalog-content\") pod \"certified-operators-w8mtm\" (UID: \"f25e9ca9-3a26-4c68-84d7-e91d07e5c478\") " pod="openshift-marketplace/certified-operators-w8mtm" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.140801 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdbm4\" (UniqueName: \"kubernetes.io/projected/f25e9ca9-3a26-4c68-84d7-e91d07e5c478-kube-api-access-gdbm4\") pod \"certified-operators-w8mtm\" (UID: \"f25e9ca9-3a26-4c68-84d7-e91d07e5c478\") " pod="openshift-marketplace/certified-operators-w8mtm" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.243248 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdbm4\" (UniqueName: \"kubernetes.io/projected/f25e9ca9-3a26-4c68-84d7-e91d07e5c478-kube-api-access-gdbm4\") pod \"certified-operators-w8mtm\" (UID: \"f25e9ca9-3a26-4c68-84d7-e91d07e5c478\") " pod="openshift-marketplace/certified-operators-w8mtm" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.243601 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f25e9ca9-3a26-4c68-84d7-e91d07e5c478-utilities\") pod \"certified-operators-w8mtm\" (UID: \"f25e9ca9-3a26-4c68-84d7-e91d07e5c478\") " pod="openshift-marketplace/certified-operators-w8mtm" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 
10:09:06.243761 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f25e9ca9-3a26-4c68-84d7-e91d07e5c478-catalog-content\") pod \"certified-operators-w8mtm\" (UID: \"f25e9ca9-3a26-4c68-84d7-e91d07e5c478\") " pod="openshift-marketplace/certified-operators-w8mtm" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.244216 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f25e9ca9-3a26-4c68-84d7-e91d07e5c478-utilities\") pod \"certified-operators-w8mtm\" (UID: \"f25e9ca9-3a26-4c68-84d7-e91d07e5c478\") " pod="openshift-marketplace/certified-operators-w8mtm" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.244441 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f25e9ca9-3a26-4c68-84d7-e91d07e5c478-catalog-content\") pod \"certified-operators-w8mtm\" (UID: \"f25e9ca9-3a26-4c68-84d7-e91d07e5c478\") " pod="openshift-marketplace/certified-operators-w8mtm" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.249468 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.266767 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdbm4\" (UniqueName: \"kubernetes.io/projected/f25e9ca9-3a26-4c68-84d7-e91d07e5c478-kube-api-access-gdbm4\") pod \"certified-operators-w8mtm\" (UID: \"f25e9ca9-3a26-4c68-84d7-e91d07e5c478\") " pod="openshift-marketplace/certified-operators-w8mtm" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.424875 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w8mtm" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.446505 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-config-data\") pod \"7d543631-bf8a-46c4-8808-ed0e3c563189\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.446820 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-internal-tls-certs\") pod \"7d543631-bf8a-46c4-8808-ed0e3c563189\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.446986 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-scripts\") pod \"7d543631-bf8a-46c4-8808-ed0e3c563189\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.447077 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-public-tls-certs\") pod \"7d543631-bf8a-46c4-8808-ed0e3c563189\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.447339 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-combined-ca-bundle\") pod \"7d543631-bf8a-46c4-8808-ed0e3c563189\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.447445 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-f8cp4\" (UniqueName: \"kubernetes.io/projected/7d543631-bf8a-46c4-8808-ed0e3c563189-kube-api-access-f8cp4\") pod \"7d543631-bf8a-46c4-8808-ed0e3c563189\" (UID: \"7d543631-bf8a-46c4-8808-ed0e3c563189\") " Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.450410 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-scripts" (OuterVolumeSpecName: "scripts") pod "7d543631-bf8a-46c4-8808-ed0e3c563189" (UID: "7d543631-bf8a-46c4-8808-ed0e3c563189"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.458464 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d543631-bf8a-46c4-8808-ed0e3c563189-kube-api-access-f8cp4" (OuterVolumeSpecName: "kube-api-access-f8cp4") pod "7d543631-bf8a-46c4-8808-ed0e3c563189" (UID: "7d543631-bf8a-46c4-8808-ed0e3c563189"). InnerVolumeSpecName "kube-api-access-f8cp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.518414 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7d543631-bf8a-46c4-8808-ed0e3c563189" (UID: "7d543631-bf8a-46c4-8808-ed0e3c563189"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.524797 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7d543631-bf8a-46c4-8808-ed0e3c563189" (UID: "7d543631-bf8a-46c4-8808-ed0e3c563189"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.549841 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.549881 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.549894 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.549907 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8cp4\" (UniqueName: \"kubernetes.io/projected/7d543631-bf8a-46c4-8808-ed0e3c563189-kube-api-access-f8cp4\") on node \"crc\" DevicePath \"\"" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.613613 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-config-data" (OuterVolumeSpecName: "config-data") pod "7d543631-bf8a-46c4-8808-ed0e3c563189" (UID: "7d543631-bf8a-46c4-8808-ed0e3c563189"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.617980 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d543631-bf8a-46c4-8808-ed0e3c563189" (UID: "7d543631-bf8a-46c4-8808-ed0e3c563189"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.653772 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 10:09:06 crc kubenswrapper[4841]: I0313 10:09:06.653815 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d543631-bf8a-46c4-8808-ed0e3c563189-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:06.997875 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w8mtm"] Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.115442 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7d543631-bf8a-46c4-8808-ed0e3c563189","Type":"ContainerDied","Data":"da942202f4e46bd5870253f50606ee0870a07a92c0eb051f9de39e8043d42aeb"} Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.115840 4841 scope.go:117] "RemoveContainer" containerID="962a89540a477c8177f10d4e2937b469979eb748031b076446c1ab4689c0b767" Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.116026 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.122012 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w8mtm" event={"ID":"f25e9ca9-3a26-4c68-84d7-e91d07e5c478","Type":"ContainerStarted","Data":"9ded6dda0f1d7e76f94d2c35d22f059b4f9317ecb87fe95346f19448683b327c"} Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.168422 4841 scope.go:117] "RemoveContainer" containerID="3af4ec93db84e46d7004fd103f200655792e0f4840a06679d756f0a4c974cb1f" Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.177435 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.202916 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.216331 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 13 10:09:07 crc kubenswrapper[4841]: E0313 10:09:07.216863 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d543631-bf8a-46c4-8808-ed0e3c563189" containerName="aodh-api" Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.216887 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d543631-bf8a-46c4-8808-ed0e3c563189" containerName="aodh-api" Mar 13 10:09:07 crc kubenswrapper[4841]: E0313 10:09:07.216931 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d543631-bf8a-46c4-8808-ed0e3c563189" containerName="aodh-notifier" Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.216940 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d543631-bf8a-46c4-8808-ed0e3c563189" containerName="aodh-notifier" Mar 13 10:09:07 crc kubenswrapper[4841]: E0313 10:09:07.216959 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d543631-bf8a-46c4-8808-ed0e3c563189" containerName="aodh-listener" Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 
10:09:07.216969 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d543631-bf8a-46c4-8808-ed0e3c563189" containerName="aodh-listener" Mar 13 10:09:07 crc kubenswrapper[4841]: E0313 10:09:07.216980 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d543631-bf8a-46c4-8808-ed0e3c563189" containerName="aodh-evaluator" Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.216986 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d543631-bf8a-46c4-8808-ed0e3c563189" containerName="aodh-evaluator" Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.217188 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d543631-bf8a-46c4-8808-ed0e3c563189" containerName="aodh-api" Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.217224 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d543631-bf8a-46c4-8808-ed0e3c563189" containerName="aodh-notifier" Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.217234 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d543631-bf8a-46c4-8808-ed0e3c563189" containerName="aodh-listener" Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.217247 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d543631-bf8a-46c4-8808-ed0e3c563189" containerName="aodh-evaluator" Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.219217 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.222793 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.223147 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.223478 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.225699 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-7tx6j" Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.225911 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.228426 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.243657 4841 scope.go:117] "RemoveContainer" containerID="3b2338cfdfe4d9e5129e8ada951e6926e25db525ffc008da8d01e0d424403d06" Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.268408 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46839f95-04c1-47d3-b63c-c9e2d80b681a-internal-tls-certs\") pod \"aodh-0\" (UID: \"46839f95-04c1-47d3-b63c-c9e2d80b681a\") " pod="openstack/aodh-0" Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.268489 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46839f95-04c1-47d3-b63c-c9e2d80b681a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"46839f95-04c1-47d3-b63c-c9e2d80b681a\") " pod="openstack/aodh-0" Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 
10:09:07.268540 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46839f95-04c1-47d3-b63c-c9e2d80b681a-config-data\") pod \"aodh-0\" (UID: \"46839f95-04c1-47d3-b63c-c9e2d80b681a\") " pod="openstack/aodh-0"
Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.268627 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46839f95-04c1-47d3-b63c-c9e2d80b681a-public-tls-certs\") pod \"aodh-0\" (UID: \"46839f95-04c1-47d3-b63c-c9e2d80b681a\") " pod="openstack/aodh-0"
Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.268648 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6bmx\" (UniqueName: \"kubernetes.io/projected/46839f95-04c1-47d3-b63c-c9e2d80b681a-kube-api-access-q6bmx\") pod \"aodh-0\" (UID: \"46839f95-04c1-47d3-b63c-c9e2d80b681a\") " pod="openstack/aodh-0"
Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.268762 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46839f95-04c1-47d3-b63c-c9e2d80b681a-scripts\") pod \"aodh-0\" (UID: \"46839f95-04c1-47d3-b63c-c9e2d80b681a\") " pod="openstack/aodh-0"
Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.303710 4841 scope.go:117] "RemoveContainer" containerID="36ad1ad9997573d72b2f9127d3165d7e2b06e4a6c4331c8d606d1b320fab6f13"
Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.370816 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46839f95-04c1-47d3-b63c-c9e2d80b681a-config-data\") pod \"aodh-0\" (UID: \"46839f95-04c1-47d3-b63c-c9e2d80b681a\") " pod="openstack/aodh-0"
Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.370973 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46839f95-04c1-47d3-b63c-c9e2d80b681a-public-tls-certs\") pod \"aodh-0\" (UID: \"46839f95-04c1-47d3-b63c-c9e2d80b681a\") " pod="openstack/aodh-0"
Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.371011 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6bmx\" (UniqueName: \"kubernetes.io/projected/46839f95-04c1-47d3-b63c-c9e2d80b681a-kube-api-access-q6bmx\") pod \"aodh-0\" (UID: \"46839f95-04c1-47d3-b63c-c9e2d80b681a\") " pod="openstack/aodh-0"
Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.371059 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46839f95-04c1-47d3-b63c-c9e2d80b681a-scripts\") pod \"aodh-0\" (UID: \"46839f95-04c1-47d3-b63c-c9e2d80b681a\") " pod="openstack/aodh-0"
Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.371096 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46839f95-04c1-47d3-b63c-c9e2d80b681a-internal-tls-certs\") pod \"aodh-0\" (UID: \"46839f95-04c1-47d3-b63c-c9e2d80b681a\") " pod="openstack/aodh-0"
Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.371146 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46839f95-04c1-47d3-b63c-c9e2d80b681a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"46839f95-04c1-47d3-b63c-c9e2d80b681a\") " pod="openstack/aodh-0"
Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.376448 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46839f95-04c1-47d3-b63c-c9e2d80b681a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"46839f95-04c1-47d3-b63c-c9e2d80b681a\") " pod="openstack/aodh-0"
Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.376448 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46839f95-04c1-47d3-b63c-c9e2d80b681a-scripts\") pod \"aodh-0\" (UID: \"46839f95-04c1-47d3-b63c-c9e2d80b681a\") " pod="openstack/aodh-0"
Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.377172 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46839f95-04c1-47d3-b63c-c9e2d80b681a-config-data\") pod \"aodh-0\" (UID: \"46839f95-04c1-47d3-b63c-c9e2d80b681a\") " pod="openstack/aodh-0"
Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.377704 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46839f95-04c1-47d3-b63c-c9e2d80b681a-internal-tls-certs\") pod \"aodh-0\" (UID: \"46839f95-04c1-47d3-b63c-c9e2d80b681a\") " pod="openstack/aodh-0"
Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.380458 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46839f95-04c1-47d3-b63c-c9e2d80b681a-public-tls-certs\") pod \"aodh-0\" (UID: \"46839f95-04c1-47d3-b63c-c9e2d80b681a\") " pod="openstack/aodh-0"
Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.388802 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6bmx\" (UniqueName: \"kubernetes.io/projected/46839f95-04c1-47d3-b63c-c9e2d80b681a-kube-api-access-q6bmx\") pod \"aodh-0\" (UID: \"46839f95-04c1-47d3-b63c-c9e2d80b681a\") " pod="openstack/aodh-0"
Mar 13 10:09:07 crc kubenswrapper[4841]: I0313 10:09:07.577945 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 13 10:09:08 crc kubenswrapper[4841]: I0313 10:09:08.008681 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d543631-bf8a-46c4-8808-ed0e3c563189" path="/var/lib/kubelet/pods/7d543631-bf8a-46c4-8808-ed0e3c563189/volumes"
Mar 13 10:09:08 crc kubenswrapper[4841]: I0313 10:09:08.009812 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Mar 13 10:09:08 crc kubenswrapper[4841]: I0313 10:09:08.133062 4841 generic.go:334] "Generic (PLEG): container finished" podID="f25e9ca9-3a26-4c68-84d7-e91d07e5c478" containerID="a4e9c5d76c2ec699b063e0777953c089705e13b61f155bb90c487667a6018594" exitCode=0
Mar 13 10:09:08 crc kubenswrapper[4841]: I0313 10:09:08.133177 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w8mtm" event={"ID":"f25e9ca9-3a26-4c68-84d7-e91d07e5c478","Type":"ContainerDied","Data":"a4e9c5d76c2ec699b063e0777953c089705e13b61f155bb90c487667a6018594"}
Mar 13 10:09:08 crc kubenswrapper[4841]: I0313 10:09:08.134524 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"46839f95-04c1-47d3-b63c-c9e2d80b681a","Type":"ContainerStarted","Data":"2683816c61b5617956cb41df83d01405b6168c3f37ce6448096a9fb7f349089a"}
Mar 13 10:09:09 crc kubenswrapper[4841]: I0313 10:09:09.147435 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w8mtm" event={"ID":"f25e9ca9-3a26-4c68-84d7-e91d07e5c478","Type":"ContainerStarted","Data":"9307c4682fc50d0cad6710f7a53a4ac4673b9be149ce682d98bd58754921737f"}
Mar 13 10:09:09 crc kubenswrapper[4841]: I0313 10:09:09.149082 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"46839f95-04c1-47d3-b63c-c9e2d80b681a","Type":"ContainerStarted","Data":"1f4408b3a73e33b82f1f01e97e11cd8cd7cf59c9497970fb7d971762625b565b"}
Mar 13 10:09:10 crc kubenswrapper[4841]: I0313 10:09:10.165532 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"46839f95-04c1-47d3-b63c-c9e2d80b681a","Type":"ContainerStarted","Data":"704340c438c6480eb11edeb3ea4ae5509ba4258b2d7cc685c630703d4402ca75"}
Mar 13 10:09:11 crc kubenswrapper[4841]: I0313 10:09:11.179564 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"46839f95-04c1-47d3-b63c-c9e2d80b681a","Type":"ContainerStarted","Data":"ca1cb406f44540344327d3e6a9ec5d07229648b9349b5011d410396845a9761b"}
Mar 13 10:09:11 crc kubenswrapper[4841]: I0313 10:09:11.182460 4841 generic.go:334] "Generic (PLEG): container finished" podID="f25e9ca9-3a26-4c68-84d7-e91d07e5c478" containerID="9307c4682fc50d0cad6710f7a53a4ac4673b9be149ce682d98bd58754921737f" exitCode=0
Mar 13 10:09:11 crc kubenswrapper[4841]: I0313 10:09:11.182519 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w8mtm" event={"ID":"f25e9ca9-3a26-4c68-84d7-e91d07e5c478","Type":"ContainerDied","Data":"9307c4682fc50d0cad6710f7a53a4ac4673b9be149ce682d98bd58754921737f"}
Mar 13 10:09:12 crc kubenswrapper[4841]: I0313 10:09:12.193783 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"46839f95-04c1-47d3-b63c-c9e2d80b681a","Type":"ContainerStarted","Data":"aec1afcbefa3ad774720c179d17a839ef3e8444212868857f122c37a305d0233"}
Mar 13 10:09:12 crc kubenswrapper[4841]: I0313 10:09:12.218049 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.236477589 podStartE2EDuration="5.218027094s" podCreationTimestamp="2026-03-13 10:09:07 +0000 UTC" firstStartedPulling="2026-03-13 10:09:08.025992594 +0000 UTC m=+3430.755892805" lastFinishedPulling="2026-03-13 10:09:11.007542119 +0000 UTC m=+3433.737442310" observedRunningTime="2026-03-13 10:09:12.214432842 +0000 UTC m=+3434.944333023" watchObservedRunningTime="2026-03-13 10:09:12.218027094 +0000 UTC m=+3434.947927285"
Mar 13 10:09:13 crc kubenswrapper[4841]: I0313 10:09:13.219944 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w8mtm" event={"ID":"f25e9ca9-3a26-4c68-84d7-e91d07e5c478","Type":"ContainerStarted","Data":"a5547c6e850b05330a1b8a321e3f1ec59bf9091651eb28294ab02655b5202e03"}
Mar 13 10:09:13 crc kubenswrapper[4841]: I0313 10:09:13.242156 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w8mtm" podStartSLOduration=4.340788979 podStartE2EDuration="8.242138021s" podCreationTimestamp="2026-03-13 10:09:05 +0000 UTC" firstStartedPulling="2026-03-13 10:09:08.134605358 +0000 UTC m=+3430.864505549" lastFinishedPulling="2026-03-13 10:09:12.03595439 +0000 UTC m=+3434.765854591" observedRunningTime="2026-03-13 10:09:13.232945642 +0000 UTC m=+3435.962845833" watchObservedRunningTime="2026-03-13 10:09:13.242138021 +0000 UTC m=+3435.972038212"
Mar 13 10:09:16 crc kubenswrapper[4841]: I0313 10:09:16.426020 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w8mtm"
Mar 13 10:09:16 crc kubenswrapper[4841]: I0313 10:09:16.426424 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w8mtm"
Mar 13 10:09:16 crc kubenswrapper[4841]: I0313 10:09:16.483408 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w8mtm"
Mar 13 10:09:17 crc kubenswrapper[4841]: I0313 10:09:17.320875 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w8mtm"
Mar 13 10:09:18 crc kubenswrapper[4841]: I0313 10:09:18.177786 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w8mtm"]
Mar 13 10:09:19 crc kubenswrapper[4841]: I0313 10:09:19.270601 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w8mtm" podUID="f25e9ca9-3a26-4c68-84d7-e91d07e5c478" containerName="registry-server" containerID="cri-o://a5547c6e850b05330a1b8a321e3f1ec59bf9091651eb28294ab02655b5202e03" gracePeriod=2
Mar 13 10:09:19 crc kubenswrapper[4841]: I0313 10:09:19.773586 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w8mtm"
Mar 13 10:09:19 crc kubenswrapper[4841]: I0313 10:09:19.928277 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdbm4\" (UniqueName: \"kubernetes.io/projected/f25e9ca9-3a26-4c68-84d7-e91d07e5c478-kube-api-access-gdbm4\") pod \"f25e9ca9-3a26-4c68-84d7-e91d07e5c478\" (UID: \"f25e9ca9-3a26-4c68-84d7-e91d07e5c478\") "
Mar 13 10:09:19 crc kubenswrapper[4841]: I0313 10:09:19.928417 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f25e9ca9-3a26-4c68-84d7-e91d07e5c478-utilities\") pod \"f25e9ca9-3a26-4c68-84d7-e91d07e5c478\" (UID: \"f25e9ca9-3a26-4c68-84d7-e91d07e5c478\") "
Mar 13 10:09:19 crc kubenswrapper[4841]: I0313 10:09:19.928588 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f25e9ca9-3a26-4c68-84d7-e91d07e5c478-catalog-content\") pod \"f25e9ca9-3a26-4c68-84d7-e91d07e5c478\" (UID: \"f25e9ca9-3a26-4c68-84d7-e91d07e5c478\") "
Mar 13 10:09:19 crc kubenswrapper[4841]: I0313 10:09:19.929390 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f25e9ca9-3a26-4c68-84d7-e91d07e5c478-utilities" (OuterVolumeSpecName: "utilities") pod "f25e9ca9-3a26-4c68-84d7-e91d07e5c478" (UID: "f25e9ca9-3a26-4c68-84d7-e91d07e5c478"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 10:09:19 crc kubenswrapper[4841]: I0313 10:09:19.934535 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f25e9ca9-3a26-4c68-84d7-e91d07e5c478-kube-api-access-gdbm4" (OuterVolumeSpecName: "kube-api-access-gdbm4") pod "f25e9ca9-3a26-4c68-84d7-e91d07e5c478" (UID: "f25e9ca9-3a26-4c68-84d7-e91d07e5c478"). InnerVolumeSpecName "kube-api-access-gdbm4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:09:20 crc kubenswrapper[4841]: I0313 10:09:20.031179 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f25e9ca9-3a26-4c68-84d7-e91d07e5c478-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 10:09:20 crc kubenswrapper[4841]: I0313 10:09:20.031239 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdbm4\" (UniqueName: \"kubernetes.io/projected/f25e9ca9-3a26-4c68-84d7-e91d07e5c478-kube-api-access-gdbm4\") on node \"crc\" DevicePath \"\""
Mar 13 10:09:20 crc kubenswrapper[4841]: I0313 10:09:20.073598 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f25e9ca9-3a26-4c68-84d7-e91d07e5c478-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f25e9ca9-3a26-4c68-84d7-e91d07e5c478" (UID: "f25e9ca9-3a26-4c68-84d7-e91d07e5c478"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 10:09:20 crc kubenswrapper[4841]: I0313 10:09:20.132905 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f25e9ca9-3a26-4c68-84d7-e91d07e5c478-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 10:09:20 crc kubenswrapper[4841]: I0313 10:09:20.284664 4841 generic.go:334] "Generic (PLEG): container finished" podID="f25e9ca9-3a26-4c68-84d7-e91d07e5c478" containerID="a5547c6e850b05330a1b8a321e3f1ec59bf9091651eb28294ab02655b5202e03" exitCode=0
Mar 13 10:09:20 crc kubenswrapper[4841]: I0313 10:09:20.284768 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w8mtm" event={"ID":"f25e9ca9-3a26-4c68-84d7-e91d07e5c478","Type":"ContainerDied","Data":"a5547c6e850b05330a1b8a321e3f1ec59bf9091651eb28294ab02655b5202e03"}
Mar 13 10:09:20 crc kubenswrapper[4841]: I0313 10:09:20.284830 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w8mtm" event={"ID":"f25e9ca9-3a26-4c68-84d7-e91d07e5c478","Type":"ContainerDied","Data":"9ded6dda0f1d7e76f94d2c35d22f059b4f9317ecb87fe95346f19448683b327c"}
Mar 13 10:09:20 crc kubenswrapper[4841]: I0313 10:09:20.284833 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w8mtm"
Mar 13 10:09:20 crc kubenswrapper[4841]: I0313 10:09:20.284854 4841 scope.go:117] "RemoveContainer" containerID="a5547c6e850b05330a1b8a321e3f1ec59bf9091651eb28294ab02655b5202e03"
Mar 13 10:09:20 crc kubenswrapper[4841]: I0313 10:09:20.311945 4841 scope.go:117] "RemoveContainer" containerID="9307c4682fc50d0cad6710f7a53a4ac4673b9be149ce682d98bd58754921737f"
Mar 13 10:09:20 crc kubenswrapper[4841]: I0313 10:09:20.348256 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w8mtm"]
Mar 13 10:09:20 crc kubenswrapper[4841]: I0313 10:09:20.363749 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w8mtm"]
Mar 13 10:09:20 crc kubenswrapper[4841]: I0313 10:09:20.375859 4841 scope.go:117] "RemoveContainer" containerID="a4e9c5d76c2ec699b063e0777953c089705e13b61f155bb90c487667a6018594"
Mar 13 10:09:20 crc kubenswrapper[4841]: I0313 10:09:20.409324 4841 scope.go:117] "RemoveContainer" containerID="a5547c6e850b05330a1b8a321e3f1ec59bf9091651eb28294ab02655b5202e03"
Mar 13 10:09:20 crc kubenswrapper[4841]: E0313 10:09:20.409908 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5547c6e850b05330a1b8a321e3f1ec59bf9091651eb28294ab02655b5202e03\": container with ID starting with a5547c6e850b05330a1b8a321e3f1ec59bf9091651eb28294ab02655b5202e03 not found: ID does not exist" containerID="a5547c6e850b05330a1b8a321e3f1ec59bf9091651eb28294ab02655b5202e03"
Mar 13 10:09:20 crc kubenswrapper[4841]: I0313 10:09:20.409950 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5547c6e850b05330a1b8a321e3f1ec59bf9091651eb28294ab02655b5202e03"} err="failed to get container status \"a5547c6e850b05330a1b8a321e3f1ec59bf9091651eb28294ab02655b5202e03\": rpc error: code = NotFound desc = could not find container \"a5547c6e850b05330a1b8a321e3f1ec59bf9091651eb28294ab02655b5202e03\": container with ID starting with a5547c6e850b05330a1b8a321e3f1ec59bf9091651eb28294ab02655b5202e03 not found: ID does not exist"
Mar 13 10:09:20 crc kubenswrapper[4841]: I0313 10:09:20.409979 4841 scope.go:117] "RemoveContainer" containerID="9307c4682fc50d0cad6710f7a53a4ac4673b9be149ce682d98bd58754921737f"
Mar 13 10:09:20 crc kubenswrapper[4841]: E0313 10:09:20.410503 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9307c4682fc50d0cad6710f7a53a4ac4673b9be149ce682d98bd58754921737f\": container with ID starting with 9307c4682fc50d0cad6710f7a53a4ac4673b9be149ce682d98bd58754921737f not found: ID does not exist" containerID="9307c4682fc50d0cad6710f7a53a4ac4673b9be149ce682d98bd58754921737f"
Mar 13 10:09:20 crc kubenswrapper[4841]: I0313 10:09:20.410535 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9307c4682fc50d0cad6710f7a53a4ac4673b9be149ce682d98bd58754921737f"} err="failed to get container status \"9307c4682fc50d0cad6710f7a53a4ac4673b9be149ce682d98bd58754921737f\": rpc error: code = NotFound desc = could not find container \"9307c4682fc50d0cad6710f7a53a4ac4673b9be149ce682d98bd58754921737f\": container with ID starting with 9307c4682fc50d0cad6710f7a53a4ac4673b9be149ce682d98bd58754921737f not found: ID does not exist"
Mar 13 10:09:20 crc kubenswrapper[4841]: I0313 10:09:20.410560 4841 scope.go:117] "RemoveContainer" containerID="a4e9c5d76c2ec699b063e0777953c089705e13b61f155bb90c487667a6018594"
Mar 13 10:09:20 crc kubenswrapper[4841]: E0313 10:09:20.411008 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4e9c5d76c2ec699b063e0777953c089705e13b61f155bb90c487667a6018594\": container with ID starting with a4e9c5d76c2ec699b063e0777953c089705e13b61f155bb90c487667a6018594 not found: ID does not exist" containerID="a4e9c5d76c2ec699b063e0777953c089705e13b61f155bb90c487667a6018594"
Mar 13 10:09:20 crc kubenswrapper[4841]: I0313 10:09:20.411056 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4e9c5d76c2ec699b063e0777953c089705e13b61f155bb90c487667a6018594"} err="failed to get container status \"a4e9c5d76c2ec699b063e0777953c089705e13b61f155bb90c487667a6018594\": rpc error: code = NotFound desc = could not find container \"a4e9c5d76c2ec699b063e0777953c089705e13b61f155bb90c487667a6018594\": container with ID starting with a4e9c5d76c2ec699b063e0777953c089705e13b61f155bb90c487667a6018594 not found: ID does not exist"
Mar 13 10:09:22 crc kubenswrapper[4841]: I0313 10:09:22.012883 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f25e9ca9-3a26-4c68-84d7-e91d07e5c478" path="/var/lib/kubelet/pods/f25e9ca9-3a26-4c68-84d7-e91d07e5c478/volumes"
Mar 13 10:09:34 crc kubenswrapper[4841]: I0313 10:09:34.407079 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 10:09:34 crc kubenswrapper[4841]: I0313 10:09:34.407561 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 10:10:00 crc kubenswrapper[4841]: I0313 10:10:00.153207 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556610-94kqr"]
Mar 13 10:10:00 crc kubenswrapper[4841]: E0313 10:10:00.154360 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25e9ca9-3a26-4c68-84d7-e91d07e5c478" containerName="registry-server"
Mar 13 10:10:00 crc kubenswrapper[4841]: I0313 10:10:00.154378 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25e9ca9-3a26-4c68-84d7-e91d07e5c478" containerName="registry-server"
Mar 13 10:10:00 crc kubenswrapper[4841]: E0313 10:10:00.154417 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25e9ca9-3a26-4c68-84d7-e91d07e5c478" containerName="extract-utilities"
Mar 13 10:10:00 crc kubenswrapper[4841]: I0313 10:10:00.154425 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25e9ca9-3a26-4c68-84d7-e91d07e5c478" containerName="extract-utilities"
Mar 13 10:10:00 crc kubenswrapper[4841]: E0313 10:10:00.154443 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25e9ca9-3a26-4c68-84d7-e91d07e5c478" containerName="extract-content"
Mar 13 10:10:00 crc kubenswrapper[4841]: I0313 10:10:00.154451 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25e9ca9-3a26-4c68-84d7-e91d07e5c478" containerName="extract-content"
Mar 13 10:10:00 crc kubenswrapper[4841]: I0313 10:10:00.154648 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f25e9ca9-3a26-4c68-84d7-e91d07e5c478" containerName="registry-server"
Mar 13 10:10:00 crc kubenswrapper[4841]: I0313 10:10:00.155360 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556610-94kqr"
Mar 13 10:10:00 crc kubenswrapper[4841]: I0313 10:10:00.157524 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7"
Mar 13 10:10:00 crc kubenswrapper[4841]: I0313 10:10:00.157967 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 10:10:00 crc kubenswrapper[4841]: I0313 10:10:00.158459 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 10:10:00 crc kubenswrapper[4841]: I0313 10:10:00.166761 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556610-94kqr"]
Mar 13 10:10:00 crc kubenswrapper[4841]: I0313 10:10:00.286209 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csx6l\" (UniqueName: \"kubernetes.io/projected/238e9451-eeab-4566-bb9e-3abdaa91978c-kube-api-access-csx6l\") pod \"auto-csr-approver-29556610-94kqr\" (UID: \"238e9451-eeab-4566-bb9e-3abdaa91978c\") " pod="openshift-infra/auto-csr-approver-29556610-94kqr"
Mar 13 10:10:00 crc kubenswrapper[4841]: I0313 10:10:00.388710 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csx6l\" (UniqueName: \"kubernetes.io/projected/238e9451-eeab-4566-bb9e-3abdaa91978c-kube-api-access-csx6l\") pod \"auto-csr-approver-29556610-94kqr\" (UID: \"238e9451-eeab-4566-bb9e-3abdaa91978c\") " pod="openshift-infra/auto-csr-approver-29556610-94kqr"
Mar 13 10:10:00 crc kubenswrapper[4841]: I0313 10:10:00.410378 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csx6l\" (UniqueName: \"kubernetes.io/projected/238e9451-eeab-4566-bb9e-3abdaa91978c-kube-api-access-csx6l\") pod \"auto-csr-approver-29556610-94kqr\" (UID: \"238e9451-eeab-4566-bb9e-3abdaa91978c\") " pod="openshift-infra/auto-csr-approver-29556610-94kqr"
Mar 13 10:10:00 crc kubenswrapper[4841]: I0313 10:10:00.489725 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556610-94kqr"
Mar 13 10:10:00 crc kubenswrapper[4841]: I0313 10:10:00.996357 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556610-94kqr"]
Mar 13 10:10:01 crc kubenswrapper[4841]: I0313 10:10:01.678596 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556610-94kqr" event={"ID":"238e9451-eeab-4566-bb9e-3abdaa91978c","Type":"ContainerStarted","Data":"c0436b50a715970302c5fcfcd646e4d7dcade45f46e4222537b31a6e9654e7bf"}
Mar 13 10:10:02 crc kubenswrapper[4841]: I0313 10:10:02.687714 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556610-94kqr" event={"ID":"238e9451-eeab-4566-bb9e-3abdaa91978c","Type":"ContainerStarted","Data":"ce31be1973248023d9b9155a156c3fa9c96133a4fa3b41a43455b1fe4db14dd2"}
Mar 13 10:10:02 crc kubenswrapper[4841]: I0313 10:10:02.707550 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556610-94kqr" podStartSLOduration=1.401750094 podStartE2EDuration="2.707533986s" podCreationTimestamp="2026-03-13 10:10:00 +0000 UTC" firstStartedPulling="2026-03-13 10:10:00.998756935 +0000 UTC m=+3483.728657126" lastFinishedPulling="2026-03-13 10:10:02.304540827 +0000 UTC m=+3485.034441018" observedRunningTime="2026-03-13 10:10:02.698881804 +0000 UTC m=+3485.428781995" watchObservedRunningTime="2026-03-13 10:10:02.707533986 +0000 UTC m=+3485.437434167"
Mar 13 10:10:03 crc kubenswrapper[4841]: I0313 10:10:03.707527 4841 generic.go:334] "Generic (PLEG): container finished" podID="238e9451-eeab-4566-bb9e-3abdaa91978c" containerID="ce31be1973248023d9b9155a156c3fa9c96133a4fa3b41a43455b1fe4db14dd2" exitCode=0
Mar 13 10:10:03 crc kubenswrapper[4841]: I0313 10:10:03.707665 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556610-94kqr" event={"ID":"238e9451-eeab-4566-bb9e-3abdaa91978c","Type":"ContainerDied","Data":"ce31be1973248023d9b9155a156c3fa9c96133a4fa3b41a43455b1fe4db14dd2"}
Mar 13 10:10:04 crc kubenswrapper[4841]: I0313 10:10:04.407053 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 10:10:04 crc kubenswrapper[4841]: I0313 10:10:04.407137 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 10:10:04 crc kubenswrapper[4841]: I0313 10:10:04.407195 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h227v"
Mar 13 10:10:04 crc kubenswrapper[4841]: I0313 10:10:04.408199 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"345fb0919d4b00a63b8564b52ef5b612a0955f49c27fe5b9f30c8993e805394a"} pod="openshift-machine-config-operator/machine-config-daemon-h227v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 10:10:04 crc kubenswrapper[4841]: I0313 10:10:04.408297 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" containerID="cri-o://345fb0919d4b00a63b8564b52ef5b612a0955f49c27fe5b9f30c8993e805394a" gracePeriod=600
Mar 13 10:10:04 crc kubenswrapper[4841]: I0313 10:10:04.725105 4841 generic.go:334] "Generic (PLEG): container finished" podID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerID="345fb0919d4b00a63b8564b52ef5b612a0955f49c27fe5b9f30c8993e805394a" exitCode=0
Mar 13 10:10:04 crc kubenswrapper[4841]: I0313 10:10:04.725167 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerDied","Data":"345fb0919d4b00a63b8564b52ef5b612a0955f49c27fe5b9f30c8993e805394a"}
Mar 13 10:10:04 crc kubenswrapper[4841]: I0313 10:10:04.725440 4841 scope.go:117] "RemoveContainer" containerID="e83b4eb32bd758ccab19563b876e9a0e16fcf588296f870fd9d90bf3f88ec74b"
Mar 13 10:10:05 crc kubenswrapper[4841]: I0313 10:10:05.058285 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556610-94kqr"
Mar 13 10:10:05 crc kubenswrapper[4841]: I0313 10:10:05.191303 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csx6l\" (UniqueName: \"kubernetes.io/projected/238e9451-eeab-4566-bb9e-3abdaa91978c-kube-api-access-csx6l\") pod \"238e9451-eeab-4566-bb9e-3abdaa91978c\" (UID: \"238e9451-eeab-4566-bb9e-3abdaa91978c\") "
Mar 13 10:10:05 crc kubenswrapper[4841]: I0313 10:10:05.201568 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/238e9451-eeab-4566-bb9e-3abdaa91978c-kube-api-access-csx6l" (OuterVolumeSpecName: "kube-api-access-csx6l") pod "238e9451-eeab-4566-bb9e-3abdaa91978c" (UID: "238e9451-eeab-4566-bb9e-3abdaa91978c"). InnerVolumeSpecName "kube-api-access-csx6l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:10:05 crc kubenswrapper[4841]: I0313 10:10:05.294607 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csx6l\" (UniqueName: \"kubernetes.io/projected/238e9451-eeab-4566-bb9e-3abdaa91978c-kube-api-access-csx6l\") on node \"crc\" DevicePath \"\""
Mar 13 10:10:05 crc kubenswrapper[4841]: I0313 10:10:05.749813 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerStarted","Data":"a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912"}
Mar 13 10:10:05 crc kubenswrapper[4841]: I0313 10:10:05.753835 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556610-94kqr" event={"ID":"238e9451-eeab-4566-bb9e-3abdaa91978c","Type":"ContainerDied","Data":"c0436b50a715970302c5fcfcd646e4d7dcade45f46e4222537b31a6e9654e7bf"}
Mar 13 10:10:05 crc kubenswrapper[4841]: I0313 10:10:05.753891 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0436b50a715970302c5fcfcd646e4d7dcade45f46e4222537b31a6e9654e7bf"
Mar 13 10:10:05 crc kubenswrapper[4841]: I0313 10:10:05.753926 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556610-94kqr"
Mar 13 10:10:05 crc kubenswrapper[4841]: I0313 10:10:05.792670 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556604-8sknz"]
Mar 13 10:10:05 crc kubenswrapper[4841]: I0313 10:10:05.801198 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556604-8sknz"]
Mar 13 10:10:06 crc kubenswrapper[4841]: I0313 10:10:06.019937 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f84e5588-63df-41c9-98c7-00d3fc4db098" path="/var/lib/kubelet/pods/f84e5588-63df-41c9-98c7-00d3fc4db098/volumes"
Mar 13 10:10:06 crc kubenswrapper[4841]: I0313 10:10:06.297300 4841 scope.go:117] "RemoveContainer" containerID="83eca72557fd86e5bdc54519f8860e4988c13a432b4be4c841422700a869c39e"
Mar 13 10:10:36 crc kubenswrapper[4841]: I0313 10:10:36.148700 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-8485bdb9db-mf5lp_17824e5f-18b3-46c0-910a-56e5529e09c3/manager/0.log"
Mar 13 10:10:39 crc kubenswrapper[4841]: I0313 10:10:39.637705 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 13 10:10:39 crc kubenswrapper[4841]: I0313 10:10:39.638624 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4bb62be2-5fc7-4365-994e-cefee90fa78b" containerName="prometheus" containerID="cri-o://1b5b6627f7b5f9de27b6cec1d1ebcd390db0c381a412633050da7e589eb933eb" gracePeriod=600
Mar 13 10:10:39 crc kubenswrapper[4841]: I0313 10:10:39.638663 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4bb62be2-5fc7-4365-994e-cefee90fa78b" containerName="thanos-sidecar" containerID="cri-o://15879c33b73fb5f09b91bb21a796081b5983bb80aea32f63895794657d1c9c8b" gracePeriod=600
Mar 13 10:10:39 crc kubenswrapper[4841]: I0313 10:10:39.638669 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4bb62be2-5fc7-4365-994e-cefee90fa78b" containerName="config-reloader" containerID="cri-o://adbf6eed83415549c2a34785e20af0bb6a61957d22afa76e5b12e37d9cd388f1" gracePeriod=600
Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.112172 4841 generic.go:334] "Generic (PLEG): container finished" podID="4bb62be2-5fc7-4365-994e-cefee90fa78b" containerID="15879c33b73fb5f09b91bb21a796081b5983bb80aea32f63895794657d1c9c8b" exitCode=0
Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.112208 4841 generic.go:334] "Generic (PLEG): container finished" podID="4bb62be2-5fc7-4365-994e-cefee90fa78b" containerID="adbf6eed83415549c2a34785e20af0bb6a61957d22afa76e5b12e37d9cd388f1" exitCode=0
Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.112221 4841 generic.go:334] "Generic (PLEG): container finished" podID="4bb62be2-5fc7-4365-994e-cefee90fa78b" containerID="1b5b6627f7b5f9de27b6cec1d1ebcd390db0c381a412633050da7e589eb933eb" exitCode=0
Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.112243 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4bb62be2-5fc7-4365-994e-cefee90fa78b","Type":"ContainerDied","Data":"15879c33b73fb5f09b91bb21a796081b5983bb80aea32f63895794657d1c9c8b"}
Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.112300 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4bb62be2-5fc7-4365-994e-cefee90fa78b","Type":"ContainerDied","Data":"adbf6eed83415549c2a34785e20af0bb6a61957d22afa76e5b12e37d9cd388f1"}
Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.112315 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4bb62be2-5fc7-4365-994e-cefee90fa78b","Type":"ContainerDied","Data":"1b5b6627f7b5f9de27b6cec1d1ebcd390db0c381a412633050da7e589eb933eb"}
Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.582169 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.757387 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-thanos-prometheus-http-client-file\") pod \"4bb62be2-5fc7-4365-994e-cefee90fa78b\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") "
Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.757447 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4bb62be2-5fc7-4365-994e-cefee90fa78b-config-out\") pod \"4bb62be2-5fc7-4365-994e-cefee90fa78b\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") "
Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.757490 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-config\") pod \"4bb62be2-5fc7-4365-994e-cefee90fa78b\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") "
Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.757577 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-web-config\") pod \"4bb62be2-5fc7-4365-994e-cefee90fa78b\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") "
Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.757603 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"4bb62be2-5fc7-4365-994e-cefee90fa78b\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") "
Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.757633 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"4bb62be2-5fc7-4365-994e-cefee90fa78b\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") "
Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.757662 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-db\") pod \"4bb62be2-5fc7-4365-994e-cefee90fa78b\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") "
Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.757755 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-rulefiles-1\") pod \"4bb62be2-5fc7-4365-994e-cefee90fa78b\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") "
Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.757834 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-rulefiles-0\") pod \"4bb62be2-5fc7-4365-994e-cefee90fa78b\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") "
Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.758214 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/configmap/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "4bb62be2-5fc7-4365-994e-cefee90fa78b" (UID: "4bb62be2-5fc7-4365-994e-cefee90fa78b"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.758338 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-secret-combined-ca-bundle\") pod \"4bb62be2-5fc7-4365-994e-cefee90fa78b\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.758441 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "4bb62be2-5fc7-4365-994e-cefee90fa78b" (UID: "4bb62be2-5fc7-4365-994e-cefee90fa78b"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.758521 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-db" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "4bb62be2-5fc7-4365-994e-cefee90fa78b" (UID: "4bb62be2-5fc7-4365-994e-cefee90fa78b"). InnerVolumeSpecName "prometheus-metric-storage-db". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.758754 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4bb62be2-5fc7-4365-994e-cefee90fa78b-tls-assets\") pod \"4bb62be2-5fc7-4365-994e-cefee90fa78b\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.758822 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-rulefiles-2\") pod \"4bb62be2-5fc7-4365-994e-cefee90fa78b\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.758894 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd47x\" (UniqueName: \"kubernetes.io/projected/4bb62be2-5fc7-4365-994e-cefee90fa78b-kube-api-access-cd47x\") pod \"4bb62be2-5fc7-4365-994e-cefee90fa78b\" (UID: \"4bb62be2-5fc7-4365-994e-cefee90fa78b\") " Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.759488 4841 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-db\") on node \"crc\" DevicePath \"\"" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.759514 4841 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.759528 4841 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.761851 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "4bb62be2-5fc7-4365-994e-cefee90fa78b" (UID: "4bb62be2-5fc7-4365-994e-cefee90fa78b"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.765160 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "4bb62be2-5fc7-4365-994e-cefee90fa78b" (UID: "4bb62be2-5fc7-4365-994e-cefee90fa78b"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.765510 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-config" (OuterVolumeSpecName: "config") pod "4bb62be2-5fc7-4365-994e-cefee90fa78b" (UID: "4bb62be2-5fc7-4365-994e-cefee90fa78b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.766055 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "4bb62be2-5fc7-4365-994e-cefee90fa78b" (UID: "4bb62be2-5fc7-4365-994e-cefee90fa78b"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.766415 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bb62be2-5fc7-4365-994e-cefee90fa78b-config-out" (OuterVolumeSpecName: "config-out") pod "4bb62be2-5fc7-4365-994e-cefee90fa78b" (UID: "4bb62be2-5fc7-4365-994e-cefee90fa78b"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.767150 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb62be2-5fc7-4365-994e-cefee90fa78b-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "4bb62be2-5fc7-4365-994e-cefee90fa78b" (UID: "4bb62be2-5fc7-4365-994e-cefee90fa78b"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.769044 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "4bb62be2-5fc7-4365-994e-cefee90fa78b" (UID: "4bb62be2-5fc7-4365-994e-cefee90fa78b"). InnerVolumeSpecName "secret-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.773509 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "4bb62be2-5fc7-4365-994e-cefee90fa78b" (UID: "4bb62be2-5fc7-4365-994e-cefee90fa78b"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.777124 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb62be2-5fc7-4365-994e-cefee90fa78b-kube-api-access-cd47x" (OuterVolumeSpecName: "kube-api-access-cd47x") pod "4bb62be2-5fc7-4365-994e-cefee90fa78b" (UID: "4bb62be2-5fc7-4365-994e-cefee90fa78b"). InnerVolumeSpecName "kube-api-access-cd47x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.848844 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-web-config" (OuterVolumeSpecName: "web-config") pod "4bb62be2-5fc7-4365-994e-cefee90fa78b" (UID: "4bb62be2-5fc7-4365-994e-cefee90fa78b"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.863395 4841 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.863438 4841 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4bb62be2-5fc7-4365-994e-cefee90fa78b-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.863456 4841 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4bb62be2-5fc7-4365-994e-cefee90fa78b-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.863481 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd47x\" (UniqueName: \"kubernetes.io/projected/4bb62be2-5fc7-4365-994e-cefee90fa78b-kube-api-access-cd47x\") on node \"crc\" DevicePath \"\"" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.863496 4841 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.863507 4841 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4bb62be2-5fc7-4365-994e-cefee90fa78b-config-out\") on node \"crc\" DevicePath \"\"" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.863524 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-config\") on node \"crc\" 
DevicePath \"\"" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.863535 4841 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-web-config\") on node \"crc\" DevicePath \"\"" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.863548 4841 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Mar 13 10:10:40 crc kubenswrapper[4841]: I0313 10:10:40.863564 4841 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4bb62be2-5fc7-4365-994e-cefee90fa78b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.133709 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4bb62be2-5fc7-4365-994e-cefee90fa78b","Type":"ContainerDied","Data":"295214e7d8b983a391027ee1814c2242fed9730f3a6e4f358b3042d25bb39604"} Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.133762 4841 scope.go:117] "RemoveContainer" containerID="15879c33b73fb5f09b91bb21a796081b5983bb80aea32f63895794657d1c9c8b" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.133819 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.171477 4841 scope.go:117] "RemoveContainer" containerID="adbf6eed83415549c2a34785e20af0bb6a61957d22afa76e5b12e37d9cd388f1" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.183295 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.195073 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.197784 4841 scope.go:117] "RemoveContainer" containerID="1b5b6627f7b5f9de27b6cec1d1ebcd390db0c381a412633050da7e589eb933eb" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.217207 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 10:10:41 crc kubenswrapper[4841]: E0313 10:10:41.217611 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bb62be2-5fc7-4365-994e-cefee90fa78b" containerName="thanos-sidecar" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.217631 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb62be2-5fc7-4365-994e-cefee90fa78b" containerName="thanos-sidecar" Mar 13 10:10:41 crc kubenswrapper[4841]: E0313 10:10:41.217666 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bb62be2-5fc7-4365-994e-cefee90fa78b" containerName="config-reloader" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.217673 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb62be2-5fc7-4365-994e-cefee90fa78b" containerName="config-reloader" Mar 13 10:10:41 crc kubenswrapper[4841]: E0313 10:10:41.217686 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="238e9451-eeab-4566-bb9e-3abdaa91978c" containerName="oc" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.217692 4841 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="238e9451-eeab-4566-bb9e-3abdaa91978c" containerName="oc" Mar 13 10:10:41 crc kubenswrapper[4841]: E0313 10:10:41.217704 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bb62be2-5fc7-4365-994e-cefee90fa78b" containerName="prometheus" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.217709 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb62be2-5fc7-4365-994e-cefee90fa78b" containerName="prometheus" Mar 13 10:10:41 crc kubenswrapper[4841]: E0313 10:10:41.217727 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bb62be2-5fc7-4365-994e-cefee90fa78b" containerName="init-config-reloader" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.217733 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb62be2-5fc7-4365-994e-cefee90fa78b" containerName="init-config-reloader" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.217896 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bb62be2-5fc7-4365-994e-cefee90fa78b" containerName="thanos-sidecar" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.217912 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="238e9451-eeab-4566-bb9e-3abdaa91978c" containerName="oc" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.217929 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bb62be2-5fc7-4365-994e-cefee90fa78b" containerName="prometheus" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.217939 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bb62be2-5fc7-4365-994e-cefee90fa78b" containerName="config-reloader" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.219642 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.227457 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.227457 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.227615 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-wmghm" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.227634 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.228307 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.228384 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.228458 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.228526 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.232783 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.240515 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.240962 4841 scope.go:117] "RemoveContainer" 
containerID="b746948972008c493c8635c3770b3b5d652a4235ce4e2137314ccd746d6095c5" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.382817 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.382856 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.382887 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.382907 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57zs9\" (UniqueName: \"kubernetes.io/projected/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-kube-api-access-57zs9\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.382998 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-config\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.383047 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.383103 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.383145 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.383219 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " 
pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.383307 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.383343 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.383591 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.383747 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.485010 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: 
\"kubernetes.io/configmap/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.485696 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.486311 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.487109 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.487221 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.488045 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.488593 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.488812 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.488956 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.489025 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " 
pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.489108 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.489183 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57zs9\" (UniqueName: \"kubernetes.io/projected/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-kube-api-access-57zs9\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.489280 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-config\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.489416 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.489549 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod 
\"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.491419 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.492546 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.492972 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.494803 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.495187 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.496228 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-config\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.497590 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.498200 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.498332 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.502963 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.514999 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57zs9\" (UniqueName: \"kubernetes.io/projected/cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7-kube-api-access-57zs9\") pod \"prometheus-metric-storage-0\" (UID: \"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7\") " pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:41 crc kubenswrapper[4841]: I0313 10:10:41.543498 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 10:10:42 crc kubenswrapper[4841]: I0313 10:10:42.011438 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb62be2-5fc7-4365-994e-cefee90fa78b" path="/var/lib/kubelet/pods/4bb62be2-5fc7-4365-994e-cefee90fa78b/volumes" Mar 13 10:10:42 crc kubenswrapper[4841]: I0313 10:10:42.117482 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 10:10:42 crc kubenswrapper[4841]: I0313 10:10:42.148699 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7","Type":"ContainerStarted","Data":"7307a4be3fa19b78e1ee1be2a0350ec18c42261148ea790dc8c881e0ddaad684"} Mar 13 10:10:46 crc kubenswrapper[4841]: I0313 10:10:46.185523 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7","Type":"ContainerStarted","Data":"0c40a0aee04b8236fd307778b05376d096ed408e6660a7b29bd09cc874173d14"} Mar 13 10:10:53 crc kubenswrapper[4841]: I0313 10:10:53.262696 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7" containerID="0c40a0aee04b8236fd307778b05376d096ed408e6660a7b29bd09cc874173d14" exitCode=0 Mar 13 10:10:53 crc kubenswrapper[4841]: I0313 10:10:53.262821 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7","Type":"ContainerDied","Data":"0c40a0aee04b8236fd307778b05376d096ed408e6660a7b29bd09cc874173d14"} Mar 13 10:10:54 crc kubenswrapper[4841]: I0313 10:10:54.272177 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7","Type":"ContainerStarted","Data":"51ad855196460c35848428974f1e68dc8c3c23689f36e75a9bcaa1a67bddbdbf"} Mar 13 10:10:57 crc kubenswrapper[4841]: I0313 10:10:57.306161 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7","Type":"ContainerStarted","Data":"8dd36feb0c3c27e6dc73cbd39ea1b8d25a19e791622c31f8c9564030dbb6a4fb"} Mar 13 10:10:57 crc kubenswrapper[4841]: I0313 10:10:57.306731 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7","Type":"ContainerStarted","Data":"b4a729c1972db686aee8bd8e5f3451e4276a13f453b21c19d75f1d6573f9da6e"} Mar 13 10:10:57 crc kubenswrapper[4841]: I0313 10:10:57.355533 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=16.355494241 podStartE2EDuration="16.355494241s" podCreationTimestamp="2026-03-13 10:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:10:57.353593451 +0000 UTC m=+3540.083493652" watchObservedRunningTime="2026-03-13 10:10:57.355494241 +0000 UTC m=+3540.085394492" Mar 13 10:11:01 crc kubenswrapper[4841]: 
I0313 10:11:01.544223 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 13 10:11:11 crc kubenswrapper[4841]: I0313 10:11:11.543624 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 13 10:11:11 crc kubenswrapper[4841]: I0313 10:11:11.548468 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 13 10:11:12 crc kubenswrapper[4841]: I0313 10:11:12.494228 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 13 10:11:25 crc kubenswrapper[4841]: I0313 10:11:25.294641 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vm22p"] Mar 13 10:11:25 crc kubenswrapper[4841]: I0313 10:11:25.297121 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vm22p" Mar 13 10:11:25 crc kubenswrapper[4841]: I0313 10:11:25.327716 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vm22p"] Mar 13 10:11:25 crc kubenswrapper[4841]: I0313 10:11:25.461747 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccab1406-028c-4f6b-a217-2be39e3d5cd4-catalog-content\") pod \"redhat-marketplace-vm22p\" (UID: \"ccab1406-028c-4f6b-a217-2be39e3d5cd4\") " pod="openshift-marketplace/redhat-marketplace-vm22p" Mar 13 10:11:25 crc kubenswrapper[4841]: I0313 10:11:25.461859 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccab1406-028c-4f6b-a217-2be39e3d5cd4-utilities\") pod \"redhat-marketplace-vm22p\" (UID: \"ccab1406-028c-4f6b-a217-2be39e3d5cd4\") " 
pod="openshift-marketplace/redhat-marketplace-vm22p" Mar 13 10:11:25 crc kubenswrapper[4841]: I0313 10:11:25.461891 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvjhz\" (UniqueName: \"kubernetes.io/projected/ccab1406-028c-4f6b-a217-2be39e3d5cd4-kube-api-access-hvjhz\") pod \"redhat-marketplace-vm22p\" (UID: \"ccab1406-028c-4f6b-a217-2be39e3d5cd4\") " pod="openshift-marketplace/redhat-marketplace-vm22p" Mar 13 10:11:25 crc kubenswrapper[4841]: I0313 10:11:25.563533 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccab1406-028c-4f6b-a217-2be39e3d5cd4-catalog-content\") pod \"redhat-marketplace-vm22p\" (UID: \"ccab1406-028c-4f6b-a217-2be39e3d5cd4\") " pod="openshift-marketplace/redhat-marketplace-vm22p" Mar 13 10:11:25 crc kubenswrapper[4841]: I0313 10:11:25.563599 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccab1406-028c-4f6b-a217-2be39e3d5cd4-utilities\") pod \"redhat-marketplace-vm22p\" (UID: \"ccab1406-028c-4f6b-a217-2be39e3d5cd4\") " pod="openshift-marketplace/redhat-marketplace-vm22p" Mar 13 10:11:25 crc kubenswrapper[4841]: I0313 10:11:25.563622 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvjhz\" (UniqueName: \"kubernetes.io/projected/ccab1406-028c-4f6b-a217-2be39e3d5cd4-kube-api-access-hvjhz\") pod \"redhat-marketplace-vm22p\" (UID: \"ccab1406-028c-4f6b-a217-2be39e3d5cd4\") " pod="openshift-marketplace/redhat-marketplace-vm22p" Mar 13 10:11:25 crc kubenswrapper[4841]: I0313 10:11:25.563962 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccab1406-028c-4f6b-a217-2be39e3d5cd4-catalog-content\") pod \"redhat-marketplace-vm22p\" (UID: \"ccab1406-028c-4f6b-a217-2be39e3d5cd4\") " 
pod="openshift-marketplace/redhat-marketplace-vm22p" Mar 13 10:11:25 crc kubenswrapper[4841]: I0313 10:11:25.564190 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccab1406-028c-4f6b-a217-2be39e3d5cd4-utilities\") pod \"redhat-marketplace-vm22p\" (UID: \"ccab1406-028c-4f6b-a217-2be39e3d5cd4\") " pod="openshift-marketplace/redhat-marketplace-vm22p" Mar 13 10:11:25 crc kubenswrapper[4841]: I0313 10:11:25.588246 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvjhz\" (UniqueName: \"kubernetes.io/projected/ccab1406-028c-4f6b-a217-2be39e3d5cd4-kube-api-access-hvjhz\") pod \"redhat-marketplace-vm22p\" (UID: \"ccab1406-028c-4f6b-a217-2be39e3d5cd4\") " pod="openshift-marketplace/redhat-marketplace-vm22p" Mar 13 10:11:25 crc kubenswrapper[4841]: I0313 10:11:25.620793 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vm22p" Mar 13 10:11:26 crc kubenswrapper[4841]: I0313 10:11:26.110372 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vm22p"] Mar 13 10:11:26 crc kubenswrapper[4841]: W0313 10:11:26.136942 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccab1406_028c_4f6b_a217_2be39e3d5cd4.slice/crio-893f037b3a3e379c166e02ed10a1c56dd0c39ea1a2f3c87d2d75af7b00490f49 WatchSource:0}: Error finding container 893f037b3a3e379c166e02ed10a1c56dd0c39ea1a2f3c87d2d75af7b00490f49: Status 404 returned error can't find the container with id 893f037b3a3e379c166e02ed10a1c56dd0c39ea1a2f3c87d2d75af7b00490f49 Mar 13 10:11:26 crc kubenswrapper[4841]: I0313 10:11:26.628719 4841 generic.go:334] "Generic (PLEG): container finished" podID="ccab1406-028c-4f6b-a217-2be39e3d5cd4" containerID="079b5d16272082bc6e324daf52cfb55287976cd415501a15dc46ead068999460" exitCode=0 Mar 13 
10:11:26 crc kubenswrapper[4841]: I0313 10:11:26.628764 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vm22p" event={"ID":"ccab1406-028c-4f6b-a217-2be39e3d5cd4","Type":"ContainerDied","Data":"079b5d16272082bc6e324daf52cfb55287976cd415501a15dc46ead068999460"} Mar 13 10:11:26 crc kubenswrapper[4841]: I0313 10:11:26.629879 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vm22p" event={"ID":"ccab1406-028c-4f6b-a217-2be39e3d5cd4","Type":"ContainerStarted","Data":"893f037b3a3e379c166e02ed10a1c56dd0c39ea1a2f3c87d2d75af7b00490f49"} Mar 13 10:11:27 crc kubenswrapper[4841]: I0313 10:11:27.642912 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vm22p" event={"ID":"ccab1406-028c-4f6b-a217-2be39e3d5cd4","Type":"ContainerStarted","Data":"58e333fb1909d3b8e3da75fd4e5c7104b11eca798c3afd2a768c9de452d3224f"} Mar 13 10:11:28 crc kubenswrapper[4841]: I0313 10:11:28.653126 4841 generic.go:334] "Generic (PLEG): container finished" podID="ccab1406-028c-4f6b-a217-2be39e3d5cd4" containerID="58e333fb1909d3b8e3da75fd4e5c7104b11eca798c3afd2a768c9de452d3224f" exitCode=0 Mar 13 10:11:28 crc kubenswrapper[4841]: I0313 10:11:28.653168 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vm22p" event={"ID":"ccab1406-028c-4f6b-a217-2be39e3d5cd4","Type":"ContainerDied","Data":"58e333fb1909d3b8e3da75fd4e5c7104b11eca798c3afd2a768c9de452d3224f"} Mar 13 10:11:29 crc kubenswrapper[4841]: I0313 10:11:29.667000 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vm22p" event={"ID":"ccab1406-028c-4f6b-a217-2be39e3d5cd4","Type":"ContainerStarted","Data":"ecaf564b619291f4f3859d95ecf7a81df2432702b1e87481e707b17e13e57714"} Mar 13 10:11:29 crc kubenswrapper[4841]: I0313 10:11:29.701385 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-vm22p" podStartSLOduration=2.098815579 podStartE2EDuration="4.701366249s" podCreationTimestamp="2026-03-13 10:11:25 +0000 UTC" firstStartedPulling="2026-03-13 10:11:26.631070484 +0000 UTC m=+3569.360970675" lastFinishedPulling="2026-03-13 10:11:29.233621154 +0000 UTC m=+3571.963521345" observedRunningTime="2026-03-13 10:11:29.695709031 +0000 UTC m=+3572.425609242" watchObservedRunningTime="2026-03-13 10:11:29.701366249 +0000 UTC m=+3572.431266440" Mar 13 10:11:35 crc kubenswrapper[4841]: I0313 10:11:35.622091 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vm22p" Mar 13 10:11:35 crc kubenswrapper[4841]: I0313 10:11:35.622707 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vm22p" Mar 13 10:11:35 crc kubenswrapper[4841]: I0313 10:11:35.677070 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vm22p" Mar 13 10:11:35 crc kubenswrapper[4841]: I0313 10:11:35.788222 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vm22p" Mar 13 10:11:35 crc kubenswrapper[4841]: I0313 10:11:35.924976 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vm22p"] Mar 13 10:11:37 crc kubenswrapper[4841]: I0313 10:11:37.750860 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vm22p" podUID="ccab1406-028c-4f6b-a217-2be39e3d5cd4" containerName="registry-server" containerID="cri-o://ecaf564b619291f4f3859d95ecf7a81df2432702b1e87481e707b17e13e57714" gracePeriod=2 Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.727378 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vm22p" Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.761165 4841 generic.go:334] "Generic (PLEG): container finished" podID="ccab1406-028c-4f6b-a217-2be39e3d5cd4" containerID="ecaf564b619291f4f3859d95ecf7a81df2432702b1e87481e707b17e13e57714" exitCode=0 Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.761206 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vm22p" event={"ID":"ccab1406-028c-4f6b-a217-2be39e3d5cd4","Type":"ContainerDied","Data":"ecaf564b619291f4f3859d95ecf7a81df2432702b1e87481e707b17e13e57714"} Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.761230 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vm22p" event={"ID":"ccab1406-028c-4f6b-a217-2be39e3d5cd4","Type":"ContainerDied","Data":"893f037b3a3e379c166e02ed10a1c56dd0c39ea1a2f3c87d2d75af7b00490f49"} Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.761244 4841 scope.go:117] "RemoveContainer" containerID="ecaf564b619291f4f3859d95ecf7a81df2432702b1e87481e707b17e13e57714" Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.761365 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vm22p" Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.808833 4841 scope.go:117] "RemoveContainer" containerID="58e333fb1909d3b8e3da75fd4e5c7104b11eca798c3afd2a768c9de452d3224f" Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.830629 4841 scope.go:117] "RemoveContainer" containerID="079b5d16272082bc6e324daf52cfb55287976cd415501a15dc46ead068999460" Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.850673 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccab1406-028c-4f6b-a217-2be39e3d5cd4-utilities\") pod \"ccab1406-028c-4f6b-a217-2be39e3d5cd4\" (UID: \"ccab1406-028c-4f6b-a217-2be39e3d5cd4\") " Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.850828 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvjhz\" (UniqueName: \"kubernetes.io/projected/ccab1406-028c-4f6b-a217-2be39e3d5cd4-kube-api-access-hvjhz\") pod \"ccab1406-028c-4f6b-a217-2be39e3d5cd4\" (UID: \"ccab1406-028c-4f6b-a217-2be39e3d5cd4\") " Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.850878 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccab1406-028c-4f6b-a217-2be39e3d5cd4-catalog-content\") pod \"ccab1406-028c-4f6b-a217-2be39e3d5cd4\" (UID: \"ccab1406-028c-4f6b-a217-2be39e3d5cd4\") " Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.851725 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccab1406-028c-4f6b-a217-2be39e3d5cd4-utilities" (OuterVolumeSpecName: "utilities") pod "ccab1406-028c-4f6b-a217-2be39e3d5cd4" (UID: "ccab1406-028c-4f6b-a217-2be39e3d5cd4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.859763 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccab1406-028c-4f6b-a217-2be39e3d5cd4-kube-api-access-hvjhz" (OuterVolumeSpecName: "kube-api-access-hvjhz") pod "ccab1406-028c-4f6b-a217-2be39e3d5cd4" (UID: "ccab1406-028c-4f6b-a217-2be39e3d5cd4"). InnerVolumeSpecName "kube-api-access-hvjhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.882664 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccab1406-028c-4f6b-a217-2be39e3d5cd4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ccab1406-028c-4f6b-a217-2be39e3d5cd4" (UID: "ccab1406-028c-4f6b-a217-2be39e3d5cd4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.922536 4841 scope.go:117] "RemoveContainer" containerID="ecaf564b619291f4f3859d95ecf7a81df2432702b1e87481e707b17e13e57714" Mar 13 10:11:38 crc kubenswrapper[4841]: E0313 10:11:38.923139 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecaf564b619291f4f3859d95ecf7a81df2432702b1e87481e707b17e13e57714\": container with ID starting with ecaf564b619291f4f3859d95ecf7a81df2432702b1e87481e707b17e13e57714 not found: ID does not exist" containerID="ecaf564b619291f4f3859d95ecf7a81df2432702b1e87481e707b17e13e57714" Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.923183 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecaf564b619291f4f3859d95ecf7a81df2432702b1e87481e707b17e13e57714"} err="failed to get container status \"ecaf564b619291f4f3859d95ecf7a81df2432702b1e87481e707b17e13e57714\": rpc error: code = NotFound desc = could not find 
container \"ecaf564b619291f4f3859d95ecf7a81df2432702b1e87481e707b17e13e57714\": container with ID starting with ecaf564b619291f4f3859d95ecf7a81df2432702b1e87481e707b17e13e57714 not found: ID does not exist" Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.923209 4841 scope.go:117] "RemoveContainer" containerID="58e333fb1909d3b8e3da75fd4e5c7104b11eca798c3afd2a768c9de452d3224f" Mar 13 10:11:38 crc kubenswrapper[4841]: E0313 10:11:38.923675 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58e333fb1909d3b8e3da75fd4e5c7104b11eca798c3afd2a768c9de452d3224f\": container with ID starting with 58e333fb1909d3b8e3da75fd4e5c7104b11eca798c3afd2a768c9de452d3224f not found: ID does not exist" containerID="58e333fb1909d3b8e3da75fd4e5c7104b11eca798c3afd2a768c9de452d3224f" Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.923727 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58e333fb1909d3b8e3da75fd4e5c7104b11eca798c3afd2a768c9de452d3224f"} err="failed to get container status \"58e333fb1909d3b8e3da75fd4e5c7104b11eca798c3afd2a768c9de452d3224f\": rpc error: code = NotFound desc = could not find container \"58e333fb1909d3b8e3da75fd4e5c7104b11eca798c3afd2a768c9de452d3224f\": container with ID starting with 58e333fb1909d3b8e3da75fd4e5c7104b11eca798c3afd2a768c9de452d3224f not found: ID does not exist" Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.923770 4841 scope.go:117] "RemoveContainer" containerID="079b5d16272082bc6e324daf52cfb55287976cd415501a15dc46ead068999460" Mar 13 10:11:38 crc kubenswrapper[4841]: E0313 10:11:38.924241 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"079b5d16272082bc6e324daf52cfb55287976cd415501a15dc46ead068999460\": container with ID starting with 079b5d16272082bc6e324daf52cfb55287976cd415501a15dc46ead068999460 not found: ID does 
not exist" containerID="079b5d16272082bc6e324daf52cfb55287976cd415501a15dc46ead068999460" Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.924290 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"079b5d16272082bc6e324daf52cfb55287976cd415501a15dc46ead068999460"} err="failed to get container status \"079b5d16272082bc6e324daf52cfb55287976cd415501a15dc46ead068999460\": rpc error: code = NotFound desc = could not find container \"079b5d16272082bc6e324daf52cfb55287976cd415501a15dc46ead068999460\": container with ID starting with 079b5d16272082bc6e324daf52cfb55287976cd415501a15dc46ead068999460 not found: ID does not exist" Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.953801 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccab1406-028c-4f6b-a217-2be39e3d5cd4-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.953842 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvjhz\" (UniqueName: \"kubernetes.io/projected/ccab1406-028c-4f6b-a217-2be39e3d5cd4-kube-api-access-hvjhz\") on node \"crc\" DevicePath \"\"" Mar 13 10:11:38 crc kubenswrapper[4841]: I0313 10:11:38.953857 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccab1406-028c-4f6b-a217-2be39e3d5cd4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 10:11:39 crc kubenswrapper[4841]: I0313 10:11:39.103364 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vm22p"] Mar 13 10:11:39 crc kubenswrapper[4841]: I0313 10:11:39.127410 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vm22p"] Mar 13 10:11:40 crc kubenswrapper[4841]: I0313 10:11:40.006813 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ccab1406-028c-4f6b-a217-2be39e3d5cd4" path="/var/lib/kubelet/pods/ccab1406-028c-4f6b-a217-2be39e3d5cd4/volumes" Mar 13 10:12:00 crc kubenswrapper[4841]: I0313 10:12:00.179209 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556612-t87rd"] Mar 13 10:12:00 crc kubenswrapper[4841]: E0313 10:12:00.180503 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccab1406-028c-4f6b-a217-2be39e3d5cd4" containerName="extract-utilities" Mar 13 10:12:00 crc kubenswrapper[4841]: I0313 10:12:00.180524 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccab1406-028c-4f6b-a217-2be39e3d5cd4" containerName="extract-utilities" Mar 13 10:12:00 crc kubenswrapper[4841]: E0313 10:12:00.180549 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccab1406-028c-4f6b-a217-2be39e3d5cd4" containerName="registry-server" Mar 13 10:12:00 crc kubenswrapper[4841]: I0313 10:12:00.180557 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccab1406-028c-4f6b-a217-2be39e3d5cd4" containerName="registry-server" Mar 13 10:12:00 crc kubenswrapper[4841]: E0313 10:12:00.180598 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccab1406-028c-4f6b-a217-2be39e3d5cd4" containerName="extract-content" Mar 13 10:12:00 crc kubenswrapper[4841]: I0313 10:12:00.180606 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccab1406-028c-4f6b-a217-2be39e3d5cd4" containerName="extract-content" Mar 13 10:12:00 crc kubenswrapper[4841]: I0313 10:12:00.180834 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccab1406-028c-4f6b-a217-2be39e3d5cd4" containerName="registry-server" Mar 13 10:12:00 crc kubenswrapper[4841]: I0313 10:12:00.181724 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556612-t87rd" Mar 13 10:12:00 crc kubenswrapper[4841]: I0313 10:12:00.181755 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556612-t87rd"] Mar 13 10:12:00 crc kubenswrapper[4841]: I0313 10:12:00.184210 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 10:12:00 crc kubenswrapper[4841]: I0313 10:12:00.184333 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 10:12:00 crc kubenswrapper[4841]: I0313 10:12:00.184540 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 10:12:00 crc kubenswrapper[4841]: I0313 10:12:00.218339 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct7vt\" (UniqueName: \"kubernetes.io/projected/c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47-kube-api-access-ct7vt\") pod \"auto-csr-approver-29556612-t87rd\" (UID: \"c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47\") " pod="openshift-infra/auto-csr-approver-29556612-t87rd" Mar 13 10:12:00 crc kubenswrapper[4841]: I0313 10:12:00.319307 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct7vt\" (UniqueName: \"kubernetes.io/projected/c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47-kube-api-access-ct7vt\") pod \"auto-csr-approver-29556612-t87rd\" (UID: \"c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47\") " pod="openshift-infra/auto-csr-approver-29556612-t87rd" Mar 13 10:12:00 crc kubenswrapper[4841]: I0313 10:12:00.339242 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct7vt\" (UniqueName: \"kubernetes.io/projected/c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47-kube-api-access-ct7vt\") pod \"auto-csr-approver-29556612-t87rd\" (UID: \"c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47\") " 
pod="openshift-infra/auto-csr-approver-29556612-t87rd" Mar 13 10:12:00 crc kubenswrapper[4841]: I0313 10:12:00.501937 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556612-t87rd" Mar 13 10:12:01 crc kubenswrapper[4841]: W0313 10:12:00.996348 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9c7f87e_f9dd_47d4_ab5a_f6cb551b8a47.slice/crio-3ffda2bd8c8a4a8f8d44cd6c2e4e789c97211d4adfeb1e1fcb2a23f206dc697f WatchSource:0}: Error finding container 3ffda2bd8c8a4a8f8d44cd6c2e4e789c97211d4adfeb1e1fcb2a23f206dc697f: Status 404 returned error can't find the container with id 3ffda2bd8c8a4a8f8d44cd6c2e4e789c97211d4adfeb1e1fcb2a23f206dc697f Mar 13 10:12:01 crc kubenswrapper[4841]: I0313 10:12:01.009087 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556612-t87rd"] Mar 13 10:12:02 crc kubenswrapper[4841]: I0313 10:12:02.167057 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556612-t87rd" event={"ID":"c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47","Type":"ContainerStarted","Data":"3ffda2bd8c8a4a8f8d44cd6c2e4e789c97211d4adfeb1e1fcb2a23f206dc697f"} Mar 13 10:12:03 crc kubenswrapper[4841]: I0313 10:12:03.175848 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556612-t87rd" event={"ID":"c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47","Type":"ContainerStarted","Data":"1be61b2c7846f52068adff0f29788801ce91efa26ab211270145532850a99392"} Mar 13 10:12:03 crc kubenswrapper[4841]: I0313 10:12:03.192816 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556612-t87rd" podStartSLOduration=1.69741431 podStartE2EDuration="3.192797223s" podCreationTimestamp="2026-03-13 10:12:00 +0000 UTC" firstStartedPulling="2026-03-13 10:12:01.003086722 +0000 UTC 
m=+3603.732986933" lastFinishedPulling="2026-03-13 10:12:02.498469655 +0000 UTC m=+3605.228369846" observedRunningTime="2026-03-13 10:12:03.188426695 +0000 UTC m=+3605.918326886" watchObservedRunningTime="2026-03-13 10:12:03.192797223 +0000 UTC m=+3605.922697414" Mar 13 10:12:04 crc kubenswrapper[4841]: I0313 10:12:04.187311 4841 generic.go:334] "Generic (PLEG): container finished" podID="c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47" containerID="1be61b2c7846f52068adff0f29788801ce91efa26ab211270145532850a99392" exitCode=0 Mar 13 10:12:04 crc kubenswrapper[4841]: I0313 10:12:04.187380 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556612-t87rd" event={"ID":"c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47","Type":"ContainerDied","Data":"1be61b2c7846f52068adff0f29788801ce91efa26ab211270145532850a99392"} Mar 13 10:12:04 crc kubenswrapper[4841]: I0313 10:12:04.407648 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 10:12:04 crc kubenswrapper[4841]: I0313 10:12:04.407732 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 10:12:05 crc kubenswrapper[4841]: I0313 10:12:05.570889 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556612-t87rd" Mar 13 10:12:05 crc kubenswrapper[4841]: I0313 10:12:05.721853 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct7vt\" (UniqueName: \"kubernetes.io/projected/c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47-kube-api-access-ct7vt\") pod \"c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47\" (UID: \"c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47\") " Mar 13 10:12:05 crc kubenswrapper[4841]: I0313 10:12:05.730622 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47-kube-api-access-ct7vt" (OuterVolumeSpecName: "kube-api-access-ct7vt") pod "c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47" (UID: "c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47"). InnerVolumeSpecName "kube-api-access-ct7vt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:12:05 crc kubenswrapper[4841]: I0313 10:12:05.823889 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct7vt\" (UniqueName: \"kubernetes.io/projected/c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47-kube-api-access-ct7vt\") on node \"crc\" DevicePath \"\"" Mar 13 10:12:06 crc kubenswrapper[4841]: I0313 10:12:06.218703 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556612-t87rd" event={"ID":"c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47","Type":"ContainerDied","Data":"3ffda2bd8c8a4a8f8d44cd6c2e4e789c97211d4adfeb1e1fcb2a23f206dc697f"} Mar 13 10:12:06 crc kubenswrapper[4841]: I0313 10:12:06.219116 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ffda2bd8c8a4a8f8d44cd6c2e4e789c97211d4adfeb1e1fcb2a23f206dc697f" Mar 13 10:12:06 crc kubenswrapper[4841]: I0313 10:12:06.218815 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556612-t87rd" Mar 13 10:12:06 crc kubenswrapper[4841]: I0313 10:12:06.270177 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556606-q6bf4"] Mar 13 10:12:06 crc kubenswrapper[4841]: I0313 10:12:06.283720 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556606-q6bf4"] Mar 13 10:12:08 crc kubenswrapper[4841]: I0313 10:12:08.008385 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="310c034a-4453-4712-b8c6-cd6ab0ff5ab7" path="/var/lib/kubelet/pods/310c034a-4453-4712-b8c6-cd6ab0ff5ab7/volumes" Mar 13 10:12:34 crc kubenswrapper[4841]: I0313 10:12:34.407175 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 10:12:34 crc kubenswrapper[4841]: I0313 10:12:34.407829 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 10:12:39 crc kubenswrapper[4841]: I0313 10:12:39.511273 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-8485bdb9db-mf5lp_17824e5f-18b3-46c0-910a-56e5529e09c3/manager/0.log" Mar 13 10:12:57 crc kubenswrapper[4841]: I0313 10:12:57.262904 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nq5bz/must-gather-cjjq4"] Mar 13 10:12:57 crc kubenswrapper[4841]: E0313 10:12:57.264071 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47" containerName="oc" Mar 13 10:12:57 crc kubenswrapper[4841]: I0313 10:12:57.264089 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47" containerName="oc" Mar 13 10:12:57 crc kubenswrapper[4841]: I0313 10:12:57.264383 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47" containerName="oc" Mar 13 10:12:57 crc kubenswrapper[4841]: I0313 10:12:57.265812 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nq5bz/must-gather-cjjq4" Mar 13 10:12:57 crc kubenswrapper[4841]: I0313 10:12:57.270688 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-nq5bz"/"openshift-service-ca.crt" Mar 13 10:12:57 crc kubenswrapper[4841]: I0313 10:12:57.271242 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-nq5bz"/"default-dockercfg-hnwlf" Mar 13 10:12:57 crc kubenswrapper[4841]: I0313 10:12:57.272457 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-nq5bz"/"kube-root-ca.crt" Mar 13 10:12:57 crc kubenswrapper[4841]: I0313 10:12:57.276327 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nq5bz/must-gather-cjjq4"] Mar 13 10:12:57 crc kubenswrapper[4841]: I0313 10:12:57.322173 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8697567-4038-490d-8525-0ee6f26e6508-must-gather-output\") pod \"must-gather-cjjq4\" (UID: \"b8697567-4038-490d-8525-0ee6f26e6508\") " pod="openshift-must-gather-nq5bz/must-gather-cjjq4" Mar 13 10:12:57 crc kubenswrapper[4841]: I0313 10:12:57.322321 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8jb2\" (UniqueName: 
\"kubernetes.io/projected/b8697567-4038-490d-8525-0ee6f26e6508-kube-api-access-m8jb2\") pod \"must-gather-cjjq4\" (UID: \"b8697567-4038-490d-8525-0ee6f26e6508\") " pod="openshift-must-gather-nq5bz/must-gather-cjjq4" Mar 13 10:12:57 crc kubenswrapper[4841]: I0313 10:12:57.423651 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8697567-4038-490d-8525-0ee6f26e6508-must-gather-output\") pod \"must-gather-cjjq4\" (UID: \"b8697567-4038-490d-8525-0ee6f26e6508\") " pod="openshift-must-gather-nq5bz/must-gather-cjjq4" Mar 13 10:12:57 crc kubenswrapper[4841]: I0313 10:12:57.423737 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8jb2\" (UniqueName: \"kubernetes.io/projected/b8697567-4038-490d-8525-0ee6f26e6508-kube-api-access-m8jb2\") pod \"must-gather-cjjq4\" (UID: \"b8697567-4038-490d-8525-0ee6f26e6508\") " pod="openshift-must-gather-nq5bz/must-gather-cjjq4" Mar 13 10:12:57 crc kubenswrapper[4841]: I0313 10:12:57.424116 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8697567-4038-490d-8525-0ee6f26e6508-must-gather-output\") pod \"must-gather-cjjq4\" (UID: \"b8697567-4038-490d-8525-0ee6f26e6508\") " pod="openshift-must-gather-nq5bz/must-gather-cjjq4" Mar 13 10:12:57 crc kubenswrapper[4841]: I0313 10:12:57.446933 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8jb2\" (UniqueName: \"kubernetes.io/projected/b8697567-4038-490d-8525-0ee6f26e6508-kube-api-access-m8jb2\") pod \"must-gather-cjjq4\" (UID: \"b8697567-4038-490d-8525-0ee6f26e6508\") " pod="openshift-must-gather-nq5bz/must-gather-cjjq4" Mar 13 10:12:57 crc kubenswrapper[4841]: I0313 10:12:57.585818 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nq5bz/must-gather-cjjq4" Mar 13 10:12:58 crc kubenswrapper[4841]: I0313 10:12:58.103861 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 10:12:58 crc kubenswrapper[4841]: I0313 10:12:58.104810 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nq5bz/must-gather-cjjq4"] Mar 13 10:12:58 crc kubenswrapper[4841]: I0313 10:12:58.815887 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nq5bz/must-gather-cjjq4" event={"ID":"b8697567-4038-490d-8525-0ee6f26e6508","Type":"ContainerStarted","Data":"eb3f778147b664b5a48611abd69395973e8468546df759e589555a9b3c5359fb"} Mar 13 10:13:04 crc kubenswrapper[4841]: I0313 10:13:04.407670 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 10:13:04 crc kubenswrapper[4841]: I0313 10:13:04.408165 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 10:13:04 crc kubenswrapper[4841]: I0313 10:13:04.408211 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 10:13:04 crc kubenswrapper[4841]: I0313 10:13:04.409037 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912"} 
pod="openshift-machine-config-operator/machine-config-daemon-h227v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 10:13:04 crc kubenswrapper[4841]: I0313 10:13:04.409098 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" containerID="cri-o://a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912" gracePeriod=600 Mar 13 10:13:04 crc kubenswrapper[4841]: E0313 10:13:04.531913 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:13:04 crc kubenswrapper[4841]: I0313 10:13:04.874511 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nq5bz/must-gather-cjjq4" event={"ID":"b8697567-4038-490d-8525-0ee6f26e6508","Type":"ContainerStarted","Data":"d127eb65df32f43d439864e2c7cac35f67934e6296a4ec66a19b93ba8716a309"} Mar 13 10:13:04 crc kubenswrapper[4841]: I0313 10:13:04.876939 4841 generic.go:334] "Generic (PLEG): container finished" podID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912" exitCode=0 Mar 13 10:13:04 crc kubenswrapper[4841]: I0313 10:13:04.876974 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerDied","Data":"a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912"} Mar 13 10:13:04 crc kubenswrapper[4841]: 
I0313 10:13:04.876999 4841 scope.go:117] "RemoveContainer" containerID="345fb0919d4b00a63b8564b52ef5b612a0955f49c27fe5b9f30c8993e805394a" Mar 13 10:13:04 crc kubenswrapper[4841]: I0313 10:13:04.877733 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912" Mar 13 10:13:04 crc kubenswrapper[4841]: E0313 10:13:04.878144 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:13:05 crc kubenswrapper[4841]: I0313 10:13:05.887620 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nq5bz/must-gather-cjjq4" event={"ID":"b8697567-4038-490d-8525-0ee6f26e6508","Type":"ContainerStarted","Data":"55c8a845ceffa5b105e41c7504d67fade846ecd0b00e324b258e5ceecec435be"} Mar 13 10:13:05 crc kubenswrapper[4841]: I0313 10:13:05.914259 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nq5bz/must-gather-cjjq4" podStartSLOduration=2.500335845 podStartE2EDuration="8.914164172s" podCreationTimestamp="2026-03-13 10:12:57 +0000 UTC" firstStartedPulling="2026-03-13 10:12:58.103612341 +0000 UTC m=+3660.833512552" lastFinishedPulling="2026-03-13 10:13:04.517440688 +0000 UTC m=+3667.247340879" observedRunningTime="2026-03-13 10:13:05.902116895 +0000 UTC m=+3668.632017086" watchObservedRunningTime="2026-03-13 10:13:05.914164172 +0000 UTC m=+3668.644064393" Mar 13 10:13:06 crc kubenswrapper[4841]: I0313 10:13:06.466030 4841 scope.go:117] "RemoveContainer" containerID="8f9f584ed48aa030cfb191ada88a9a3d3295ab13ffd06c3fa9c956721d9a7699" Mar 13 10:13:09 crc kubenswrapper[4841]: I0313 
10:13:09.883770 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nq5bz/crc-debug-hbmtf"] Mar 13 10:13:09 crc kubenswrapper[4841]: I0313 10:13:09.885523 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nq5bz/crc-debug-hbmtf" Mar 13 10:13:09 crc kubenswrapper[4841]: I0313 10:13:09.988475 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rh5v\" (UniqueName: \"kubernetes.io/projected/77b48c0d-729e-4b8d-94eb-5836b75c87ab-kube-api-access-5rh5v\") pod \"crc-debug-hbmtf\" (UID: \"77b48c0d-729e-4b8d-94eb-5836b75c87ab\") " pod="openshift-must-gather-nq5bz/crc-debug-hbmtf" Mar 13 10:13:09 crc kubenswrapper[4841]: I0313 10:13:09.988632 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77b48c0d-729e-4b8d-94eb-5836b75c87ab-host\") pod \"crc-debug-hbmtf\" (UID: \"77b48c0d-729e-4b8d-94eb-5836b75c87ab\") " pod="openshift-must-gather-nq5bz/crc-debug-hbmtf" Mar 13 10:13:10 crc kubenswrapper[4841]: I0313 10:13:10.091638 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77b48c0d-729e-4b8d-94eb-5836b75c87ab-host\") pod \"crc-debug-hbmtf\" (UID: \"77b48c0d-729e-4b8d-94eb-5836b75c87ab\") " pod="openshift-must-gather-nq5bz/crc-debug-hbmtf" Mar 13 10:13:10 crc kubenswrapper[4841]: I0313 10:13:10.091775 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77b48c0d-729e-4b8d-94eb-5836b75c87ab-host\") pod \"crc-debug-hbmtf\" (UID: \"77b48c0d-729e-4b8d-94eb-5836b75c87ab\") " pod="openshift-must-gather-nq5bz/crc-debug-hbmtf" Mar 13 10:13:10 crc kubenswrapper[4841]: I0313 10:13:10.093522 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rh5v\" (UniqueName: 
\"kubernetes.io/projected/77b48c0d-729e-4b8d-94eb-5836b75c87ab-kube-api-access-5rh5v\") pod \"crc-debug-hbmtf\" (UID: \"77b48c0d-729e-4b8d-94eb-5836b75c87ab\") " pod="openshift-must-gather-nq5bz/crc-debug-hbmtf" Mar 13 10:13:10 crc kubenswrapper[4841]: I0313 10:13:10.115001 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rh5v\" (UniqueName: \"kubernetes.io/projected/77b48c0d-729e-4b8d-94eb-5836b75c87ab-kube-api-access-5rh5v\") pod \"crc-debug-hbmtf\" (UID: \"77b48c0d-729e-4b8d-94eb-5836b75c87ab\") " pod="openshift-must-gather-nq5bz/crc-debug-hbmtf" Mar 13 10:13:10 crc kubenswrapper[4841]: I0313 10:13:10.207218 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nq5bz/crc-debug-hbmtf" Mar 13 10:13:10 crc kubenswrapper[4841]: I0313 10:13:10.944525 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nq5bz/crc-debug-hbmtf" event={"ID":"77b48c0d-729e-4b8d-94eb-5836b75c87ab","Type":"ContainerStarted","Data":"e58ff4de4168e5c3fa0394bc2e1c95e05ff74a4cf6c321fa196208360cea177a"} Mar 13 10:13:15 crc kubenswrapper[4841]: I0313 10:13:15.995126 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912" Mar 13 10:13:15 crc kubenswrapper[4841]: E0313 10:13:15.996023 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:13:22 crc kubenswrapper[4841]: I0313 10:13:22.073920 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nq5bz/crc-debug-hbmtf" 
event={"ID":"77b48c0d-729e-4b8d-94eb-5836b75c87ab","Type":"ContainerStarted","Data":"9457c7084f31cda0146952e3f847dfda1d760a343f79055b2d4e48a81e85ddd9"} Mar 13 10:13:22 crc kubenswrapper[4841]: I0313 10:13:22.098579 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nq5bz/crc-debug-hbmtf" podStartSLOduration=2.292597226 podStartE2EDuration="13.0985612s" podCreationTimestamp="2026-03-13 10:13:09 +0000 UTC" firstStartedPulling="2026-03-13 10:13:10.247379366 +0000 UTC m=+3672.977279567" lastFinishedPulling="2026-03-13 10:13:21.05334335 +0000 UTC m=+3683.783243541" observedRunningTime="2026-03-13 10:13:22.095840205 +0000 UTC m=+3684.825740396" watchObservedRunningTime="2026-03-13 10:13:22.0985612 +0000 UTC m=+3684.828461391" Mar 13 10:13:28 crc kubenswrapper[4841]: I0313 10:13:28.001984 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912" Mar 13 10:13:28 crc kubenswrapper[4841]: E0313 10:13:28.002608 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:13:37 crc kubenswrapper[4841]: I0313 10:13:37.229144 4841 generic.go:334] "Generic (PLEG): container finished" podID="77b48c0d-729e-4b8d-94eb-5836b75c87ab" containerID="9457c7084f31cda0146952e3f847dfda1d760a343f79055b2d4e48a81e85ddd9" exitCode=0 Mar 13 10:13:37 crc kubenswrapper[4841]: I0313 10:13:37.229232 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nq5bz/crc-debug-hbmtf" 
event={"ID":"77b48c0d-729e-4b8d-94eb-5836b75c87ab","Type":"ContainerDied","Data":"9457c7084f31cda0146952e3f847dfda1d760a343f79055b2d4e48a81e85ddd9"} Mar 13 10:13:38 crc kubenswrapper[4841]: I0313 10:13:38.343986 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nq5bz/crc-debug-hbmtf" Mar 13 10:13:38 crc kubenswrapper[4841]: I0313 10:13:38.372130 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nq5bz/crc-debug-hbmtf"] Mar 13 10:13:38 crc kubenswrapper[4841]: I0313 10:13:38.380860 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nq5bz/crc-debug-hbmtf"] Mar 13 10:13:38 crc kubenswrapper[4841]: I0313 10:13:38.469512 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77b48c0d-729e-4b8d-94eb-5836b75c87ab-host\") pod \"77b48c0d-729e-4b8d-94eb-5836b75c87ab\" (UID: \"77b48c0d-729e-4b8d-94eb-5836b75c87ab\") " Mar 13 10:13:38 crc kubenswrapper[4841]: I0313 10:13:38.469605 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77b48c0d-729e-4b8d-94eb-5836b75c87ab-host" (OuterVolumeSpecName: "host") pod "77b48c0d-729e-4b8d-94eb-5836b75c87ab" (UID: "77b48c0d-729e-4b8d-94eb-5836b75c87ab"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:13:38 crc kubenswrapper[4841]: I0313 10:13:38.469825 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rh5v\" (UniqueName: \"kubernetes.io/projected/77b48c0d-729e-4b8d-94eb-5836b75c87ab-kube-api-access-5rh5v\") pod \"77b48c0d-729e-4b8d-94eb-5836b75c87ab\" (UID: \"77b48c0d-729e-4b8d-94eb-5836b75c87ab\") " Mar 13 10:13:38 crc kubenswrapper[4841]: I0313 10:13:38.471175 4841 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77b48c0d-729e-4b8d-94eb-5836b75c87ab-host\") on node \"crc\" DevicePath \"\"" Mar 13 10:13:38 crc kubenswrapper[4841]: I0313 10:13:38.475419 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77b48c0d-729e-4b8d-94eb-5836b75c87ab-kube-api-access-5rh5v" (OuterVolumeSpecName: "kube-api-access-5rh5v") pod "77b48c0d-729e-4b8d-94eb-5836b75c87ab" (UID: "77b48c0d-729e-4b8d-94eb-5836b75c87ab"). InnerVolumeSpecName "kube-api-access-5rh5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:13:38 crc kubenswrapper[4841]: I0313 10:13:38.573328 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rh5v\" (UniqueName: \"kubernetes.io/projected/77b48c0d-729e-4b8d-94eb-5836b75c87ab-kube-api-access-5rh5v\") on node \"crc\" DevicePath \"\"" Mar 13 10:13:39 crc kubenswrapper[4841]: I0313 10:13:39.249276 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e58ff4de4168e5c3fa0394bc2e1c95e05ff74a4cf6c321fa196208360cea177a" Mar 13 10:13:39 crc kubenswrapper[4841]: I0313 10:13:39.249328 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nq5bz/crc-debug-hbmtf" Mar 13 10:13:39 crc kubenswrapper[4841]: I0313 10:13:39.546981 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nq5bz/crc-debug-ctn7x"] Mar 13 10:13:39 crc kubenswrapper[4841]: E0313 10:13:39.547472 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77b48c0d-729e-4b8d-94eb-5836b75c87ab" containerName="container-00" Mar 13 10:13:39 crc kubenswrapper[4841]: I0313 10:13:39.547489 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="77b48c0d-729e-4b8d-94eb-5836b75c87ab" containerName="container-00" Mar 13 10:13:39 crc kubenswrapper[4841]: I0313 10:13:39.547759 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="77b48c0d-729e-4b8d-94eb-5836b75c87ab" containerName="container-00" Mar 13 10:13:39 crc kubenswrapper[4841]: I0313 10:13:39.548571 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nq5bz/crc-debug-ctn7x" Mar 13 10:13:39 crc kubenswrapper[4841]: I0313 10:13:39.698742 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz96v\" (UniqueName: \"kubernetes.io/projected/215557ae-c27c-4cef-98ff-c19a2e9ac8f6-kube-api-access-cz96v\") pod \"crc-debug-ctn7x\" (UID: \"215557ae-c27c-4cef-98ff-c19a2e9ac8f6\") " pod="openshift-must-gather-nq5bz/crc-debug-ctn7x" Mar 13 10:13:39 crc kubenswrapper[4841]: I0313 10:13:39.699237 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/215557ae-c27c-4cef-98ff-c19a2e9ac8f6-host\") pod \"crc-debug-ctn7x\" (UID: \"215557ae-c27c-4cef-98ff-c19a2e9ac8f6\") " pod="openshift-must-gather-nq5bz/crc-debug-ctn7x" Mar 13 10:13:39 crc kubenswrapper[4841]: I0313 10:13:39.801847 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz96v\" (UniqueName: 
\"kubernetes.io/projected/215557ae-c27c-4cef-98ff-c19a2e9ac8f6-kube-api-access-cz96v\") pod \"crc-debug-ctn7x\" (UID: \"215557ae-c27c-4cef-98ff-c19a2e9ac8f6\") " pod="openshift-must-gather-nq5bz/crc-debug-ctn7x" Mar 13 10:13:39 crc kubenswrapper[4841]: I0313 10:13:39.802056 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/215557ae-c27c-4cef-98ff-c19a2e9ac8f6-host\") pod \"crc-debug-ctn7x\" (UID: \"215557ae-c27c-4cef-98ff-c19a2e9ac8f6\") " pod="openshift-must-gather-nq5bz/crc-debug-ctn7x" Mar 13 10:13:39 crc kubenswrapper[4841]: I0313 10:13:39.802383 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/215557ae-c27c-4cef-98ff-c19a2e9ac8f6-host\") pod \"crc-debug-ctn7x\" (UID: \"215557ae-c27c-4cef-98ff-c19a2e9ac8f6\") " pod="openshift-must-gather-nq5bz/crc-debug-ctn7x" Mar 13 10:13:39 crc kubenswrapper[4841]: I0313 10:13:39.827999 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz96v\" (UniqueName: \"kubernetes.io/projected/215557ae-c27c-4cef-98ff-c19a2e9ac8f6-kube-api-access-cz96v\") pod \"crc-debug-ctn7x\" (UID: \"215557ae-c27c-4cef-98ff-c19a2e9ac8f6\") " pod="openshift-must-gather-nq5bz/crc-debug-ctn7x" Mar 13 10:13:39 crc kubenswrapper[4841]: I0313 10:13:39.865292 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nq5bz/crc-debug-ctn7x" Mar 13 10:13:39 crc kubenswrapper[4841]: I0313 10:13:39.994543 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912" Mar 13 10:13:39 crc kubenswrapper[4841]: E0313 10:13:39.995080 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:13:40 crc kubenswrapper[4841]: I0313 10:13:40.011018 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77b48c0d-729e-4b8d-94eb-5836b75c87ab" path="/var/lib/kubelet/pods/77b48c0d-729e-4b8d-94eb-5836b75c87ab/volumes" Mar 13 10:13:40 crc kubenswrapper[4841]: I0313 10:13:40.258970 4841 generic.go:334] "Generic (PLEG): container finished" podID="215557ae-c27c-4cef-98ff-c19a2e9ac8f6" containerID="fa26dd1ed2add2ce314f2036a305b4254db6a6069716a286748652917353b915" exitCode=1 Mar 13 10:13:40 crc kubenswrapper[4841]: I0313 10:13:40.259013 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nq5bz/crc-debug-ctn7x" event={"ID":"215557ae-c27c-4cef-98ff-c19a2e9ac8f6","Type":"ContainerDied","Data":"fa26dd1ed2add2ce314f2036a305b4254db6a6069716a286748652917353b915"} Mar 13 10:13:40 crc kubenswrapper[4841]: I0313 10:13:40.259042 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nq5bz/crc-debug-ctn7x" event={"ID":"215557ae-c27c-4cef-98ff-c19a2e9ac8f6","Type":"ContainerStarted","Data":"dcff78c8332b63cebaa37db65fa3471c24b69c053812165e317f04ac5e3879be"} Mar 13 10:13:40 crc kubenswrapper[4841]: I0313 10:13:40.317096 4841 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-must-gather-nq5bz/crc-debug-ctn7x"] Mar 13 10:13:40 crc kubenswrapper[4841]: I0313 10:13:40.341777 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nq5bz/crc-debug-ctn7x"] Mar 13 10:13:41 crc kubenswrapper[4841]: I0313 10:13:41.381410 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nq5bz/crc-debug-ctn7x" Mar 13 10:13:41 crc kubenswrapper[4841]: I0313 10:13:41.539928 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/215557ae-c27c-4cef-98ff-c19a2e9ac8f6-host\") pod \"215557ae-c27c-4cef-98ff-c19a2e9ac8f6\" (UID: \"215557ae-c27c-4cef-98ff-c19a2e9ac8f6\") " Mar 13 10:13:41 crc kubenswrapper[4841]: I0313 10:13:41.540158 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/215557ae-c27c-4cef-98ff-c19a2e9ac8f6-host" (OuterVolumeSpecName: "host") pod "215557ae-c27c-4cef-98ff-c19a2e9ac8f6" (UID: "215557ae-c27c-4cef-98ff-c19a2e9ac8f6"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:13:41 crc kubenswrapper[4841]: I0313 10:13:41.540599 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz96v\" (UniqueName: \"kubernetes.io/projected/215557ae-c27c-4cef-98ff-c19a2e9ac8f6-kube-api-access-cz96v\") pod \"215557ae-c27c-4cef-98ff-c19a2e9ac8f6\" (UID: \"215557ae-c27c-4cef-98ff-c19a2e9ac8f6\") " Mar 13 10:13:41 crc kubenswrapper[4841]: I0313 10:13:41.541211 4841 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/215557ae-c27c-4cef-98ff-c19a2e9ac8f6-host\") on node \"crc\" DevicePath \"\"" Mar 13 10:13:41 crc kubenswrapper[4841]: I0313 10:13:41.549469 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/215557ae-c27c-4cef-98ff-c19a2e9ac8f6-kube-api-access-cz96v" (OuterVolumeSpecName: "kube-api-access-cz96v") pod "215557ae-c27c-4cef-98ff-c19a2e9ac8f6" (UID: "215557ae-c27c-4cef-98ff-c19a2e9ac8f6"). InnerVolumeSpecName "kube-api-access-cz96v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:13:41 crc kubenswrapper[4841]: I0313 10:13:41.642961 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz96v\" (UniqueName: \"kubernetes.io/projected/215557ae-c27c-4cef-98ff-c19a2e9ac8f6-kube-api-access-cz96v\") on node \"crc\" DevicePath \"\"" Mar 13 10:13:42 crc kubenswrapper[4841]: I0313 10:13:42.006932 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="215557ae-c27c-4cef-98ff-c19a2e9ac8f6" path="/var/lib/kubelet/pods/215557ae-c27c-4cef-98ff-c19a2e9ac8f6/volumes" Mar 13 10:13:42 crc kubenswrapper[4841]: I0313 10:13:42.275848 4841 scope.go:117] "RemoveContainer" containerID="fa26dd1ed2add2ce314f2036a305b4254db6a6069716a286748652917353b915" Mar 13 10:13:42 crc kubenswrapper[4841]: I0313 10:13:42.275956 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nq5bz/crc-debug-ctn7x" Mar 13 10:13:54 crc kubenswrapper[4841]: I0313 10:13:54.995572 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912" Mar 13 10:13:54 crc kubenswrapper[4841]: E0313 10:13:54.996355 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:14:00 crc kubenswrapper[4841]: I0313 10:14:00.166219 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556614-2z7sq"] Mar 13 10:14:00 crc kubenswrapper[4841]: E0313 10:14:00.167333 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="215557ae-c27c-4cef-98ff-c19a2e9ac8f6" containerName="container-00" Mar 13 10:14:00 crc kubenswrapper[4841]: I0313 10:14:00.167349 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="215557ae-c27c-4cef-98ff-c19a2e9ac8f6" containerName="container-00" Mar 13 10:14:00 crc kubenswrapper[4841]: I0313 10:14:00.167716 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="215557ae-c27c-4cef-98ff-c19a2e9ac8f6" containerName="container-00" Mar 13 10:14:00 crc kubenswrapper[4841]: I0313 10:14:00.168483 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556614-2z7sq" Mar 13 10:14:00 crc kubenswrapper[4841]: I0313 10:14:00.186339 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 10:14:00 crc kubenswrapper[4841]: I0313 10:14:00.186370 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 10:14:00 crc kubenswrapper[4841]: I0313 10:14:00.186701 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 10:14:00 crc kubenswrapper[4841]: I0313 10:14:00.188172 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556614-2z7sq"] Mar 13 10:14:00 crc kubenswrapper[4841]: I0313 10:14:00.294492 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqzds\" (UniqueName: \"kubernetes.io/projected/ca4f7dc4-cc42-48c5-8ec3-f925c6190002-kube-api-access-sqzds\") pod \"auto-csr-approver-29556614-2z7sq\" (UID: \"ca4f7dc4-cc42-48c5-8ec3-f925c6190002\") " pod="openshift-infra/auto-csr-approver-29556614-2z7sq" Mar 13 10:14:00 crc kubenswrapper[4841]: I0313 10:14:00.397514 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqzds\" (UniqueName: \"kubernetes.io/projected/ca4f7dc4-cc42-48c5-8ec3-f925c6190002-kube-api-access-sqzds\") pod \"auto-csr-approver-29556614-2z7sq\" (UID: \"ca4f7dc4-cc42-48c5-8ec3-f925c6190002\") " pod="openshift-infra/auto-csr-approver-29556614-2z7sq" Mar 13 10:14:00 crc kubenswrapper[4841]: I0313 10:14:00.418192 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqzds\" (UniqueName: \"kubernetes.io/projected/ca4f7dc4-cc42-48c5-8ec3-f925c6190002-kube-api-access-sqzds\") pod \"auto-csr-approver-29556614-2z7sq\" (UID: \"ca4f7dc4-cc42-48c5-8ec3-f925c6190002\") " 
pod="openshift-infra/auto-csr-approver-29556614-2z7sq" Mar 13 10:14:00 crc kubenswrapper[4841]: I0313 10:14:00.508775 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556614-2z7sq" Mar 13 10:14:01 crc kubenswrapper[4841]: I0313 10:14:01.077159 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556614-2z7sq"] Mar 13 10:14:01 crc kubenswrapper[4841]: I0313 10:14:01.440295 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556614-2z7sq" event={"ID":"ca4f7dc4-cc42-48c5-8ec3-f925c6190002","Type":"ContainerStarted","Data":"331ff0465f8bf3615be04632039149c263d7aa604e07993493dcb810075d1c31"} Mar 13 10:14:02 crc kubenswrapper[4841]: I0313 10:14:02.468710 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556614-2z7sq" event={"ID":"ca4f7dc4-cc42-48c5-8ec3-f925c6190002","Type":"ContainerStarted","Data":"fcb890ffbb4e0759af3fe2beca03965ea4ebbe0bed4873f68f37cbcb7df821dc"} Mar 13 10:14:02 crc kubenswrapper[4841]: I0313 10:14:02.490309 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556614-2z7sq" podStartSLOduration=1.4340139729999999 podStartE2EDuration="2.490287369s" podCreationTimestamp="2026-03-13 10:14:00 +0000 UTC" firstStartedPulling="2026-03-13 10:14:01.08351542 +0000 UTC m=+3723.813415621" lastFinishedPulling="2026-03-13 10:14:02.139788836 +0000 UTC m=+3724.869689017" observedRunningTime="2026-03-13 10:14:02.482312039 +0000 UTC m=+3725.212212260" watchObservedRunningTime="2026-03-13 10:14:02.490287369 +0000 UTC m=+3725.220187560" Mar 13 10:14:03 crc kubenswrapper[4841]: I0313 10:14:03.486483 4841 generic.go:334] "Generic (PLEG): container finished" podID="ca4f7dc4-cc42-48c5-8ec3-f925c6190002" containerID="fcb890ffbb4e0759af3fe2beca03965ea4ebbe0bed4873f68f37cbcb7df821dc" exitCode=0 Mar 13 10:14:03 crc 
kubenswrapper[4841]: I0313 10:14:03.486586 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556614-2z7sq" event={"ID":"ca4f7dc4-cc42-48c5-8ec3-f925c6190002","Type":"ContainerDied","Data":"fcb890ffbb4e0759af3fe2beca03965ea4ebbe0bed4873f68f37cbcb7df821dc"} Mar 13 10:14:04 crc kubenswrapper[4841]: I0313 10:14:04.875038 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556614-2z7sq" Mar 13 10:14:05 crc kubenswrapper[4841]: I0313 10:14:05.014021 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqzds\" (UniqueName: \"kubernetes.io/projected/ca4f7dc4-cc42-48c5-8ec3-f925c6190002-kube-api-access-sqzds\") pod \"ca4f7dc4-cc42-48c5-8ec3-f925c6190002\" (UID: \"ca4f7dc4-cc42-48c5-8ec3-f925c6190002\") " Mar 13 10:14:05 crc kubenswrapper[4841]: I0313 10:14:05.020156 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca4f7dc4-cc42-48c5-8ec3-f925c6190002-kube-api-access-sqzds" (OuterVolumeSpecName: "kube-api-access-sqzds") pod "ca4f7dc4-cc42-48c5-8ec3-f925c6190002" (UID: "ca4f7dc4-cc42-48c5-8ec3-f925c6190002"). InnerVolumeSpecName "kube-api-access-sqzds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:14:05 crc kubenswrapper[4841]: I0313 10:14:05.117232 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqzds\" (UniqueName: \"kubernetes.io/projected/ca4f7dc4-cc42-48c5-8ec3-f925c6190002-kube-api-access-sqzds\") on node \"crc\" DevicePath \"\"" Mar 13 10:14:05 crc kubenswrapper[4841]: I0313 10:14:05.511779 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556614-2z7sq" event={"ID":"ca4f7dc4-cc42-48c5-8ec3-f925c6190002","Type":"ContainerDied","Data":"331ff0465f8bf3615be04632039149c263d7aa604e07993493dcb810075d1c31"} Mar 13 10:14:05 crc kubenswrapper[4841]: I0313 10:14:05.511838 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="331ff0465f8bf3615be04632039149c263d7aa604e07993493dcb810075d1c31" Mar 13 10:14:05 crc kubenswrapper[4841]: I0313 10:14:05.511922 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556614-2z7sq" Mar 13 10:14:05 crc kubenswrapper[4841]: I0313 10:14:05.565084 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556608-rws5k"] Mar 13 10:14:05 crc kubenswrapper[4841]: I0313 10:14:05.579595 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556608-rws5k"] Mar 13 10:14:05 crc kubenswrapper[4841]: I0313 10:14:05.995482 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912" Mar 13 10:14:05 crc kubenswrapper[4841]: E0313 10:14:05.995938 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:14:06 crc kubenswrapper[4841]: I0313 10:14:06.007125 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c" path="/var/lib/kubelet/pods/7fe9d78d-dc7d-441a-a2e4-4d372efd7d4c/volumes" Mar 13 10:14:06 crc kubenswrapper[4841]: I0313 10:14:06.560931 4841 scope.go:117] "RemoveContainer" containerID="d8a5d5c4f5531bd892325b3e9e4b7df0ac10af9f8c84faa6cd98c52191b75425" Mar 13 10:14:20 crc kubenswrapper[4841]: I0313 10:14:20.995183 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912" Mar 13 10:14:20 crc kubenswrapper[4841]: E0313 10:14:20.996006 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:14:32 crc kubenswrapper[4841]: I0313 10:14:32.995257 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912" Mar 13 10:14:32 crc kubenswrapper[4841]: E0313 10:14:32.996170 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:14:34 crc kubenswrapper[4841]: I0313 10:14:34.986310 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_alertmanager-metric-storage-0_0812602d-3596-4cda-b90a-d2f76f67bf52/init-config-reloader/0.log" Mar 13 10:14:35 crc kubenswrapper[4841]: I0313 10:14:35.169423 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0812602d-3596-4cda-b90a-d2f76f67bf52/init-config-reloader/0.log" Mar 13 10:14:35 crc kubenswrapper[4841]: I0313 10:14:35.176582 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0812602d-3596-4cda-b90a-d2f76f67bf52/alertmanager/0.log" Mar 13 10:14:35 crc kubenswrapper[4841]: I0313 10:14:35.233416 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0812602d-3596-4cda-b90a-d2f76f67bf52/config-reloader/0.log" Mar 13 10:14:35 crc kubenswrapper[4841]: I0313 10:14:35.354966 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_46839f95-04c1-47d3-b63c-c9e2d80b681a/aodh-api/0.log" Mar 13 10:14:35 crc kubenswrapper[4841]: I0313 10:14:35.368137 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_46839f95-04c1-47d3-b63c-c9e2d80b681a/aodh-evaluator/0.log" Mar 13 10:14:35 crc kubenswrapper[4841]: I0313 10:14:35.439490 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_46839f95-04c1-47d3-b63c-c9e2d80b681a/aodh-listener/0.log" Mar 13 10:14:35 crc kubenswrapper[4841]: I0313 10:14:35.509480 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_46839f95-04c1-47d3-b63c-c9e2d80b681a/aodh-notifier/0.log" Mar 13 10:14:35 crc kubenswrapper[4841]: I0313 10:14:35.560866 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5dbfbd46f8-tjjrf_728289d9-1ed1-449a-99e7-85da0a025366/barbican-api/0.log" Mar 13 10:14:35 crc kubenswrapper[4841]: I0313 10:14:35.637445 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-5dbfbd46f8-tjjrf_728289d9-1ed1-449a-99e7-85da0a025366/barbican-api-log/0.log" Mar 13 10:14:35 crc kubenswrapper[4841]: I0313 10:14:35.713935 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-d94989454-4npv2_0a624af3-f727-4d7e-8b59-6c45863bfcea/barbican-keystone-listener/0.log" Mar 13 10:14:35 crc kubenswrapper[4841]: I0313 10:14:35.773540 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-d94989454-4npv2_0a624af3-f727-4d7e-8b59-6c45863bfcea/barbican-keystone-listener-log/0.log" Mar 13 10:14:35 crc kubenswrapper[4841]: I0313 10:14:35.917772 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-fdcb67bff-tvnvp_6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd/barbican-worker/0.log" Mar 13 10:14:35 crc kubenswrapper[4841]: I0313 10:14:35.933204 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-fdcb67bff-tvnvp_6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd/barbican-worker-log/0.log" Mar 13 10:14:36 crc kubenswrapper[4841]: I0313 10:14:36.070220 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6_c3c0bc1a-b192-44f6-a237-9242d36513ce/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:14:36 crc kubenswrapper[4841]: I0313 10:14:36.134971 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_661cdaa4-34e4-47df-9bdb-95d67c012cff/ceilometer-central-agent/0.log" Mar 13 10:14:36 crc kubenswrapper[4841]: I0313 10:14:36.219695 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_661cdaa4-34e4-47df-9bdb-95d67c012cff/ceilometer-notification-agent/0.log" Mar 13 10:14:36 crc kubenswrapper[4841]: I0313 10:14:36.273693 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_661cdaa4-34e4-47df-9bdb-95d67c012cff/proxy-httpd/0.log" Mar 13 10:14:36 crc kubenswrapper[4841]: I0313 10:14:36.324135 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_661cdaa4-34e4-47df-9bdb-95d67c012cff/sg-core/0.log" Mar 13 10:14:36 crc kubenswrapper[4841]: I0313 10:14:36.447197 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_88dbe267-3d86-4bcd-8654-79392e0c502d/cinder-api/0.log" Mar 13 10:14:36 crc kubenswrapper[4841]: I0313 10:14:36.468514 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_88dbe267-3d86-4bcd-8654-79392e0c502d/cinder-api-log/0.log" Mar 13 10:14:36 crc kubenswrapper[4841]: I0313 10:14:36.680610 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_626bc701-ba99-4de7-a2f9-b42eb150a783/cinder-scheduler/0.log" Mar 13 10:14:36 crc kubenswrapper[4841]: I0313 10:14:36.739753 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_626bc701-ba99-4de7-a2f9-b42eb150a783/probe/0.log" Mar 13 10:14:36 crc kubenswrapper[4841]: I0313 10:14:36.984545 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-8smcd_59271a3d-6406-4e1f-a783-ba324ef8dece/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:14:37 crc kubenswrapper[4841]: I0313 10:14:37.159239 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p_f3c90f3c-6382-4a13-b4cd-515cfe68538e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:14:37 crc kubenswrapper[4841]: I0313 10:14:37.211059 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-6rf6l_2c632011-0a35-4eaa-a7f5-8d86466858ca/init/0.log" Mar 13 10:14:37 crc kubenswrapper[4841]: I0313 10:14:37.371843 
4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-6rf6l_2c632011-0a35-4eaa-a7f5-8d86466858ca/init/0.log" Mar 13 10:14:37 crc kubenswrapper[4841]: I0313 10:14:37.390115 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-6rf6l_2c632011-0a35-4eaa-a7f5-8d86466858ca/dnsmasq-dns/0.log" Mar 13 10:14:37 crc kubenswrapper[4841]: I0313 10:14:37.465397 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk_511f249d-a5ba-4a19-a5b6-16b5c75fe538/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:14:37 crc kubenswrapper[4841]: I0313 10:14:37.602174 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9/glance-httpd/0.log" Mar 13 10:14:37 crc kubenswrapper[4841]: I0313 10:14:37.652561 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9/glance-log/0.log" Mar 13 10:14:37 crc kubenswrapper[4841]: I0313 10:14:37.843091 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c0a786f1-b124-492e-80a6-6b7df2ad7bd3/glance-httpd/0.log" Mar 13 10:14:37 crc kubenswrapper[4841]: I0313 10:14:37.874074 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c0a786f1-b124-492e-80a6-6b7df2ad7bd3/glance-log/0.log" Mar 13 10:14:38 crc kubenswrapper[4841]: I0313 10:14:38.419465 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-5ff95dc669-rzhtr_595c0935-7197-4c48-be0d-8a3ad4d6442d/heat-api/0.log" Mar 13 10:14:38 crc kubenswrapper[4841]: I0313 10:14:38.446119 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-74f4d87c9f-bw7dr_f1ddc522-5255-4785-8d33-85a3d0e86af2/heat-engine/0.log" Mar 13 
10:14:38 crc kubenswrapper[4841]: I0313 10:14:38.461530 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-fb97f87fc-9tb45_6b447cf0-3120-4329-9dbf-534fd45e70bf/heat-cfnapi/0.log" Mar 13 10:14:38 crc kubenswrapper[4841]: I0313 10:14:38.649203 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv_9904117c-604f-48b6-9f6b-ef60210b0a94/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:14:38 crc kubenswrapper[4841]: I0313 10:14:38.697687 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-szndw_6e611c0f-aa46-4280-ae2d-bdff4bf61b60/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:14:38 crc kubenswrapper[4841]: I0313 10:14:38.923445 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29556601-kkh2k_d8a76cc0-6588-4160-8580-766a47f207e6/keystone-cron/0.log" Mar 13 10:14:38 crc kubenswrapper[4841]: I0313 10:14:38.950030 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7b87d5dbb8-7ppv5_9ef13028-1aeb-4a08-b241-fa033413b353/keystone-api/0.log" Mar 13 10:14:39 crc kubenswrapper[4841]: I0313 10:14:39.158953 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_8b082eb0-dc81-49f6-a313-07507e296c71/kube-state-metrics/0.log" Mar 13 10:14:39 crc kubenswrapper[4841]: I0313 10:14:39.191436 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq_4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:14:39 crc kubenswrapper[4841]: I0313 10:14:39.439260 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d768747c7-2ssnn_d209e4b8-27eb-4fea-ad65-807001e8638c/neutron-api/0.log" Mar 13 10:14:39 crc kubenswrapper[4841]: I0313 
10:14:39.544892 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d768747c7-2ssnn_d209e4b8-27eb-4fea-ad65-807001e8638c/neutron-httpd/0.log" Mar 13 10:14:39 crc kubenswrapper[4841]: I0313 10:14:39.741648 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv_dc97298b-a706-488a-9ea1-e90de447c754/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:14:39 crc kubenswrapper[4841]: I0313 10:14:39.956044 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_7666c55c-f424-4bfd-a143-e768e534b721/nova-api-log/0.log" Mar 13 10:14:40 crc kubenswrapper[4841]: I0313 10:14:40.127589 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_7666c55c-f424-4bfd-a143-e768e534b721/nova-api-api/0.log" Mar 13 10:14:40 crc kubenswrapper[4841]: I0313 10:14:40.148209 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_96d3458e-2994-4e4a-97b9-738366b67d8e/nova-cell0-conductor-conductor/0.log" Mar 13 10:14:40 crc kubenswrapper[4841]: I0313 10:14:40.264902 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_8f70f1b2-8e4f-4738-9b35-2a5e75f92988/nova-cell1-conductor-conductor/0.log" Mar 13 10:14:40 crc kubenswrapper[4841]: I0313 10:14:40.427171 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_c1840df1-8c0f-4038-9389-eaf2bcc61705/nova-cell1-novncproxy-novncproxy/0.log" Mar 13 10:14:40 crc kubenswrapper[4841]: I0313 10:14:40.533086 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-v22pp_7f7ae341-a1c6-49f6-825c-4c47b14141f4/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:14:40 crc kubenswrapper[4841]: I0313 10:14:40.747681 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_ede0b909-963d-4110-8d86-b09095cbd08c/nova-metadata-log/0.log" Mar 13 10:14:40 crc kubenswrapper[4841]: I0313 10:14:40.970773 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d3fdad05-7998-4fe3-a774-61cdaa01e27f/nova-scheduler-scheduler/0.log" Mar 13 10:14:41 crc kubenswrapper[4841]: I0313 10:14:41.039024 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_77aa8bf5-4386-4d85-8cca-75c90d5b2593/mysql-bootstrap/0.log" Mar 13 10:14:41 crc kubenswrapper[4841]: I0313 10:14:41.242649 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_77aa8bf5-4386-4d85-8cca-75c90d5b2593/galera/0.log" Mar 13 10:14:41 crc kubenswrapper[4841]: I0313 10:14:41.244556 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_77aa8bf5-4386-4d85-8cca-75c90d5b2593/mysql-bootstrap/0.log" Mar 13 10:14:41 crc kubenswrapper[4841]: I0313 10:14:41.415869 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_125cd366-c483-4efa-a55f-85b888bf6266/mysql-bootstrap/0.log" Mar 13 10:14:41 crc kubenswrapper[4841]: I0313 10:14:41.655955 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_125cd366-c483-4efa-a55f-85b888bf6266/mysql-bootstrap/0.log" Mar 13 10:14:41 crc kubenswrapper[4841]: I0313 10:14:41.730442 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_125cd366-c483-4efa-a55f-85b888bf6266/galera/0.log" Mar 13 10:14:41 crc kubenswrapper[4841]: I0313 10:14:41.824022 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_2d802466-0b65-4820-865c-8ae969af527f/openstackclient/0.log" Mar 13 10:14:41 crc kubenswrapper[4841]: I0313 10:14:41.831031 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_ede0b909-963d-4110-8d86-b09095cbd08c/nova-metadata-metadata/0.log" Mar 13 10:14:41 crc kubenswrapper[4841]: I0313 10:14:41.950112 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-bqlfl_b2bf634d-aa4f-4773-91ee-99616e217c82/ovn-controller/0.log" Mar 13 10:14:42 crc kubenswrapper[4841]: I0313 10:14:42.043885 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-66hkf_fb25dd49-bdd9-46c0-816f-5de963506142/openstack-network-exporter/0.log" Mar 13 10:14:42 crc kubenswrapper[4841]: I0313 10:14:42.248634 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-b2w62_84f12283-3c15-408e-a1a2-691c257434ca/ovsdb-server-init/0.log" Mar 13 10:14:42 crc kubenswrapper[4841]: I0313 10:14:42.364867 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-b2w62_84f12283-3c15-408e-a1a2-691c257434ca/ovs-vswitchd/0.log" Mar 13 10:14:42 crc kubenswrapper[4841]: I0313 10:14:42.404843 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-b2w62_84f12283-3c15-408e-a1a2-691c257434ca/ovsdb-server/0.log" Mar 13 10:14:42 crc kubenswrapper[4841]: I0313 10:14:42.405322 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-b2w62_84f12283-3c15-408e-a1a2-691c257434ca/ovsdb-server-init/0.log" Mar 13 10:14:42 crc kubenswrapper[4841]: I0313 10:14:42.573959 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_32dcacb9-78d3-4dd5-95e4-6d069bddc9e3/openstack-network-exporter/0.log" Mar 13 10:14:42 crc kubenswrapper[4841]: I0313 10:14:42.582681 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-59f4q_de039f4c-0550-4464-b901-a624fac40281/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:14:42 crc kubenswrapper[4841]: I0313 
10:14:42.792379 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_32dcacb9-78d3-4dd5-95e4-6d069bddc9e3/ovn-northd/0.log" Mar 13 10:14:42 crc kubenswrapper[4841]: I0313 10:14:42.848024 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_984ac552-8ac1-4cbf-ada9-10a9dc02acd9/openstack-network-exporter/0.log" Mar 13 10:14:42 crc kubenswrapper[4841]: I0313 10:14:42.990540 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_984ac552-8ac1-4cbf-ada9-10a9dc02acd9/ovsdbserver-nb/0.log" Mar 13 10:14:43 crc kubenswrapper[4841]: I0313 10:14:43.036164 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5f3149e4-fc32-4773-ac07-785c8d11888e/openstack-network-exporter/0.log" Mar 13 10:14:43 crc kubenswrapper[4841]: I0313 10:14:43.076602 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5f3149e4-fc32-4773-ac07-785c8d11888e/ovsdbserver-sb/0.log" Mar 13 10:14:43 crc kubenswrapper[4841]: I0313 10:14:43.345630 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-794cb978db-w646s_b4e4f623-6788-4651-95dd-d4fdab2d2b37/placement-api/0.log" Mar 13 10:14:43 crc kubenswrapper[4841]: I0313 10:14:43.360861 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-794cb978db-w646s_b4e4f623-6788-4651-95dd-d4fdab2d2b37/placement-log/0.log" Mar 13 10:14:43 crc kubenswrapper[4841]: I0313 10:14:43.514569 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7/init-config-reloader/0.log" Mar 13 10:14:43 crc kubenswrapper[4841]: I0313 10:14:43.702762 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7/init-config-reloader/0.log" Mar 13 10:14:43 crc kubenswrapper[4841]: I0313 
10:14:43.709243 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7/prometheus/0.log" Mar 13 10:14:43 crc kubenswrapper[4841]: I0313 10:14:43.715904 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7/config-reloader/0.log" Mar 13 10:14:43 crc kubenswrapper[4841]: I0313 10:14:43.765955 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7/thanos-sidecar/0.log" Mar 13 10:14:43 crc kubenswrapper[4841]: I0313 10:14:43.890656 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_469aec79-a7a3-4ae1-b00a-94f47a6d4df9/setup-container/0.log" Mar 13 10:14:44 crc kubenswrapper[4841]: I0313 10:14:44.167079 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_469aec79-a7a3-4ae1-b00a-94f47a6d4df9/setup-container/0.log" Mar 13 10:14:44 crc kubenswrapper[4841]: I0313 10:14:44.187117 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_469aec79-a7a3-4ae1-b00a-94f47a6d4df9/rabbitmq/0.log" Mar 13 10:14:44 crc kubenswrapper[4841]: I0313 10:14:44.264585 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b5efe6ff-d5eb-4fa9-9496-1838d05f625a/setup-container/0.log" Mar 13 10:14:44 crc kubenswrapper[4841]: I0313 10:14:44.494380 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn_23e2fc94-fce0-4eeb-8a78-15e934c02371/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:14:44 crc kubenswrapper[4841]: I0313 10:14:44.533711 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b5efe6ff-d5eb-4fa9-9496-1838d05f625a/setup-container/0.log" Mar 13 
10:14:44 crc kubenswrapper[4841]: I0313 10:14:44.773049 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-t97xr_b538d385-dcf3-477e-b014-4b304c0be557/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:14:44 crc kubenswrapper[4841]: I0313 10:14:44.954294 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96_413c3ede-4bdb-444c-b90d-5b07c5507a52/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:14:45 crc kubenswrapper[4841]: I0313 10:14:45.078431 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-c9rjj_95faf019-c6d4-4016-87ac-66c7762e56c4/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:14:45 crc kubenswrapper[4841]: I0313 10:14:45.250925 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-9wxsh_eb824d89-fddc-4746-8b53-a0f3d5e42082/ssh-known-hosts-edpm-deployment/0.log" Mar 13 10:14:45 crc kubenswrapper[4841]: I0313 10:14:45.447625 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7bbc77c95-pfg84_c7862f13-896f-480f-add9-376c2a96fdd7/proxy-server/0.log" Mar 13 10:14:45 crc kubenswrapper[4841]: I0313 10:14:45.579924 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7bbc77c95-pfg84_c7862f13-896f-480f-add9-376c2a96fdd7/proxy-httpd/0.log" Mar 13 10:14:45 crc kubenswrapper[4841]: I0313 10:14:45.690992 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-sqrfd_f66d8c2c-71a2-4927-a708-4b1412d0243c/swift-ring-rebalance/0.log" Mar 13 10:14:45 crc kubenswrapper[4841]: I0313 10:14:45.843004 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/account-auditor/0.log" Mar 13 10:14:45 crc kubenswrapper[4841]: 
I0313 10:14:45.903479 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/account-reaper/0.log" Mar 13 10:14:45 crc kubenswrapper[4841]: I0313 10:14:45.994700 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912" Mar 13 10:14:45 crc kubenswrapper[4841]: E0313 10:14:45.994981 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:14:46 crc kubenswrapper[4841]: I0313 10:14:46.023157 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/account-server/0.log" Mar 13 10:14:46 crc kubenswrapper[4841]: I0313 10:14:46.127872 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/account-replicator/0.log" Mar 13 10:14:46 crc kubenswrapper[4841]: I0313 10:14:46.132294 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/container-auditor/0.log" Mar 13 10:14:46 crc kubenswrapper[4841]: I0313 10:14:46.268430 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/container-replicator/0.log" Mar 13 10:14:46 crc kubenswrapper[4841]: I0313 10:14:46.308297 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/container-server/0.log" Mar 13 10:14:46 crc kubenswrapper[4841]: I0313 10:14:46.338347 4841 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/container-updater/0.log" Mar 13 10:14:46 crc kubenswrapper[4841]: I0313 10:14:46.360707 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b5efe6ff-d5eb-4fa9-9496-1838d05f625a/rabbitmq/0.log" Mar 13 10:14:46 crc kubenswrapper[4841]: I0313 10:14:46.514222 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/object-auditor/0.log" Mar 13 10:14:46 crc kubenswrapper[4841]: I0313 10:14:46.537149 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/object-replicator/0.log" Mar 13 10:14:46 crc kubenswrapper[4841]: I0313 10:14:46.562013 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/object-expirer/0.log" Mar 13 10:14:46 crc kubenswrapper[4841]: I0313 10:14:46.588749 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/object-server/0.log" Mar 13 10:14:46 crc kubenswrapper[4841]: I0313 10:14:46.745921 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/rsync/0.log" Mar 13 10:14:46 crc kubenswrapper[4841]: I0313 10:14:46.763749 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/object-updater/0.log" Mar 13 10:14:46 crc kubenswrapper[4841]: I0313 10:14:46.774688 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/swift-recon-cron/0.log" Mar 13 10:14:47 crc kubenswrapper[4841]: I0313 10:14:47.010144 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx_a14a214a-62da-44fc-b3d3-749fff9b3645/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:14:47 crc kubenswrapper[4841]: I0313 10:14:47.089518 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh_2bdd30cd-856d-44bd-8a1f-b68c7291b0ac/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:14:55 crc kubenswrapper[4841]: I0313 10:14:55.977694 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_831e87d6-8c27-4e98-8b3e-e6be93a93e51/memcached/0.log" Mar 13 10:14:58 crc kubenswrapper[4841]: I0313 10:14:58.002484 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912" Mar 13 10:14:58 crc kubenswrapper[4841]: E0313 10:14:58.002921 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:15:00 crc kubenswrapper[4841]: I0313 10:15:00.150599 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556615-8pnvb"] Mar 13 10:15:00 crc kubenswrapper[4841]: E0313 10:15:00.151698 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca4f7dc4-cc42-48c5-8ec3-f925c6190002" containerName="oc" Mar 13 10:15:00 crc kubenswrapper[4841]: I0313 10:15:00.151715 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca4f7dc4-cc42-48c5-8ec3-f925c6190002" containerName="oc" Mar 13 10:15:00 crc kubenswrapper[4841]: I0313 10:15:00.151998 4841 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="ca4f7dc4-cc42-48c5-8ec3-f925c6190002" containerName="oc" Mar 13 10:15:00 crc kubenswrapper[4841]: I0313 10:15:00.153117 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556615-8pnvb" Mar 13 10:15:00 crc kubenswrapper[4841]: I0313 10:15:00.155437 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 10:15:00 crc kubenswrapper[4841]: I0313 10:15:00.155882 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 10:15:00 crc kubenswrapper[4841]: I0313 10:15:00.165587 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556615-8pnvb"] Mar 13 10:15:00 crc kubenswrapper[4841]: I0313 10:15:00.317757 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wds79\" (UniqueName: \"kubernetes.io/projected/3af26361-b3e3-455c-98aa-69e6cc6c3f69-kube-api-access-wds79\") pod \"collect-profiles-29556615-8pnvb\" (UID: \"3af26361-b3e3-455c-98aa-69e6cc6c3f69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556615-8pnvb" Mar 13 10:15:00 crc kubenswrapper[4841]: I0313 10:15:00.317941 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3af26361-b3e3-455c-98aa-69e6cc6c3f69-secret-volume\") pod \"collect-profiles-29556615-8pnvb\" (UID: \"3af26361-b3e3-455c-98aa-69e6cc6c3f69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556615-8pnvb" Mar 13 10:15:00 crc kubenswrapper[4841]: I0313 10:15:00.317988 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/3af26361-b3e3-455c-98aa-69e6cc6c3f69-config-volume\") pod \"collect-profiles-29556615-8pnvb\" (UID: \"3af26361-b3e3-455c-98aa-69e6cc6c3f69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556615-8pnvb" Mar 13 10:15:00 crc kubenswrapper[4841]: I0313 10:15:00.419689 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3af26361-b3e3-455c-98aa-69e6cc6c3f69-secret-volume\") pod \"collect-profiles-29556615-8pnvb\" (UID: \"3af26361-b3e3-455c-98aa-69e6cc6c3f69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556615-8pnvb" Mar 13 10:15:00 crc kubenswrapper[4841]: I0313 10:15:00.419747 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3af26361-b3e3-455c-98aa-69e6cc6c3f69-config-volume\") pod \"collect-profiles-29556615-8pnvb\" (UID: \"3af26361-b3e3-455c-98aa-69e6cc6c3f69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556615-8pnvb" Mar 13 10:15:00 crc kubenswrapper[4841]: I0313 10:15:00.419815 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wds79\" (UniqueName: \"kubernetes.io/projected/3af26361-b3e3-455c-98aa-69e6cc6c3f69-kube-api-access-wds79\") pod \"collect-profiles-29556615-8pnvb\" (UID: \"3af26361-b3e3-455c-98aa-69e6cc6c3f69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556615-8pnvb" Mar 13 10:15:00 crc kubenswrapper[4841]: I0313 10:15:00.421103 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3af26361-b3e3-455c-98aa-69e6cc6c3f69-config-volume\") pod \"collect-profiles-29556615-8pnvb\" (UID: \"3af26361-b3e3-455c-98aa-69e6cc6c3f69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556615-8pnvb" Mar 13 10:15:00 crc kubenswrapper[4841]: I0313 10:15:00.427029 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3af26361-b3e3-455c-98aa-69e6cc6c3f69-secret-volume\") pod \"collect-profiles-29556615-8pnvb\" (UID: \"3af26361-b3e3-455c-98aa-69e6cc6c3f69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556615-8pnvb" Mar 13 10:15:00 crc kubenswrapper[4841]: I0313 10:15:00.440439 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wds79\" (UniqueName: \"kubernetes.io/projected/3af26361-b3e3-455c-98aa-69e6cc6c3f69-kube-api-access-wds79\") pod \"collect-profiles-29556615-8pnvb\" (UID: \"3af26361-b3e3-455c-98aa-69e6cc6c3f69\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556615-8pnvb" Mar 13 10:15:00 crc kubenswrapper[4841]: I0313 10:15:00.506822 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556615-8pnvb" Mar 13 10:15:00 crc kubenswrapper[4841]: I0313 10:15:00.985874 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556615-8pnvb"] Mar 13 10:15:01 crc kubenswrapper[4841]: I0313 10:15:01.039193 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556615-8pnvb" event={"ID":"3af26361-b3e3-455c-98aa-69e6cc6c3f69","Type":"ContainerStarted","Data":"c2e11eec1a5195a2c2e8b6cf7faded997026012f3e89dddcbab51ed5788e1cbe"} Mar 13 10:15:02 crc kubenswrapper[4841]: I0313 10:15:02.050388 4841 generic.go:334] "Generic (PLEG): container finished" podID="3af26361-b3e3-455c-98aa-69e6cc6c3f69" containerID="04cbd58ad7372f5daa065831b28f79a73fa30ff10caa4fb96fcb24cea2e6da61" exitCode=0 Mar 13 10:15:02 crc kubenswrapper[4841]: I0313 10:15:02.050484 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556615-8pnvb" 
event={"ID":"3af26361-b3e3-455c-98aa-69e6cc6c3f69","Type":"ContainerDied","Data":"04cbd58ad7372f5daa065831b28f79a73fa30ff10caa4fb96fcb24cea2e6da61"} Mar 13 10:15:03 crc kubenswrapper[4841]: I0313 10:15:03.441532 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556615-8pnvb" Mar 13 10:15:03 crc kubenswrapper[4841]: I0313 10:15:03.584490 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3af26361-b3e3-455c-98aa-69e6cc6c3f69-config-volume\") pod \"3af26361-b3e3-455c-98aa-69e6cc6c3f69\" (UID: \"3af26361-b3e3-455c-98aa-69e6cc6c3f69\") " Mar 13 10:15:03 crc kubenswrapper[4841]: I0313 10:15:03.584551 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3af26361-b3e3-455c-98aa-69e6cc6c3f69-secret-volume\") pod \"3af26361-b3e3-455c-98aa-69e6cc6c3f69\" (UID: \"3af26361-b3e3-455c-98aa-69e6cc6c3f69\") " Mar 13 10:15:03 crc kubenswrapper[4841]: I0313 10:15:03.584646 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wds79\" (UniqueName: \"kubernetes.io/projected/3af26361-b3e3-455c-98aa-69e6cc6c3f69-kube-api-access-wds79\") pod \"3af26361-b3e3-455c-98aa-69e6cc6c3f69\" (UID: \"3af26361-b3e3-455c-98aa-69e6cc6c3f69\") " Mar 13 10:15:03 crc kubenswrapper[4841]: I0313 10:15:03.585630 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3af26361-b3e3-455c-98aa-69e6cc6c3f69-config-volume" (OuterVolumeSpecName: "config-volume") pod "3af26361-b3e3-455c-98aa-69e6cc6c3f69" (UID: "3af26361-b3e3-455c-98aa-69e6cc6c3f69"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:15:03 crc kubenswrapper[4841]: I0313 10:15:03.591562 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3af26361-b3e3-455c-98aa-69e6cc6c3f69-kube-api-access-wds79" (OuterVolumeSpecName: "kube-api-access-wds79") pod "3af26361-b3e3-455c-98aa-69e6cc6c3f69" (UID: "3af26361-b3e3-455c-98aa-69e6cc6c3f69"). InnerVolumeSpecName "kube-api-access-wds79". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:15:03 crc kubenswrapper[4841]: I0313 10:15:03.594421 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3af26361-b3e3-455c-98aa-69e6cc6c3f69-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3af26361-b3e3-455c-98aa-69e6cc6c3f69" (UID: "3af26361-b3e3-455c-98aa-69e6cc6c3f69"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:15:03 crc kubenswrapper[4841]: I0313 10:15:03.687107 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3af26361-b3e3-455c-98aa-69e6cc6c3f69-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 10:15:03 crc kubenswrapper[4841]: I0313 10:15:03.687142 4841 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3af26361-b3e3-455c-98aa-69e6cc6c3f69-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 10:15:03 crc kubenswrapper[4841]: I0313 10:15:03.687152 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wds79\" (UniqueName: \"kubernetes.io/projected/3af26361-b3e3-455c-98aa-69e6cc6c3f69-kube-api-access-wds79\") on node \"crc\" DevicePath \"\"" Mar 13 10:15:04 crc kubenswrapper[4841]: I0313 10:15:04.090413 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556615-8pnvb" 
event={"ID":"3af26361-b3e3-455c-98aa-69e6cc6c3f69","Type":"ContainerDied","Data":"c2e11eec1a5195a2c2e8b6cf7faded997026012f3e89dddcbab51ed5788e1cbe"} Mar 13 10:15:04 crc kubenswrapper[4841]: I0313 10:15:04.090673 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2e11eec1a5195a2c2e8b6cf7faded997026012f3e89dddcbab51ed5788e1cbe" Mar 13 10:15:04 crc kubenswrapper[4841]: I0313 10:15:04.090532 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556615-8pnvb" Mar 13 10:15:04 crc kubenswrapper[4841]: I0313 10:15:04.519952 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556570-496sl"] Mar 13 10:15:04 crc kubenswrapper[4841]: I0313 10:15:04.529491 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556570-496sl"] Mar 13 10:15:06 crc kubenswrapper[4841]: I0313 10:15:06.015987 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89099fdc-154e-4290-9dbf-31dad846ead6" path="/var/lib/kubelet/pods/89099fdc-154e-4290-9dbf-31dad846ead6/volumes" Mar 13 10:15:06 crc kubenswrapper[4841]: I0313 10:15:06.697465 4841 scope.go:117] "RemoveContainer" containerID="1ea0d6b4991f52ac8bb1bd8ef6f5149bd7286c2405f8fa75aeb65885e492d4e3" Mar 13 10:15:12 crc kubenswrapper[4841]: I0313 10:15:12.384865 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq_a8b2962c-7f7d-4d5b-9982-1668d185c680/util/0.log" Mar 13 10:15:12 crc kubenswrapper[4841]: I0313 10:15:12.550519 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq_a8b2962c-7f7d-4d5b-9982-1668d185c680/pull/0.log" Mar 13 10:15:12 crc kubenswrapper[4841]: I0313 10:15:12.552761 4841 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq_a8b2962c-7f7d-4d5b-9982-1668d185c680/util/0.log" Mar 13 10:15:12 crc kubenswrapper[4841]: I0313 10:15:12.607067 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq_a8b2962c-7f7d-4d5b-9982-1668d185c680/pull/0.log" Mar 13 10:15:12 crc kubenswrapper[4841]: I0313 10:15:12.783562 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq_a8b2962c-7f7d-4d5b-9982-1668d185c680/extract/0.log" Mar 13 10:15:12 crc kubenswrapper[4841]: I0313 10:15:12.783729 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq_a8b2962c-7f7d-4d5b-9982-1668d185c680/pull/0.log" Mar 13 10:15:12 crc kubenswrapper[4841]: I0313 10:15:12.842411 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq_a8b2962c-7f7d-4d5b-9982-1668d185c680/util/0.log" Mar 13 10:15:12 crc kubenswrapper[4841]: I0313 10:15:12.994632 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912" Mar 13 10:15:12 crc kubenswrapper[4841]: E0313 10:15:12.994948 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:15:13 crc kubenswrapper[4841]: I0313 10:15:13.239177 4841 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-kc2zl_c6c9dfcd-5298-468b-9de2-0280bf525b61/manager/0.log" Mar 13 10:15:13 crc kubenswrapper[4841]: I0313 10:15:13.810200 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-7lhjg_46ca7c55-bd68-4454-a014-85f81f1b5a60/manager/0.log" Mar 13 10:15:14 crc kubenswrapper[4841]: I0313 10:15:14.095557 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-hm824_e6b9c8a5-3093-4d94-ad46-cd682158fdf8/manager/0.log" Mar 13 10:15:14 crc kubenswrapper[4841]: I0313 10:15:14.344678 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-grn97_bad08b57-dde0-496d-8ea1-5845a52d517a/manager/0.log" Mar 13 10:15:14 crc kubenswrapper[4841]: I0313 10:15:14.372018 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-cgkfz_7bd9dd2f-b4fd-4078-b463-4e970fa6791d/manager/0.log" Mar 13 10:15:14 crc kubenswrapper[4841]: I0313 10:15:14.657796 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-gb4dz_98cec4ba-d672-4627-8d37-46a0684fc284/manager/0.log" Mar 13 10:15:15 crc kubenswrapper[4841]: I0313 10:15:15.018590 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-56nwc_4a76403b-081b-4222-a707-4cd00dd440a0/manager/0.log" Mar 13 10:15:15 crc kubenswrapper[4841]: I0313 10:15:15.049928 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-vtz8l_26236923-39c0-4b46-be0d-61f453533891/manager/0.log" Mar 13 10:15:15 crc kubenswrapper[4841]: I0313 10:15:15.203286 4841 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-9gcr6_44d006b0-b13e-49ce-8ff8-592f3d8798c1/manager/0.log" Mar 13 10:15:15 crc kubenswrapper[4841]: I0313 10:15:15.406747 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-rx4wh_90c1dec3-4daa-4ac6-b95e-209cb8bd9b55/manager/0.log" Mar 13 10:15:15 crc kubenswrapper[4841]: I0313 10:15:15.560292 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-8f2sj_8b77ae90-8ef1-4e98-9d32-319dfdd55a6d/manager/0.log" Mar 13 10:15:15 crc kubenswrapper[4841]: I0313 10:15:15.845138 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-chf9m_f34e0b2d-5c3c-4725-ae0c-760bf98e90d3/manager/0.log" Mar 13 10:15:15 crc kubenswrapper[4841]: I0313 10:15:15.891222 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-8tg7w_b877a309-e752-4f24-90cd-6901973263e3/manager/0.log" Mar 13 10:15:16 crc kubenswrapper[4841]: I0313 10:15:16.084854 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b7lccpq_06f0c42d-1674-4913-8a86-1d1749d8d601/manager/0.log" Mar 13 10:15:16 crc kubenswrapper[4841]: I0313 10:15:16.586904 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6bc596d67-h66cj_08c8df77-ecb6-4f32-b9a1-b31bf7a0d1c4/operator/0.log" Mar 13 10:15:16 crc kubenswrapper[4841]: I0313 10:15:16.784931 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-n25n5_7290f225-3489-4643-916d-39a67a36acb2/registry-server/0.log" Mar 13 10:15:17 crc kubenswrapper[4841]: I0313 10:15:17.033048 4841 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-j7v7h_3a7d5a0b-0bd7-4735-b182-8a78870050cf/manager/0.log" Mar 13 10:15:17 crc kubenswrapper[4841]: I0313 10:15:17.148858 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-bwvx6_49a6b3bc-4db3-4006-b033-cc9cfa0cb5fc/manager/0.log" Mar 13 10:15:17 crc kubenswrapper[4841]: I0313 10:15:17.327305 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-nd9d9_daeb73dc-4973-4a0b-906d-4afc7f61717c/operator/0.log" Mar 13 10:15:17 crc kubenswrapper[4841]: I0313 10:15:17.465829 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-r42dd_10b15182-cc2b-420b-9fc2-fe3ca6ea38d7/manager/0.log" Mar 13 10:15:17 crc kubenswrapper[4841]: I0313 10:15:17.808777 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-8v6pq_35cf5dc3-b1c0-4481-8f0f-8bca19ecadd1/manager/0.log" Mar 13 10:15:17 crc kubenswrapper[4841]: I0313 10:15:17.900579 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-8485bdb9db-mf5lp_17824e5f-18b3-46c0-910a-56e5529e09c3/manager/0.log" Mar 13 10:15:18 crc kubenswrapper[4841]: I0313 10:15:18.050087 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-pvtvx_f5ae27e8-47b9-437c-9506-f51da1b6c9f8/manager/0.log" Mar 13 10:15:18 crc kubenswrapper[4841]: I0313 10:15:18.588590 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-57ddc6f479-h7khw_2c86df2d-15dc-45f2-aca7-4200fdf36a53/manager/0.log" Mar 13 10:15:20 crc kubenswrapper[4841]: I0313 
10:15:20.452553 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-fkfjh_9db8c27e-023c-4e28-a381-24f4438a6add/manager/0.log" Mar 13 10:15:25 crc kubenswrapper[4841]: I0313 10:15:25.994769 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912" Mar 13 10:15:25 crc kubenswrapper[4841]: E0313 10:15:25.995512 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:15:36 crc kubenswrapper[4841]: I0313 10:15:36.433592 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-wh6kv_adeab6d7-21b6-4ef2-afdb-75854f0914c5/control-plane-machine-set-operator/0.log" Mar 13 10:15:36 crc kubenswrapper[4841]: I0313 10:15:36.638891 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qlg79_747a24f9-e654-421a-8da1-0be0aa6ccd9b/kube-rbac-proxy/0.log" Mar 13 10:15:36 crc kubenswrapper[4841]: I0313 10:15:36.640061 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qlg79_747a24f9-e654-421a-8da1-0be0aa6ccd9b/machine-api-operator/0.log" Mar 13 10:15:40 crc kubenswrapper[4841]: I0313 10:15:40.994970 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912" Mar 13 10:15:40 crc kubenswrapper[4841]: E0313 10:15:40.995751 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:15:48 crc kubenswrapper[4841]: I0313 10:15:48.293513 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-crfbh_0f26e065-6f9d-4f61-a645-ea11d7f0eb85/cert-manager-controller/0.log" Mar 13 10:15:48 crc kubenswrapper[4841]: I0313 10:15:48.398222 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-dqzmg_b71327b9-8538-404b-b37d-cfb16da13ce4/cert-manager-cainjector/0.log" Mar 13 10:15:48 crc kubenswrapper[4841]: I0313 10:15:48.484917 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-vqq7j_318e486a-97f3-45fb-84b7-816009810d33/cert-manager-webhook/0.log" Mar 13 10:15:54 crc kubenswrapper[4841]: I0313 10:15:54.995819 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912" Mar 13 10:15:54 crc kubenswrapper[4841]: E0313 10:15:54.996916 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:16:00 crc kubenswrapper[4841]: I0313 10:16:00.153748 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556616-g99gc"] Mar 13 10:16:00 crc kubenswrapper[4841]: E0313 10:16:00.154573 4841 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3af26361-b3e3-455c-98aa-69e6cc6c3f69" containerName="collect-profiles"
Mar 13 10:16:00 crc kubenswrapper[4841]: I0313 10:16:00.154631 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3af26361-b3e3-455c-98aa-69e6cc6c3f69" containerName="collect-profiles"
Mar 13 10:16:00 crc kubenswrapper[4841]: I0313 10:16:00.154832 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="3af26361-b3e3-455c-98aa-69e6cc6c3f69" containerName="collect-profiles"
Mar 13 10:16:00 crc kubenswrapper[4841]: I0313 10:16:00.155679 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556616-g99gc"
Mar 13 10:16:00 crc kubenswrapper[4841]: I0313 10:16:00.162320 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7"
Mar 13 10:16:00 crc kubenswrapper[4841]: I0313 10:16:00.175096 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 10:16:00 crc kubenswrapper[4841]: I0313 10:16:00.175107 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 10:16:00 crc kubenswrapper[4841]: I0313 10:16:00.180315 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-4tq4s_80edf45c-fbb9-4761-995a-010a15e0b1dc/nmstate-console-plugin/0.log"
Mar 13 10:16:00 crc kubenswrapper[4841]: I0313 10:16:00.186327 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556616-g99gc"]
Mar 13 10:16:00 crc kubenswrapper[4841]: I0313 10:16:00.246532 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv5rc\" (UniqueName: \"kubernetes.io/projected/7ecea745-1b2d-4d7e-a791-5741c1757e51-kube-api-access-mv5rc\") pod \"auto-csr-approver-29556616-g99gc\" (UID: \"7ecea745-1b2d-4d7e-a791-5741c1757e51\") " pod="openshift-infra/auto-csr-approver-29556616-g99gc"
Mar 13 10:16:00 crc kubenswrapper[4841]: I0313 10:16:00.321875 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-c69x9_c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2/nmstate-handler/0.log"
Mar 13 10:16:00 crc kubenswrapper[4841]: I0313 10:16:00.348714 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv5rc\" (UniqueName: \"kubernetes.io/projected/7ecea745-1b2d-4d7e-a791-5741c1757e51-kube-api-access-mv5rc\") pod \"auto-csr-approver-29556616-g99gc\" (UID: \"7ecea745-1b2d-4d7e-a791-5741c1757e51\") " pod="openshift-infra/auto-csr-approver-29556616-g99gc"
Mar 13 10:16:00 crc kubenswrapper[4841]: I0313 10:16:00.372755 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv5rc\" (UniqueName: \"kubernetes.io/projected/7ecea745-1b2d-4d7e-a791-5741c1757e51-kube-api-access-mv5rc\") pod \"auto-csr-approver-29556616-g99gc\" (UID: \"7ecea745-1b2d-4d7e-a791-5741c1757e51\") " pod="openshift-infra/auto-csr-approver-29556616-g99gc"
Mar 13 10:16:00 crc kubenswrapper[4841]: I0313 10:16:00.424562 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-c2r92_4b560358-f566-41b2-a5da-89b9b3c173f3/kube-rbac-proxy/0.log"
Mar 13 10:16:00 crc kubenswrapper[4841]: I0313 10:16:00.439335 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-c2r92_4b560358-f566-41b2-a5da-89b9b3c173f3/nmstate-metrics/0.log"
Mar 13 10:16:00 crc kubenswrapper[4841]: I0313 10:16:00.479921 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556616-g99gc"
Mar 13 10:16:00 crc kubenswrapper[4841]: I0313 10:16:00.653032 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-w54ng_1422b359-9a6d-430e-8cb6-5cf498e32422/nmstate-operator/0.log"
Mar 13 10:16:00 crc kubenswrapper[4841]: I0313 10:16:00.667718 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-nhjzn_1080bc76-f294-4c2b-8a4b-165d657a4057/nmstate-webhook/0.log"
Mar 13 10:16:01 crc kubenswrapper[4841]: I0313 10:16:01.020736 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556616-g99gc"]
Mar 13 10:16:01 crc kubenswrapper[4841]: I0313 10:16:01.638844 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556616-g99gc" event={"ID":"7ecea745-1b2d-4d7e-a791-5741c1757e51","Type":"ContainerStarted","Data":"757d486d7277eb453cf9c33f57ee99ec7084c652618a0df03fc49efd0a14fb1e"}
Mar 13 10:16:02 crc kubenswrapper[4841]: I0313 10:16:02.652452 4841 generic.go:334] "Generic (PLEG): container finished" podID="7ecea745-1b2d-4d7e-a791-5741c1757e51" containerID="cfcd4490f65a3b6445ca3f33b47c66ce5e1da712532cd323e6ffbe128043f05f" exitCode=0
Mar 13 10:16:02 crc kubenswrapper[4841]: I0313 10:16:02.652709 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556616-g99gc" event={"ID":"7ecea745-1b2d-4d7e-a791-5741c1757e51","Type":"ContainerDied","Data":"cfcd4490f65a3b6445ca3f33b47c66ce5e1da712532cd323e6ffbe128043f05f"}
Mar 13 10:16:04 crc kubenswrapper[4841]: I0313 10:16:04.076661 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556616-g99gc"
Mar 13 10:16:04 crc kubenswrapper[4841]: I0313 10:16:04.244468 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv5rc\" (UniqueName: \"kubernetes.io/projected/7ecea745-1b2d-4d7e-a791-5741c1757e51-kube-api-access-mv5rc\") pod \"7ecea745-1b2d-4d7e-a791-5741c1757e51\" (UID: \"7ecea745-1b2d-4d7e-a791-5741c1757e51\") "
Mar 13 10:16:04 crc kubenswrapper[4841]: I0313 10:16:04.251379 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ecea745-1b2d-4d7e-a791-5741c1757e51-kube-api-access-mv5rc" (OuterVolumeSpecName: "kube-api-access-mv5rc") pod "7ecea745-1b2d-4d7e-a791-5741c1757e51" (UID: "7ecea745-1b2d-4d7e-a791-5741c1757e51"). InnerVolumeSpecName "kube-api-access-mv5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:16:04 crc kubenswrapper[4841]: I0313 10:16:04.347705 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv5rc\" (UniqueName: \"kubernetes.io/projected/7ecea745-1b2d-4d7e-a791-5741c1757e51-kube-api-access-mv5rc\") on node \"crc\" DevicePath \"\""
Mar 13 10:16:04 crc kubenswrapper[4841]: I0313 10:16:04.672330 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556616-g99gc"
Mar 13 10:16:04 crc kubenswrapper[4841]: I0313 10:16:04.677355 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556616-g99gc" event={"ID":"7ecea745-1b2d-4d7e-a791-5741c1757e51","Type":"ContainerDied","Data":"757d486d7277eb453cf9c33f57ee99ec7084c652618a0df03fc49efd0a14fb1e"}
Mar 13 10:16:04 crc kubenswrapper[4841]: I0313 10:16:04.677409 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="757d486d7277eb453cf9c33f57ee99ec7084c652618a0df03fc49efd0a14fb1e"
Mar 13 10:16:05 crc kubenswrapper[4841]: I0313 10:16:05.151475 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556610-94kqr"]
Mar 13 10:16:05 crc kubenswrapper[4841]: I0313 10:16:05.161221 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556610-94kqr"]
Mar 13 10:16:06 crc kubenswrapper[4841]: I0313 10:16:06.007777 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="238e9451-eeab-4566-bb9e-3abdaa91978c" path="/var/lib/kubelet/pods/238e9451-eeab-4566-bb9e-3abdaa91978c/volumes"
Mar 13 10:16:06 crc kubenswrapper[4841]: I0313 10:16:06.752989 4841 scope.go:117] "RemoveContainer" containerID="ce31be1973248023d9b9155a156c3fa9c96133a4fa3b41a43455b1fe4db14dd2"
Mar 13 10:16:08 crc kubenswrapper[4841]: I0313 10:16:08.002669 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912"
Mar 13 10:16:08 crc kubenswrapper[4841]: E0313 10:16:08.004241 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2"
Mar 13 10:16:14 crc kubenswrapper[4841]: I0313 10:16:14.261532 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-z7wl5_08be6515-b41c-481b-ba89-b939e4cfa067/prometheus-operator/0.log"
Mar 13 10:16:14 crc kubenswrapper[4841]: I0313 10:16:14.476634 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5586657968-646dp_3304bfd0-8191-45c7-8c50-f16e137a6de8/prometheus-operator-admission-webhook/0.log"
Mar 13 10:16:14 crc kubenswrapper[4841]: I0313 10:16:14.557483 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5586657968-njcw5_3fde31d7-89e1-4aa5-a848-2b018eae16b1/prometheus-operator-admission-webhook/0.log"
Mar 13 10:16:14 crc kubenswrapper[4841]: I0313 10:16:14.713862 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-5vnkd_64eb3c86-385d-45d5-8dee-df851d8c3a74/operator/0.log"
Mar 13 10:16:14 crc kubenswrapper[4841]: I0313 10:16:14.846128 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-r5b4d_b49125a7-562a-421b-b5eb-126312e6e85d/perses-operator/0.log"
Mar 13 10:16:18 crc kubenswrapper[4841]: I0313 10:16:18.995275 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912"
Mar 13 10:16:18 crc kubenswrapper[4841]: E0313 10:16:18.996051 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2"
Mar 13 10:16:27 crc kubenswrapper[4841]: I0313 10:16:27.538477 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-6x8lt_69db2e0c-e892-4c3c-909b-3f7ba4d650bb/kube-rbac-proxy/0.log"
Mar 13 10:16:27 crc kubenswrapper[4841]: I0313 10:16:27.650772 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-6x8lt_69db2e0c-e892-4c3c-909b-3f7ba4d650bb/controller/0.log"
Mar 13 10:16:27 crc kubenswrapper[4841]: I0313 10:16:27.787822 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-frr-files/0.log"
Mar 13 10:16:27 crc kubenswrapper[4841]: I0313 10:16:27.966521 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-frr-files/0.log"
Mar 13 10:16:27 crc kubenswrapper[4841]: I0313 10:16:27.975128 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-metrics/0.log"
Mar 13 10:16:27 crc kubenswrapper[4841]: I0313 10:16:27.978672 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-reloader/0.log"
Mar 13 10:16:28 crc kubenswrapper[4841]: I0313 10:16:28.014532 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-reloader/0.log"
Mar 13 10:16:28 crc kubenswrapper[4841]: I0313 10:16:28.165063 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-metrics/0.log"
Mar 13 10:16:28 crc kubenswrapper[4841]: I0313 10:16:28.182346 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-reloader/0.log"
Mar 13 10:16:28 crc kubenswrapper[4841]: I0313 10:16:28.205749 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-frr-files/0.log"
Mar 13 10:16:28 crc kubenswrapper[4841]: I0313 10:16:28.237618 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-metrics/0.log"
Mar 13 10:16:28 crc kubenswrapper[4841]: I0313 10:16:28.433712 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-frr-files/0.log"
Mar 13 10:16:28 crc kubenswrapper[4841]: I0313 10:16:28.436692 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-reloader/0.log"
Mar 13 10:16:28 crc kubenswrapper[4841]: I0313 10:16:28.465174 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-metrics/0.log"
Mar 13 10:16:28 crc kubenswrapper[4841]: I0313 10:16:28.474519 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/controller/0.log"
Mar 13 10:16:28 crc kubenswrapper[4841]: I0313 10:16:28.990690 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/frr-metrics/0.log"
Mar 13 10:16:29 crc kubenswrapper[4841]: I0313 10:16:29.031229 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/kube-rbac-proxy/0.log"
Mar 13 10:16:29 crc kubenswrapper[4841]: I0313 10:16:29.076829 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/kube-rbac-proxy-frr/0.log"
Mar 13 10:16:29 crc kubenswrapper[4841]: I0313 10:16:29.178933 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/reloader/0.log"
Mar 13 10:16:29 crc kubenswrapper[4841]: I0313 10:16:29.902821 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-qnhkg_1a96f06d-396f-44a0-a357-f8b615676b3f/frr-k8s-webhook-server/0.log"
Mar 13 10:16:29 crc kubenswrapper[4841]: I0313 10:16:29.919382 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6d6c4d5946-gtbzk_683811db-740f-4604-b93b-c8134590a46a/manager/0.log"
Mar 13 10:16:30 crc kubenswrapper[4841]: I0313 10:16:30.080485 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-744cf67d4f-ldddk_2735aa21-2a11-4909-988a-f2add6dae771/webhook-server/0.log"
Mar 13 10:16:30 crc kubenswrapper[4841]: I0313 10:16:30.138584 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lpmg6_62891ab6-67e5-4c9e-83b6-aec814f74ca6/kube-rbac-proxy/0.log"
Mar 13 10:16:30 crc kubenswrapper[4841]: I0313 10:16:30.825465 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lpmg6_62891ab6-67e5-4c9e-83b6-aec814f74ca6/speaker/0.log"
Mar 13 10:16:31 crc kubenswrapper[4841]: I0313 10:16:31.629323 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/frr/0.log"
Mar 13 10:16:32 crc kubenswrapper[4841]: I0313 10:16:32.995159 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912"
Mar 13 10:16:32 crc kubenswrapper[4841]: E0313 10:16:32.995737 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2"
Mar 13 10:16:42 crc kubenswrapper[4841]: I0313 10:16:42.482558 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc_b11048a2-12d1-437e-80b5-05e10ccc4b50/util/0.log"
Mar 13 10:16:42 crc kubenswrapper[4841]: I0313 10:16:42.656598 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc_b11048a2-12d1-437e-80b5-05e10ccc4b50/pull/0.log"
Mar 13 10:16:42 crc kubenswrapper[4841]: I0313 10:16:42.693508 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc_b11048a2-12d1-437e-80b5-05e10ccc4b50/pull/0.log"
Mar 13 10:16:42 crc kubenswrapper[4841]: I0313 10:16:42.695181 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc_b11048a2-12d1-437e-80b5-05e10ccc4b50/util/0.log"
Mar 13 10:16:42 crc kubenswrapper[4841]: I0313 10:16:42.827406 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc_b11048a2-12d1-437e-80b5-05e10ccc4b50/util/0.log"
Mar 13 10:16:42 crc kubenswrapper[4841]: I0313 10:16:42.852590 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc_b11048a2-12d1-437e-80b5-05e10ccc4b50/pull/0.log"
Mar 13 10:16:42 crc kubenswrapper[4841]: I0313 10:16:42.903676 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc_b11048a2-12d1-437e-80b5-05e10ccc4b50/extract/0.log"
Mar 13 10:16:43 crc kubenswrapper[4841]: I0313 10:16:43.000369 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s_8b79260a-a276-45fa-abfd-5d471f82142a/util/0.log"
Mar 13 10:16:43 crc kubenswrapper[4841]: I0313 10:16:43.167624 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s_8b79260a-a276-45fa-abfd-5d471f82142a/util/0.log"
Mar 13 10:16:43 crc kubenswrapper[4841]: I0313 10:16:43.184376 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s_8b79260a-a276-45fa-abfd-5d471f82142a/pull/0.log"
Mar 13 10:16:43 crc kubenswrapper[4841]: I0313 10:16:43.196690 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s_8b79260a-a276-45fa-abfd-5d471f82142a/pull/0.log"
Mar 13 10:16:43 crc kubenswrapper[4841]: I0313 10:16:43.342559 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s_8b79260a-a276-45fa-abfd-5d471f82142a/util/0.log"
Mar 13 10:16:43 crc kubenswrapper[4841]: I0313 10:16:43.400218 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s_8b79260a-a276-45fa-abfd-5d471f82142a/extract/0.log"
Mar 13 10:16:43 crc kubenswrapper[4841]: I0313 10:16:43.408366 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s_8b79260a-a276-45fa-abfd-5d471f82142a/pull/0.log"
Mar 13 10:16:43 crc kubenswrapper[4841]: I0313 10:16:43.526849 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6_646b8b76-d9eb-4e25-bbe5-b6d42b9f0961/util/0.log"
Mar 13 10:16:43 crc kubenswrapper[4841]: I0313 10:16:43.676983 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6_646b8b76-d9eb-4e25-bbe5-b6d42b9f0961/pull/0.log"
Mar 13 10:16:43 crc kubenswrapper[4841]: I0313 10:16:43.698950 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6_646b8b76-d9eb-4e25-bbe5-b6d42b9f0961/util/0.log"
Mar 13 10:16:43 crc kubenswrapper[4841]: I0313 10:16:43.708335 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6_646b8b76-d9eb-4e25-bbe5-b6d42b9f0961/pull/0.log"
Mar 13 10:16:43 crc kubenswrapper[4841]: I0313 10:16:43.897576 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6_646b8b76-d9eb-4e25-bbe5-b6d42b9f0961/pull/0.log"
Mar 13 10:16:43 crc kubenswrapper[4841]: I0313 10:16:43.930617 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6_646b8b76-d9eb-4e25-bbe5-b6d42b9f0961/util/0.log"
Mar 13 10:16:43 crc kubenswrapper[4841]: I0313 10:16:43.936580 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6_646b8b76-d9eb-4e25-bbe5-b6d42b9f0961/extract/0.log"
Mar 13 10:16:44 crc kubenswrapper[4841]: I0313 10:16:44.092152 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-96j6p_44d8c242-fbc8-4c6a-93b1-146498533256/extract-utilities/0.log"
Mar 13 10:16:44 crc kubenswrapper[4841]: I0313 10:16:44.297175 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-96j6p_44d8c242-fbc8-4c6a-93b1-146498533256/extract-utilities/0.log"
Mar 13 10:16:44 crc kubenswrapper[4841]: I0313 10:16:44.325562 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-96j6p_44d8c242-fbc8-4c6a-93b1-146498533256/extract-content/0.log"
Mar 13 10:16:44 crc kubenswrapper[4841]: I0313 10:16:44.348560 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-96j6p_44d8c242-fbc8-4c6a-93b1-146498533256/extract-content/0.log"
Mar 13 10:16:44 crc kubenswrapper[4841]: I0313 10:16:44.481931 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-96j6p_44d8c242-fbc8-4c6a-93b1-146498533256/extract-utilities/0.log"
Mar 13 10:16:44 crc kubenswrapper[4841]: I0313 10:16:44.526902 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-96j6p_44d8c242-fbc8-4c6a-93b1-146498533256/extract-content/0.log"
Mar 13 10:16:44 crc kubenswrapper[4841]: I0313 10:16:44.681720 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-szx4f_eb0bb4de-58c3-4d6e-a22d-735e9346f228/extract-utilities/0.log"
Mar 13 10:16:45 crc kubenswrapper[4841]: I0313 10:16:45.145078 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-96j6p_44d8c242-fbc8-4c6a-93b1-146498533256/registry-server/0.log"
Mar 13 10:16:45 crc kubenswrapper[4841]: I0313 10:16:45.339875 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-szx4f_eb0bb4de-58c3-4d6e-a22d-735e9346f228/extract-content/0.log"
Mar 13 10:16:45 crc kubenswrapper[4841]: I0313 10:16:45.358439 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-szx4f_eb0bb4de-58c3-4d6e-a22d-735e9346f228/extract-content/0.log"
Mar 13 10:16:45 crc kubenswrapper[4841]: I0313 10:16:45.366240 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-szx4f_eb0bb4de-58c3-4d6e-a22d-735e9346f228/extract-utilities/0.log"
Mar 13 10:16:45 crc kubenswrapper[4841]: I0313 10:16:45.558284 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-szx4f_eb0bb4de-58c3-4d6e-a22d-735e9346f228/extract-utilities/0.log"
Mar 13 10:16:45 crc kubenswrapper[4841]: I0313 10:16:45.597088 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-szx4f_eb0bb4de-58c3-4d6e-a22d-735e9346f228/extract-content/0.log"
Mar 13 10:16:45 crc kubenswrapper[4841]: I0313 10:16:45.849950 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgpqr_6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c/extract-utilities/0.log"
Mar 13 10:16:45 crc kubenswrapper[4841]: I0313 10:16:45.859258 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-4n6sl_f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45/marketplace-operator/0.log"
Mar 13 10:16:45 crc kubenswrapper[4841]: I0313 10:16:45.879792 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-szx4f_eb0bb4de-58c3-4d6e-a22d-735e9346f228/registry-server/0.log"
Mar 13 10:16:46 crc kubenswrapper[4841]: I0313 10:16:46.060249 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgpqr_6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c/extract-content/0.log"
Mar 13 10:16:46 crc kubenswrapper[4841]: I0313 10:16:46.074490 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgpqr_6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c/extract-utilities/0.log"
Mar 13 10:16:46 crc kubenswrapper[4841]: I0313 10:16:46.077684 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgpqr_6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c/extract-content/0.log"
Mar 13 10:16:46 crc kubenswrapper[4841]: I0313 10:16:46.224236 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgpqr_6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c/extract-content/0.log"
Mar 13 10:16:46 crc kubenswrapper[4841]: I0313 10:16:46.246437 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgpqr_6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c/extract-utilities/0.log"
Mar 13 10:16:46 crc kubenswrapper[4841]: I0313 10:16:46.397769 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgpqr_6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c/registry-server/0.log"
Mar 13 10:16:46 crc kubenswrapper[4841]: I0313 10:16:46.464188 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pqqf_5099aa18-a4a1-40d1-b8c2-dc8a5a26e912/extract-utilities/0.log"
Mar 13 10:16:47 crc kubenswrapper[4841]: I0313 10:16:47.062601 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pqqf_5099aa18-a4a1-40d1-b8c2-dc8a5a26e912/extract-content/0.log"
Mar 13 10:16:47 crc kubenswrapper[4841]: I0313 10:16:47.106986 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pqqf_5099aa18-a4a1-40d1-b8c2-dc8a5a26e912/extract-content/0.log"
Mar 13 10:16:47 crc kubenswrapper[4841]: I0313 10:16:47.136124 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pqqf_5099aa18-a4a1-40d1-b8c2-dc8a5a26e912/extract-utilities/0.log"
Mar 13 10:16:47 crc kubenswrapper[4841]: I0313 10:16:47.268797 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pqqf_5099aa18-a4a1-40d1-b8c2-dc8a5a26e912/extract-utilities/0.log"
Mar 13 10:16:47 crc kubenswrapper[4841]: I0313 10:16:47.287228 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pqqf_5099aa18-a4a1-40d1-b8c2-dc8a5a26e912/extract-content/0.log"
Mar 13 10:16:47 crc kubenswrapper[4841]: I0313 10:16:47.927491 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pqqf_5099aa18-a4a1-40d1-b8c2-dc8a5a26e912/registry-server/0.log"
Mar 13 10:16:47 crc kubenswrapper[4841]: I0313 10:16:47.995865 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912"
Mar 13 10:16:47 crc kubenswrapper[4841]: E0313 10:16:47.996206 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2"
Mar 13 10:16:58 crc kubenswrapper[4841]: I0313 10:16:58.995458 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912"
Mar 13 10:16:58 crc kubenswrapper[4841]: E0313 10:16:58.996334 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2"
Mar 13 10:16:59 crc kubenswrapper[4841]: I0313 10:16:59.344129 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5586657968-646dp_3304bfd0-8191-45c7-8c50-f16e137a6de8/prometheus-operator-admission-webhook/0.log"
Mar 13 10:16:59 crc kubenswrapper[4841]: I0313 10:16:59.369979 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-z7wl5_08be6515-b41c-481b-ba89-b939e4cfa067/prometheus-operator/0.log"
Mar 13 10:16:59 crc kubenswrapper[4841]: I0313 10:16:59.406999 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5586657968-njcw5_3fde31d7-89e1-4aa5-a848-2b018eae16b1/prometheus-operator-admission-webhook/0.log"
Mar 13 10:16:59 crc kubenswrapper[4841]: I0313 10:16:59.551982 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-5vnkd_64eb3c86-385d-45d5-8dee-df851d8c3a74/operator/0.log"
Mar 13 10:16:59 crc kubenswrapper[4841]: I0313 10:16:59.581003 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-r5b4d_b49125a7-562a-421b-b5eb-126312e6e85d/perses-operator/0.log"
Mar 13 10:17:13 crc kubenswrapper[4841]: I0313 10:17:13.995379 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912"
Mar 13 10:17:13 crc kubenswrapper[4841]: E0313 10:17:13.996241 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2"
Mar 13 10:17:26 crc kubenswrapper[4841]: I0313 10:17:26.995024 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912"
Mar 13 10:17:26 crc kubenswrapper[4841]: E0313 10:17:26.995893 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2"
Mar 13 10:17:28 crc kubenswrapper[4841]: I0313 10:17:28.921281 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8gx9j"]
Mar 13 10:17:28 crc kubenswrapper[4841]: E0313 10:17:28.922818 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ecea745-1b2d-4d7e-a791-5741c1757e51" containerName="oc"
Mar 13 10:17:28 crc kubenswrapper[4841]: I0313 10:17:28.922938 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ecea745-1b2d-4d7e-a791-5741c1757e51" containerName="oc"
Mar 13 10:17:28 crc kubenswrapper[4841]: I0313 10:17:28.923257 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ecea745-1b2d-4d7e-a791-5741c1757e51" containerName="oc"
Mar 13 10:17:28 crc kubenswrapper[4841]: I0313 10:17:28.925209 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8gx9j"
Mar 13 10:17:29 crc kubenswrapper[4841]: I0313 10:17:28.933162 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8gx9j"]
Mar 13 10:17:29 crc kubenswrapper[4841]: I0313 10:17:29.172892 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zbfd\" (UniqueName: \"kubernetes.io/projected/afefb0ab-d03e-41d9-81de-ae9a4c9d198f-kube-api-access-5zbfd\") pod \"community-operators-8gx9j\" (UID: \"afefb0ab-d03e-41d9-81de-ae9a4c9d198f\") " pod="openshift-marketplace/community-operators-8gx9j"
Mar 13 10:17:29 crc kubenswrapper[4841]: I0313 10:17:29.173411 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afefb0ab-d03e-41d9-81de-ae9a4c9d198f-catalog-content\") pod \"community-operators-8gx9j\" (UID: \"afefb0ab-d03e-41d9-81de-ae9a4c9d198f\") " pod="openshift-marketplace/community-operators-8gx9j"
Mar 13 10:17:29 crc kubenswrapper[4841]: I0313 10:17:29.173646 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afefb0ab-d03e-41d9-81de-ae9a4c9d198f-utilities\") pod \"community-operators-8gx9j\" (UID: \"afefb0ab-d03e-41d9-81de-ae9a4c9d198f\") " pod="openshift-marketplace/community-operators-8gx9j"
Mar 13 10:17:29 crc kubenswrapper[4841]: I0313 10:17:29.276583 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zbfd\" (UniqueName: \"kubernetes.io/projected/afefb0ab-d03e-41d9-81de-ae9a4c9d198f-kube-api-access-5zbfd\") pod \"community-operators-8gx9j\" (UID: \"afefb0ab-d03e-41d9-81de-ae9a4c9d198f\") " pod="openshift-marketplace/community-operators-8gx9j"
Mar 13 10:17:29 crc kubenswrapper[4841]: I0313 10:17:29.276680 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afefb0ab-d03e-41d9-81de-ae9a4c9d198f-catalog-content\") pod \"community-operators-8gx9j\" (UID: \"afefb0ab-d03e-41d9-81de-ae9a4c9d198f\") " pod="openshift-marketplace/community-operators-8gx9j"
Mar 13 10:17:29 crc kubenswrapper[4841]: I0313 10:17:29.276771 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afefb0ab-d03e-41d9-81de-ae9a4c9d198f-utilities\") pod \"community-operators-8gx9j\" (UID: \"afefb0ab-d03e-41d9-81de-ae9a4c9d198f\") " pod="openshift-marketplace/community-operators-8gx9j"
Mar 13 10:17:29 crc kubenswrapper[4841]: I0313 10:17:29.277475 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afefb0ab-d03e-41d9-81de-ae9a4c9d198f-utilities\") pod \"community-operators-8gx9j\" (UID: \"afefb0ab-d03e-41d9-81de-ae9a4c9d198f\") " pod="openshift-marketplace/community-operators-8gx9j"
Mar 13 10:17:29 crc kubenswrapper[4841]: I0313 10:17:29.277684 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afefb0ab-d03e-41d9-81de-ae9a4c9d198f-catalog-content\") pod \"community-operators-8gx9j\" (UID: \"afefb0ab-d03e-41d9-81de-ae9a4c9d198f\") " pod="openshift-marketplace/community-operators-8gx9j"
Mar 13 10:17:29 crc kubenswrapper[4841]: I0313 10:17:29.305053 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zbfd\" (UniqueName: \"kubernetes.io/projected/afefb0ab-d03e-41d9-81de-ae9a4c9d198f-kube-api-access-5zbfd\") pod \"community-operators-8gx9j\" (UID: \"afefb0ab-d03e-41d9-81de-ae9a4c9d198f\") " pod="openshift-marketplace/community-operators-8gx9j"
Mar 13 10:17:29 crc kubenswrapper[4841]: I0313 10:17:29.392222 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8gx9j"
Mar 13 10:17:29 crc kubenswrapper[4841]: I0313 10:17:29.979634 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8gx9j"]
Mar 13 10:17:30 crc kubenswrapper[4841]: I0313 10:17:30.554100 4841 generic.go:334] "Generic (PLEG): container finished" podID="afefb0ab-d03e-41d9-81de-ae9a4c9d198f" containerID="3feb7a5a3f49fa8c5824a8dc15c3963b51537c5ddad7f223a40fb47684cf3876" exitCode=0
Mar 13 10:17:30 crc kubenswrapper[4841]: I0313 10:17:30.554204 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gx9j" event={"ID":"afefb0ab-d03e-41d9-81de-ae9a4c9d198f","Type":"ContainerDied","Data":"3feb7a5a3f49fa8c5824a8dc15c3963b51537c5ddad7f223a40fb47684cf3876"}
Mar 13 10:17:30 crc kubenswrapper[4841]: I0313 10:17:30.554648 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gx9j" event={"ID":"afefb0ab-d03e-41d9-81de-ae9a4c9d198f","Type":"ContainerStarted","Data":"70c89ecf50055a594516bef9e0fa1b28ca84ed113d3b1877859854fc9a3a2157"}
Mar 13 10:17:31 crc kubenswrapper[4841]: I0313 10:17:31.569007 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gx9j" event={"ID":"afefb0ab-d03e-41d9-81de-ae9a4c9d198f","Type":"ContainerStarted","Data":"ea186336ec035e03c19fca19af52119859f73c8234cf39fc60708cc3d8da48c4"}
Mar 13 10:17:33 crc kubenswrapper[4841]: I0313 10:17:33.602334 4841 generic.go:334] "Generic (PLEG): container finished" podID="afefb0ab-d03e-41d9-81de-ae9a4c9d198f" containerID="ea186336ec035e03c19fca19af52119859f73c8234cf39fc60708cc3d8da48c4" exitCode=0
Mar 13 10:17:33 crc kubenswrapper[4841]: I0313 10:17:33.602405 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gx9j"
event={"ID":"afefb0ab-d03e-41d9-81de-ae9a4c9d198f","Type":"ContainerDied","Data":"ea186336ec035e03c19fca19af52119859f73c8234cf39fc60708cc3d8da48c4"} Mar 13 10:17:34 crc kubenswrapper[4841]: I0313 10:17:34.613502 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gx9j" event={"ID":"afefb0ab-d03e-41d9-81de-ae9a4c9d198f","Type":"ContainerStarted","Data":"c9e431a7d4586a46b813705249ddc9c5acee35197c0a2991194d3c75facb0a24"} Mar 13 10:17:34 crc kubenswrapper[4841]: I0313 10:17:34.640438 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8gx9j" podStartSLOduration=3.11772895 podStartE2EDuration="6.640414514s" podCreationTimestamp="2026-03-13 10:17:28 +0000 UTC" firstStartedPulling="2026-03-13 10:17:30.557979244 +0000 UTC m=+3933.287879435" lastFinishedPulling="2026-03-13 10:17:34.080664808 +0000 UTC m=+3936.810564999" observedRunningTime="2026-03-13 10:17:34.6338597 +0000 UTC m=+3937.363759901" watchObservedRunningTime="2026-03-13 10:17:34.640414514 +0000 UTC m=+3937.370314715" Mar 13 10:17:39 crc kubenswrapper[4841]: I0313 10:17:39.392720 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8gx9j" Mar 13 10:17:39 crc kubenswrapper[4841]: I0313 10:17:39.393031 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8gx9j" Mar 13 10:17:39 crc kubenswrapper[4841]: I0313 10:17:39.505520 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8gx9j" Mar 13 10:17:39 crc kubenswrapper[4841]: I0313 10:17:39.711943 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8gx9j" Mar 13 10:17:39 crc kubenswrapper[4841]: I0313 10:17:39.761356 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-8gx9j"] Mar 13 10:17:39 crc kubenswrapper[4841]: I0313 10:17:39.994689 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912" Mar 13 10:17:39 crc kubenswrapper[4841]: E0313 10:17:39.995255 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:17:41 crc kubenswrapper[4841]: I0313 10:17:41.676527 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8gx9j" podUID="afefb0ab-d03e-41d9-81de-ae9a4c9d198f" containerName="registry-server" containerID="cri-o://c9e431a7d4586a46b813705249ddc9c5acee35197c0a2991194d3c75facb0a24" gracePeriod=2 Mar 13 10:17:42 crc kubenswrapper[4841]: I0313 10:17:42.407676 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8gx9j" Mar 13 10:17:42 crc kubenswrapper[4841]: I0313 10:17:42.555752 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afefb0ab-d03e-41d9-81de-ae9a4c9d198f-catalog-content\") pod \"afefb0ab-d03e-41d9-81de-ae9a4c9d198f\" (UID: \"afefb0ab-d03e-41d9-81de-ae9a4c9d198f\") " Mar 13 10:17:42 crc kubenswrapper[4841]: I0313 10:17:42.555818 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afefb0ab-d03e-41d9-81de-ae9a4c9d198f-utilities\") pod \"afefb0ab-d03e-41d9-81de-ae9a4c9d198f\" (UID: \"afefb0ab-d03e-41d9-81de-ae9a4c9d198f\") " Mar 13 10:17:42 crc kubenswrapper[4841]: I0313 10:17:42.555887 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zbfd\" (UniqueName: \"kubernetes.io/projected/afefb0ab-d03e-41d9-81de-ae9a4c9d198f-kube-api-access-5zbfd\") pod \"afefb0ab-d03e-41d9-81de-ae9a4c9d198f\" (UID: \"afefb0ab-d03e-41d9-81de-ae9a4c9d198f\") " Mar 13 10:17:42 crc kubenswrapper[4841]: I0313 10:17:42.556887 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afefb0ab-d03e-41d9-81de-ae9a4c9d198f-utilities" (OuterVolumeSpecName: "utilities") pod "afefb0ab-d03e-41d9-81de-ae9a4c9d198f" (UID: "afefb0ab-d03e-41d9-81de-ae9a4c9d198f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:17:42 crc kubenswrapper[4841]: I0313 10:17:42.561790 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afefb0ab-d03e-41d9-81de-ae9a4c9d198f-kube-api-access-5zbfd" (OuterVolumeSpecName: "kube-api-access-5zbfd") pod "afefb0ab-d03e-41d9-81de-ae9a4c9d198f" (UID: "afefb0ab-d03e-41d9-81de-ae9a4c9d198f"). InnerVolumeSpecName "kube-api-access-5zbfd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:17:42 crc kubenswrapper[4841]: I0313 10:17:42.613277 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afefb0ab-d03e-41d9-81de-ae9a4c9d198f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "afefb0ab-d03e-41d9-81de-ae9a4c9d198f" (UID: "afefb0ab-d03e-41d9-81de-ae9a4c9d198f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:17:42 crc kubenswrapper[4841]: I0313 10:17:42.658059 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afefb0ab-d03e-41d9-81de-ae9a4c9d198f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 10:17:42 crc kubenswrapper[4841]: I0313 10:17:42.658097 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afefb0ab-d03e-41d9-81de-ae9a4c9d198f-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 10:17:42 crc kubenswrapper[4841]: I0313 10:17:42.658108 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zbfd\" (UniqueName: \"kubernetes.io/projected/afefb0ab-d03e-41d9-81de-ae9a4c9d198f-kube-api-access-5zbfd\") on node \"crc\" DevicePath \"\"" Mar 13 10:17:42 crc kubenswrapper[4841]: I0313 10:17:42.696445 4841 generic.go:334] "Generic (PLEG): container finished" podID="afefb0ab-d03e-41d9-81de-ae9a4c9d198f" containerID="c9e431a7d4586a46b813705249ddc9c5acee35197c0a2991194d3c75facb0a24" exitCode=0 Mar 13 10:17:42 crc kubenswrapper[4841]: I0313 10:17:42.696490 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gx9j" event={"ID":"afefb0ab-d03e-41d9-81de-ae9a4c9d198f","Type":"ContainerDied","Data":"c9e431a7d4586a46b813705249ddc9c5acee35197c0a2991194d3c75facb0a24"} Mar 13 10:17:42 crc kubenswrapper[4841]: I0313 10:17:42.696524 4841 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-8gx9j" Mar 13 10:17:42 crc kubenswrapper[4841]: I0313 10:17:42.696543 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gx9j" event={"ID":"afefb0ab-d03e-41d9-81de-ae9a4c9d198f","Type":"ContainerDied","Data":"70c89ecf50055a594516bef9e0fa1b28ca84ed113d3b1877859854fc9a3a2157"} Mar 13 10:17:42 crc kubenswrapper[4841]: I0313 10:17:42.696565 4841 scope.go:117] "RemoveContainer" containerID="c9e431a7d4586a46b813705249ddc9c5acee35197c0a2991194d3c75facb0a24" Mar 13 10:17:42 crc kubenswrapper[4841]: I0313 10:17:42.722994 4841 scope.go:117] "RemoveContainer" containerID="ea186336ec035e03c19fca19af52119859f73c8234cf39fc60708cc3d8da48c4" Mar 13 10:17:42 crc kubenswrapper[4841]: I0313 10:17:42.743523 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8gx9j"] Mar 13 10:17:42 crc kubenswrapper[4841]: I0313 10:17:42.754657 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8gx9j"] Mar 13 10:17:42 crc kubenswrapper[4841]: E0313 10:17:42.894172 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafefb0ab_d03e_41d9_81de_ae9a4c9d198f.slice/crio-70c89ecf50055a594516bef9e0fa1b28ca84ed113d3b1877859854fc9a3a2157\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafefb0ab_d03e_41d9_81de_ae9a4c9d198f.slice\": RecentStats: unable to find data in memory cache]" Mar 13 10:17:43 crc kubenswrapper[4841]: I0313 10:17:43.207755 4841 scope.go:117] "RemoveContainer" containerID="3feb7a5a3f49fa8c5824a8dc15c3963b51537c5ddad7f223a40fb47684cf3876" Mar 13 10:17:43 crc kubenswrapper[4841]: I0313 10:17:43.272357 4841 scope.go:117] "RemoveContainer" 
containerID="c9e431a7d4586a46b813705249ddc9c5acee35197c0a2991194d3c75facb0a24" Mar 13 10:17:43 crc kubenswrapper[4841]: E0313 10:17:43.273146 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9e431a7d4586a46b813705249ddc9c5acee35197c0a2991194d3c75facb0a24\": container with ID starting with c9e431a7d4586a46b813705249ddc9c5acee35197c0a2991194d3c75facb0a24 not found: ID does not exist" containerID="c9e431a7d4586a46b813705249ddc9c5acee35197c0a2991194d3c75facb0a24" Mar 13 10:17:43 crc kubenswrapper[4841]: I0313 10:17:43.273175 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9e431a7d4586a46b813705249ddc9c5acee35197c0a2991194d3c75facb0a24"} err="failed to get container status \"c9e431a7d4586a46b813705249ddc9c5acee35197c0a2991194d3c75facb0a24\": rpc error: code = NotFound desc = could not find container \"c9e431a7d4586a46b813705249ddc9c5acee35197c0a2991194d3c75facb0a24\": container with ID starting with c9e431a7d4586a46b813705249ddc9c5acee35197c0a2991194d3c75facb0a24 not found: ID does not exist" Mar 13 10:17:43 crc kubenswrapper[4841]: I0313 10:17:43.273195 4841 scope.go:117] "RemoveContainer" containerID="ea186336ec035e03c19fca19af52119859f73c8234cf39fc60708cc3d8da48c4" Mar 13 10:17:43 crc kubenswrapper[4841]: E0313 10:17:43.273535 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea186336ec035e03c19fca19af52119859f73c8234cf39fc60708cc3d8da48c4\": container with ID starting with ea186336ec035e03c19fca19af52119859f73c8234cf39fc60708cc3d8da48c4 not found: ID does not exist" containerID="ea186336ec035e03c19fca19af52119859f73c8234cf39fc60708cc3d8da48c4" Mar 13 10:17:43 crc kubenswrapper[4841]: I0313 10:17:43.273559 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ea186336ec035e03c19fca19af52119859f73c8234cf39fc60708cc3d8da48c4"} err="failed to get container status \"ea186336ec035e03c19fca19af52119859f73c8234cf39fc60708cc3d8da48c4\": rpc error: code = NotFound desc = could not find container \"ea186336ec035e03c19fca19af52119859f73c8234cf39fc60708cc3d8da48c4\": container with ID starting with ea186336ec035e03c19fca19af52119859f73c8234cf39fc60708cc3d8da48c4 not found: ID does not exist" Mar 13 10:17:43 crc kubenswrapper[4841]: I0313 10:17:43.273572 4841 scope.go:117] "RemoveContainer" containerID="3feb7a5a3f49fa8c5824a8dc15c3963b51537c5ddad7f223a40fb47684cf3876" Mar 13 10:17:43 crc kubenswrapper[4841]: E0313 10:17:43.273766 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3feb7a5a3f49fa8c5824a8dc15c3963b51537c5ddad7f223a40fb47684cf3876\": container with ID starting with 3feb7a5a3f49fa8c5824a8dc15c3963b51537c5ddad7f223a40fb47684cf3876 not found: ID does not exist" containerID="3feb7a5a3f49fa8c5824a8dc15c3963b51537c5ddad7f223a40fb47684cf3876" Mar 13 10:17:43 crc kubenswrapper[4841]: I0313 10:17:43.273784 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3feb7a5a3f49fa8c5824a8dc15c3963b51537c5ddad7f223a40fb47684cf3876"} err="failed to get container status \"3feb7a5a3f49fa8c5824a8dc15c3963b51537c5ddad7f223a40fb47684cf3876\": rpc error: code = NotFound desc = could not find container \"3feb7a5a3f49fa8c5824a8dc15c3963b51537c5ddad7f223a40fb47684cf3876\": container with ID starting with 3feb7a5a3f49fa8c5824a8dc15c3963b51537c5ddad7f223a40fb47684cf3876 not found: ID does not exist" Mar 13 10:17:44 crc kubenswrapper[4841]: I0313 10:17:44.013655 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afefb0ab-d03e-41d9-81de-ae9a4c9d198f" path="/var/lib/kubelet/pods/afefb0ab-d03e-41d9-81de-ae9a4c9d198f/volumes" Mar 13 10:17:52 crc kubenswrapper[4841]: I0313 
10:17:52.994772 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912" Mar 13 10:17:52 crc kubenswrapper[4841]: E0313 10:17:52.995681 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:17:54 crc kubenswrapper[4841]: I0313 10:17:54.010596 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2g4bv"] Mar 13 10:17:54 crc kubenswrapper[4841]: E0313 10:17:54.011412 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afefb0ab-d03e-41d9-81de-ae9a4c9d198f" containerName="extract-content" Mar 13 10:17:54 crc kubenswrapper[4841]: I0313 10:17:54.011431 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="afefb0ab-d03e-41d9-81de-ae9a4c9d198f" containerName="extract-content" Mar 13 10:17:54 crc kubenswrapper[4841]: E0313 10:17:54.011454 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afefb0ab-d03e-41d9-81de-ae9a4c9d198f" containerName="extract-utilities" Mar 13 10:17:54 crc kubenswrapper[4841]: I0313 10:17:54.011463 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="afefb0ab-d03e-41d9-81de-ae9a4c9d198f" containerName="extract-utilities" Mar 13 10:17:54 crc kubenswrapper[4841]: E0313 10:17:54.011500 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afefb0ab-d03e-41d9-81de-ae9a4c9d198f" containerName="registry-server" Mar 13 10:17:54 crc kubenswrapper[4841]: I0313 10:17:54.011509 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="afefb0ab-d03e-41d9-81de-ae9a4c9d198f" containerName="registry-server" Mar 13 
10:17:54 crc kubenswrapper[4841]: I0313 10:17:54.011822 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="afefb0ab-d03e-41d9-81de-ae9a4c9d198f" containerName="registry-server" Mar 13 10:17:54 crc kubenswrapper[4841]: I0313 10:17:54.013855 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2g4bv" Mar 13 10:17:54 crc kubenswrapper[4841]: I0313 10:17:54.019371 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2g4bv"] Mar 13 10:17:54 crc kubenswrapper[4841]: I0313 10:17:54.190342 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ffcf7e9-138d-4c80-971b-e4b453a68d18-utilities\") pod \"redhat-operators-2g4bv\" (UID: \"9ffcf7e9-138d-4c80-971b-e4b453a68d18\") " pod="openshift-marketplace/redhat-operators-2g4bv" Mar 13 10:17:54 crc kubenswrapper[4841]: I0313 10:17:54.190697 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vn78\" (UniqueName: \"kubernetes.io/projected/9ffcf7e9-138d-4c80-971b-e4b453a68d18-kube-api-access-7vn78\") pod \"redhat-operators-2g4bv\" (UID: \"9ffcf7e9-138d-4c80-971b-e4b453a68d18\") " pod="openshift-marketplace/redhat-operators-2g4bv" Mar 13 10:17:54 crc kubenswrapper[4841]: I0313 10:17:54.190747 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ffcf7e9-138d-4c80-971b-e4b453a68d18-catalog-content\") pod \"redhat-operators-2g4bv\" (UID: \"9ffcf7e9-138d-4c80-971b-e4b453a68d18\") " pod="openshift-marketplace/redhat-operators-2g4bv" Mar 13 10:17:54 crc kubenswrapper[4841]: I0313 10:17:54.292825 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9ffcf7e9-138d-4c80-971b-e4b453a68d18-utilities\") pod \"redhat-operators-2g4bv\" (UID: \"9ffcf7e9-138d-4c80-971b-e4b453a68d18\") " pod="openshift-marketplace/redhat-operators-2g4bv" Mar 13 10:17:54 crc kubenswrapper[4841]: I0313 10:17:54.292895 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vn78\" (UniqueName: \"kubernetes.io/projected/9ffcf7e9-138d-4c80-971b-e4b453a68d18-kube-api-access-7vn78\") pod \"redhat-operators-2g4bv\" (UID: \"9ffcf7e9-138d-4c80-971b-e4b453a68d18\") " pod="openshift-marketplace/redhat-operators-2g4bv" Mar 13 10:17:54 crc kubenswrapper[4841]: I0313 10:17:54.292943 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ffcf7e9-138d-4c80-971b-e4b453a68d18-catalog-content\") pod \"redhat-operators-2g4bv\" (UID: \"9ffcf7e9-138d-4c80-971b-e4b453a68d18\") " pod="openshift-marketplace/redhat-operators-2g4bv" Mar 13 10:17:54 crc kubenswrapper[4841]: I0313 10:17:54.293421 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ffcf7e9-138d-4c80-971b-e4b453a68d18-utilities\") pod \"redhat-operators-2g4bv\" (UID: \"9ffcf7e9-138d-4c80-971b-e4b453a68d18\") " pod="openshift-marketplace/redhat-operators-2g4bv" Mar 13 10:17:54 crc kubenswrapper[4841]: I0313 10:17:54.293497 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ffcf7e9-138d-4c80-971b-e4b453a68d18-catalog-content\") pod \"redhat-operators-2g4bv\" (UID: \"9ffcf7e9-138d-4c80-971b-e4b453a68d18\") " pod="openshift-marketplace/redhat-operators-2g4bv" Mar 13 10:17:54 crc kubenswrapper[4841]: I0313 10:17:54.802149 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vn78\" (UniqueName: 
\"kubernetes.io/projected/9ffcf7e9-138d-4c80-971b-e4b453a68d18-kube-api-access-7vn78\") pod \"redhat-operators-2g4bv\" (UID: \"9ffcf7e9-138d-4c80-971b-e4b453a68d18\") " pod="openshift-marketplace/redhat-operators-2g4bv" Mar 13 10:17:54 crc kubenswrapper[4841]: I0313 10:17:54.946554 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2g4bv" Mar 13 10:17:55 crc kubenswrapper[4841]: W0313 10:17:55.446410 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ffcf7e9_138d_4c80_971b_e4b453a68d18.slice/crio-d9b8731c4485a860822c5d5a8637af086bde8be4e23a4922c48c843ed0534e1a WatchSource:0}: Error finding container d9b8731c4485a860822c5d5a8637af086bde8be4e23a4922c48c843ed0534e1a: Status 404 returned error can't find the container with id d9b8731c4485a860822c5d5a8637af086bde8be4e23a4922c48c843ed0534e1a Mar 13 10:17:55 crc kubenswrapper[4841]: I0313 10:17:55.453078 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2g4bv"] Mar 13 10:17:55 crc kubenswrapper[4841]: I0313 10:17:55.819659 4841 generic.go:334] "Generic (PLEG): container finished" podID="9ffcf7e9-138d-4c80-971b-e4b453a68d18" containerID="20c4f21bd570c0c636d25b842e8e410630b97d90374a8ccffc19588a96baf82b" exitCode=0 Mar 13 10:17:55 crc kubenswrapper[4841]: I0313 10:17:55.819821 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g4bv" event={"ID":"9ffcf7e9-138d-4c80-971b-e4b453a68d18","Type":"ContainerDied","Data":"20c4f21bd570c0c636d25b842e8e410630b97d90374a8ccffc19588a96baf82b"} Mar 13 10:17:55 crc kubenswrapper[4841]: I0313 10:17:55.819980 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g4bv" 
event={"ID":"9ffcf7e9-138d-4c80-971b-e4b453a68d18","Type":"ContainerStarted","Data":"d9b8731c4485a860822c5d5a8637af086bde8be4e23a4922c48c843ed0534e1a"} Mar 13 10:17:56 crc kubenswrapper[4841]: I0313 10:17:56.833127 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g4bv" event={"ID":"9ffcf7e9-138d-4c80-971b-e4b453a68d18","Type":"ContainerStarted","Data":"7f0069f26c533d4daefe5648cb8e41dfa1621c140b44d8abbe160717ad9d1a78"} Mar 13 10:18:00 crc kubenswrapper[4841]: I0313 10:18:00.151859 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556618-6gln8"] Mar 13 10:18:00 crc kubenswrapper[4841]: I0313 10:18:00.155641 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556618-6gln8" Mar 13 10:18:00 crc kubenswrapper[4841]: I0313 10:18:00.158707 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 10:18:00 crc kubenswrapper[4841]: I0313 10:18:00.160635 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 10:18:00 crc kubenswrapper[4841]: I0313 10:18:00.160777 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 10:18:00 crc kubenswrapper[4841]: I0313 10:18:00.166166 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556618-6gln8"] Mar 13 10:18:00 crc kubenswrapper[4841]: I0313 10:18:00.223936 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77jmq\" (UniqueName: \"kubernetes.io/projected/95c6ef1a-2697-4a89-91d8-8e23761f388a-kube-api-access-77jmq\") pod \"auto-csr-approver-29556618-6gln8\" (UID: \"95c6ef1a-2697-4a89-91d8-8e23761f388a\") " pod="openshift-infra/auto-csr-approver-29556618-6gln8" Mar 13 10:18:00 crc 
kubenswrapper[4841]: I0313 10:18:00.326089 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77jmq\" (UniqueName: \"kubernetes.io/projected/95c6ef1a-2697-4a89-91d8-8e23761f388a-kube-api-access-77jmq\") pod \"auto-csr-approver-29556618-6gln8\" (UID: \"95c6ef1a-2697-4a89-91d8-8e23761f388a\") " pod="openshift-infra/auto-csr-approver-29556618-6gln8" Mar 13 10:18:00 crc kubenswrapper[4841]: I0313 10:18:00.346197 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77jmq\" (UniqueName: \"kubernetes.io/projected/95c6ef1a-2697-4a89-91d8-8e23761f388a-kube-api-access-77jmq\") pod \"auto-csr-approver-29556618-6gln8\" (UID: \"95c6ef1a-2697-4a89-91d8-8e23761f388a\") " pod="openshift-infra/auto-csr-approver-29556618-6gln8" Mar 13 10:18:00 crc kubenswrapper[4841]: I0313 10:18:00.493382 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556618-6gln8" Mar 13 10:18:02 crc kubenswrapper[4841]: I0313 10:18:02.547085 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556618-6gln8"] Mar 13 10:18:02 crc kubenswrapper[4841]: I0313 10:18:02.555284 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 10:18:03 crc kubenswrapper[4841]: I0313 10:18:03.383440 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556618-6gln8" event={"ID":"95c6ef1a-2697-4a89-91d8-8e23761f388a","Type":"ContainerStarted","Data":"6135897eaf17bdff910635343513f5fc86ca32946d575858bbf66a74ba9a2f7a"} Mar 13 10:18:04 crc kubenswrapper[4841]: I0313 10:18:04.396426 4841 generic.go:334] "Generic (PLEG): container finished" podID="9ffcf7e9-138d-4c80-971b-e4b453a68d18" containerID="7f0069f26c533d4daefe5648cb8e41dfa1621c140b44d8abbe160717ad9d1a78" exitCode=0 Mar 13 10:18:04 crc kubenswrapper[4841]: I0313 10:18:04.396506 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g4bv" event={"ID":"9ffcf7e9-138d-4c80-971b-e4b453a68d18","Type":"ContainerDied","Data":"7f0069f26c533d4daefe5648cb8e41dfa1621c140b44d8abbe160717ad9d1a78"} Mar 13 10:18:04 crc kubenswrapper[4841]: I0313 10:18:04.995806 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912" Mar 13 10:18:05 crc kubenswrapper[4841]: I0313 10:18:05.412793 4841 generic.go:334] "Generic (PLEG): container finished" podID="95c6ef1a-2697-4a89-91d8-8e23761f388a" containerID="cc433097b925f45b6e7d151abe992c9429f0a171df09d9aa5269aa6d3e2bcb60" exitCode=0 Mar 13 10:18:05 crc kubenswrapper[4841]: I0313 10:18:05.412883 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556618-6gln8" event={"ID":"95c6ef1a-2697-4a89-91d8-8e23761f388a","Type":"ContainerDied","Data":"cc433097b925f45b6e7d151abe992c9429f0a171df09d9aa5269aa6d3e2bcb60"} Mar 13 10:18:05 crc kubenswrapper[4841]: I0313 10:18:05.418218 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerStarted","Data":"0205971121b655349d1b80fae5f891f1a0ccfe47f9f479018486b9b62522d2a3"} Mar 13 10:18:05 crc kubenswrapper[4841]: I0313 10:18:05.421857 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g4bv" event={"ID":"9ffcf7e9-138d-4c80-971b-e4b453a68d18","Type":"ContainerStarted","Data":"085bdce72f4ac9341b26d72af6e4cc90393af9f637f6fa760f91d09382044848"} Mar 13 10:18:05 crc kubenswrapper[4841]: I0313 10:18:05.443772 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2g4bv" podStartSLOduration=3.241173933 podStartE2EDuration="12.44375188s" podCreationTimestamp="2026-03-13 10:17:53 +0000 UTC" 
firstStartedPulling="2026-03-13 10:17:55.821577 +0000 UTC m=+3958.551477191" lastFinishedPulling="2026-03-13 10:18:05.024154947 +0000 UTC m=+3967.754055138" observedRunningTime="2026-03-13 10:18:05.44153219 +0000 UTC m=+3968.171432391" watchObservedRunningTime="2026-03-13 10:18:05.44375188 +0000 UTC m=+3968.173652071"
Mar 13 10:18:06 crc kubenswrapper[4841]: I0313 10:18:06.836233 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556618-6gln8"
Mar 13 10:18:06 crc kubenswrapper[4841]: I0313 10:18:06.869068 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77jmq\" (UniqueName: \"kubernetes.io/projected/95c6ef1a-2697-4a89-91d8-8e23761f388a-kube-api-access-77jmq\") pod \"95c6ef1a-2697-4a89-91d8-8e23761f388a\" (UID: \"95c6ef1a-2697-4a89-91d8-8e23761f388a\") "
Mar 13 10:18:06 crc kubenswrapper[4841]: I0313 10:18:06.875153 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95c6ef1a-2697-4a89-91d8-8e23761f388a-kube-api-access-77jmq" (OuterVolumeSpecName: "kube-api-access-77jmq") pod "95c6ef1a-2697-4a89-91d8-8e23761f388a" (UID: "95c6ef1a-2697-4a89-91d8-8e23761f388a"). InnerVolumeSpecName "kube-api-access-77jmq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:18:06 crc kubenswrapper[4841]: I0313 10:18:06.972079 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77jmq\" (UniqueName: \"kubernetes.io/projected/95c6ef1a-2697-4a89-91d8-8e23761f388a-kube-api-access-77jmq\") on node \"crc\" DevicePath \"\""
Mar 13 10:18:07 crc kubenswrapper[4841]: I0313 10:18:07.461554 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556618-6gln8" event={"ID":"95c6ef1a-2697-4a89-91d8-8e23761f388a","Type":"ContainerDied","Data":"6135897eaf17bdff910635343513f5fc86ca32946d575858bbf66a74ba9a2f7a"}
Mar 13 10:18:07 crc kubenswrapper[4841]: I0313 10:18:07.461683 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6135897eaf17bdff910635343513f5fc86ca32946d575858bbf66a74ba9a2f7a"
Mar 13 10:18:07 crc kubenswrapper[4841]: I0313 10:18:07.461761 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556618-6gln8"
Mar 13 10:18:07 crc kubenswrapper[4841]: I0313 10:18:07.908660 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556612-t87rd"]
Mar 13 10:18:07 crc kubenswrapper[4841]: I0313 10:18:07.918953 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556612-t87rd"]
Mar 13 10:18:08 crc kubenswrapper[4841]: I0313 10:18:08.005506 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47" path="/var/lib/kubelet/pods/c9c7f87e-f9dd-47d4-ab5a-f6cb551b8a47/volumes"
Mar 13 10:18:14 crc kubenswrapper[4841]: I0313 10:18:14.947008 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2g4bv"
Mar 13 10:18:14 crc kubenswrapper[4841]: I0313 10:18:14.947552 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2g4bv"
Mar 13 10:18:15 crc kubenswrapper[4841]: I0313 10:18:15.000741 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2g4bv"
Mar 13 10:18:15 crc kubenswrapper[4841]: I0313 10:18:15.577796 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2g4bv"
Mar 13 10:18:15 crc kubenswrapper[4841]: I0313 10:18:15.633087 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2g4bv"]
Mar 13 10:18:17 crc kubenswrapper[4841]: I0313 10:18:17.548094 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2g4bv" podUID="9ffcf7e9-138d-4c80-971b-e4b453a68d18" containerName="registry-server" containerID="cri-o://085bdce72f4ac9341b26d72af6e4cc90393af9f637f6fa760f91d09382044848" gracePeriod=2
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.016286 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2g4bv"
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.188341 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vn78\" (UniqueName: \"kubernetes.io/projected/9ffcf7e9-138d-4c80-971b-e4b453a68d18-kube-api-access-7vn78\") pod \"9ffcf7e9-138d-4c80-971b-e4b453a68d18\" (UID: \"9ffcf7e9-138d-4c80-971b-e4b453a68d18\") "
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.188445 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ffcf7e9-138d-4c80-971b-e4b453a68d18-utilities\") pod \"9ffcf7e9-138d-4c80-971b-e4b453a68d18\" (UID: \"9ffcf7e9-138d-4c80-971b-e4b453a68d18\") "
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.188561 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ffcf7e9-138d-4c80-971b-e4b453a68d18-catalog-content\") pod \"9ffcf7e9-138d-4c80-971b-e4b453a68d18\" (UID: \"9ffcf7e9-138d-4c80-971b-e4b453a68d18\") "
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.189396 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ffcf7e9-138d-4c80-971b-e4b453a68d18-utilities" (OuterVolumeSpecName: "utilities") pod "9ffcf7e9-138d-4c80-971b-e4b453a68d18" (UID: "9ffcf7e9-138d-4c80-971b-e4b453a68d18"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.221500 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ffcf7e9-138d-4c80-971b-e4b453a68d18-kube-api-access-7vn78" (OuterVolumeSpecName: "kube-api-access-7vn78") pod "9ffcf7e9-138d-4c80-971b-e4b453a68d18" (UID: "9ffcf7e9-138d-4c80-971b-e4b453a68d18"). InnerVolumeSpecName "kube-api-access-7vn78". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.290995 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vn78\" (UniqueName: \"kubernetes.io/projected/9ffcf7e9-138d-4c80-971b-e4b453a68d18-kube-api-access-7vn78\") on node \"crc\" DevicePath \"\""
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.291028 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ffcf7e9-138d-4c80-971b-e4b453a68d18-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.310951 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ffcf7e9-138d-4c80-971b-e4b453a68d18-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ffcf7e9-138d-4c80-971b-e4b453a68d18" (UID: "9ffcf7e9-138d-4c80-971b-e4b453a68d18"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.393622 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ffcf7e9-138d-4c80-971b-e4b453a68d18-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.559291 4841 generic.go:334] "Generic (PLEG): container finished" podID="9ffcf7e9-138d-4c80-971b-e4b453a68d18" containerID="085bdce72f4ac9341b26d72af6e4cc90393af9f637f6fa760f91d09382044848" exitCode=0
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.559334 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g4bv" event={"ID":"9ffcf7e9-138d-4c80-971b-e4b453a68d18","Type":"ContainerDied","Data":"085bdce72f4ac9341b26d72af6e4cc90393af9f637f6fa760f91d09382044848"}
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.559377 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2g4bv"
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.559385 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g4bv" event={"ID":"9ffcf7e9-138d-4c80-971b-e4b453a68d18","Type":"ContainerDied","Data":"d9b8731c4485a860822c5d5a8637af086bde8be4e23a4922c48c843ed0534e1a"}
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.559411 4841 scope.go:117] "RemoveContainer" containerID="085bdce72f4ac9341b26d72af6e4cc90393af9f637f6fa760f91d09382044848"
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.588448 4841 scope.go:117] "RemoveContainer" containerID="7f0069f26c533d4daefe5648cb8e41dfa1621c140b44d8abbe160717ad9d1a78"
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.593747 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2g4bv"]
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.604743 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2g4bv"]
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.620645 4841 scope.go:117] "RemoveContainer" containerID="20c4f21bd570c0c636d25b842e8e410630b97d90374a8ccffc19588a96baf82b"
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.658841 4841 scope.go:117] "RemoveContainer" containerID="085bdce72f4ac9341b26d72af6e4cc90393af9f637f6fa760f91d09382044848"
Mar 13 10:18:18 crc kubenswrapper[4841]: E0313 10:18:18.659256 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"085bdce72f4ac9341b26d72af6e4cc90393af9f637f6fa760f91d09382044848\": container with ID starting with 085bdce72f4ac9341b26d72af6e4cc90393af9f637f6fa760f91d09382044848 not found: ID does not exist" containerID="085bdce72f4ac9341b26d72af6e4cc90393af9f637f6fa760f91d09382044848"
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.659318 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"085bdce72f4ac9341b26d72af6e4cc90393af9f637f6fa760f91d09382044848"} err="failed to get container status \"085bdce72f4ac9341b26d72af6e4cc90393af9f637f6fa760f91d09382044848\": rpc error: code = NotFound desc = could not find container \"085bdce72f4ac9341b26d72af6e4cc90393af9f637f6fa760f91d09382044848\": container with ID starting with 085bdce72f4ac9341b26d72af6e4cc90393af9f637f6fa760f91d09382044848 not found: ID does not exist"
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.659345 4841 scope.go:117] "RemoveContainer" containerID="7f0069f26c533d4daefe5648cb8e41dfa1621c140b44d8abbe160717ad9d1a78"
Mar 13 10:18:18 crc kubenswrapper[4841]: E0313 10:18:18.659721 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f0069f26c533d4daefe5648cb8e41dfa1621c140b44d8abbe160717ad9d1a78\": container with ID starting with 7f0069f26c533d4daefe5648cb8e41dfa1621c140b44d8abbe160717ad9d1a78 not found: ID does not exist" containerID="7f0069f26c533d4daefe5648cb8e41dfa1621c140b44d8abbe160717ad9d1a78"
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.659755 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f0069f26c533d4daefe5648cb8e41dfa1621c140b44d8abbe160717ad9d1a78"} err="failed to get container status \"7f0069f26c533d4daefe5648cb8e41dfa1621c140b44d8abbe160717ad9d1a78\": rpc error: code = NotFound desc = could not find container \"7f0069f26c533d4daefe5648cb8e41dfa1621c140b44d8abbe160717ad9d1a78\": container with ID starting with 7f0069f26c533d4daefe5648cb8e41dfa1621c140b44d8abbe160717ad9d1a78 not found: ID does not exist"
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.659778 4841 scope.go:117] "RemoveContainer" containerID="20c4f21bd570c0c636d25b842e8e410630b97d90374a8ccffc19588a96baf82b"
Mar 13 10:18:18 crc kubenswrapper[4841]: E0313 10:18:18.660024 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20c4f21bd570c0c636d25b842e8e410630b97d90374a8ccffc19588a96baf82b\": container with ID starting with 20c4f21bd570c0c636d25b842e8e410630b97d90374a8ccffc19588a96baf82b not found: ID does not exist" containerID="20c4f21bd570c0c636d25b842e8e410630b97d90374a8ccffc19588a96baf82b"
Mar 13 10:18:18 crc kubenswrapper[4841]: I0313 10:18:18.660042 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20c4f21bd570c0c636d25b842e8e410630b97d90374a8ccffc19588a96baf82b"} err="failed to get container status \"20c4f21bd570c0c636d25b842e8e410630b97d90374a8ccffc19588a96baf82b\": rpc error: code = NotFound desc = could not find container \"20c4f21bd570c0c636d25b842e8e410630b97d90374a8ccffc19588a96baf82b\": container with ID starting with 20c4f21bd570c0c636d25b842e8e410630b97d90374a8ccffc19588a96baf82b not found: ID does not exist"
Mar 13 10:18:20 crc kubenswrapper[4841]: I0313 10:18:20.006092 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ffcf7e9-138d-4c80-971b-e4b453a68d18" path="/var/lib/kubelet/pods/9ffcf7e9-138d-4c80-971b-e4b453a68d18/volumes"
Mar 13 10:18:47 crc kubenswrapper[4841]: I0313 10:18:47.905156 4841 generic.go:334] "Generic (PLEG): container finished" podID="b8697567-4038-490d-8525-0ee6f26e6508" containerID="d127eb65df32f43d439864e2c7cac35f67934e6296a4ec66a19b93ba8716a309" exitCode=0
Mar 13 10:18:47 crc kubenswrapper[4841]: I0313 10:18:47.905244 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nq5bz/must-gather-cjjq4" event={"ID":"b8697567-4038-490d-8525-0ee6f26e6508","Type":"ContainerDied","Data":"d127eb65df32f43d439864e2c7cac35f67934e6296a4ec66a19b93ba8716a309"}
Mar 13 10:18:47 crc kubenswrapper[4841]: I0313 10:18:47.906610 4841 scope.go:117] "RemoveContainer" containerID="d127eb65df32f43d439864e2c7cac35f67934e6296a4ec66a19b93ba8716a309"
Mar 13 10:18:48 crc kubenswrapper[4841]: I0313 10:18:48.000802 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nq5bz_must-gather-cjjq4_b8697567-4038-490d-8525-0ee6f26e6508/gather/0.log"
Mar 13 10:18:56 crc kubenswrapper[4841]: I0313 10:18:56.879169 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nq5bz/must-gather-cjjq4"]
Mar 13 10:18:56 crc kubenswrapper[4841]: I0313 10:18:56.881089 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-nq5bz/must-gather-cjjq4" podUID="b8697567-4038-490d-8525-0ee6f26e6508" containerName="copy" containerID="cri-o://55c8a845ceffa5b105e41c7504d67fade846ecd0b00e324b258e5ceecec435be" gracePeriod=2
Mar 13 10:18:56 crc kubenswrapper[4841]: I0313 10:18:56.898599 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nq5bz/must-gather-cjjq4"]
Mar 13 10:18:57 crc kubenswrapper[4841]: I0313 10:18:57.694419 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nq5bz_must-gather-cjjq4_b8697567-4038-490d-8525-0ee6f26e6508/copy/0.log"
Mar 13 10:18:57 crc kubenswrapper[4841]: I0313 10:18:57.695069 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nq5bz/must-gather-cjjq4"
Mar 13 10:18:57 crc kubenswrapper[4841]: I0313 10:18:57.851881 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8697567-4038-490d-8525-0ee6f26e6508-must-gather-output\") pod \"b8697567-4038-490d-8525-0ee6f26e6508\" (UID: \"b8697567-4038-490d-8525-0ee6f26e6508\") "
Mar 13 10:18:57 crc kubenswrapper[4841]: I0313 10:18:57.851955 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8jb2\" (UniqueName: \"kubernetes.io/projected/b8697567-4038-490d-8525-0ee6f26e6508-kube-api-access-m8jb2\") pod \"b8697567-4038-490d-8525-0ee6f26e6508\" (UID: \"b8697567-4038-490d-8525-0ee6f26e6508\") "
Mar 13 10:18:57 crc kubenswrapper[4841]: I0313 10:18:57.857148 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8697567-4038-490d-8525-0ee6f26e6508-kube-api-access-m8jb2" (OuterVolumeSpecName: "kube-api-access-m8jb2") pod "b8697567-4038-490d-8525-0ee6f26e6508" (UID: "b8697567-4038-490d-8525-0ee6f26e6508"). InnerVolumeSpecName "kube-api-access-m8jb2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:18:57 crc kubenswrapper[4841]: I0313 10:18:57.954184 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8jb2\" (UniqueName: \"kubernetes.io/projected/b8697567-4038-490d-8525-0ee6f26e6508-kube-api-access-m8jb2\") on node \"crc\" DevicePath \"\""
Mar 13 10:18:58 crc kubenswrapper[4841]: I0313 10:18:58.003110 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nq5bz_must-gather-cjjq4_b8697567-4038-490d-8525-0ee6f26e6508/copy/0.log"
Mar 13 10:18:58 crc kubenswrapper[4841]: I0313 10:18:58.003696 4841 generic.go:334] "Generic (PLEG): container finished" podID="b8697567-4038-490d-8525-0ee6f26e6508" containerID="55c8a845ceffa5b105e41c7504d67fade846ecd0b00e324b258e5ceecec435be" exitCode=143
Mar 13 10:18:58 crc kubenswrapper[4841]: I0313 10:18:58.004189 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nq5bz/must-gather-cjjq4"
Mar 13 10:18:58 crc kubenswrapper[4841]: I0313 10:18:58.015634 4841 scope.go:117] "RemoveContainer" containerID="55c8a845ceffa5b105e41c7504d67fade846ecd0b00e324b258e5ceecec435be"
Mar 13 10:18:58 crc kubenswrapper[4841]: I0313 10:18:58.040786 4841 scope.go:117] "RemoveContainer" containerID="d127eb65df32f43d439864e2c7cac35f67934e6296a4ec66a19b93ba8716a309"
Mar 13 10:18:58 crc kubenswrapper[4841]: I0313 10:18:58.042551 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8697567-4038-490d-8525-0ee6f26e6508-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b8697567-4038-490d-8525-0ee6f26e6508" (UID: "b8697567-4038-490d-8525-0ee6f26e6508"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 10:18:58 crc kubenswrapper[4841]: I0313 10:18:58.056220 4841 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8697567-4038-490d-8525-0ee6f26e6508-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 13 10:18:58 crc kubenswrapper[4841]: I0313 10:18:58.078177 4841 scope.go:117] "RemoveContainer" containerID="55c8a845ceffa5b105e41c7504d67fade846ecd0b00e324b258e5ceecec435be"
Mar 13 10:18:58 crc kubenswrapper[4841]: E0313 10:18:58.081042 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55c8a845ceffa5b105e41c7504d67fade846ecd0b00e324b258e5ceecec435be\": container with ID starting with 55c8a845ceffa5b105e41c7504d67fade846ecd0b00e324b258e5ceecec435be not found: ID does not exist" containerID="55c8a845ceffa5b105e41c7504d67fade846ecd0b00e324b258e5ceecec435be"
Mar 13 10:18:58 crc kubenswrapper[4841]: I0313 10:18:58.081085 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55c8a845ceffa5b105e41c7504d67fade846ecd0b00e324b258e5ceecec435be"} err="failed to get container status \"55c8a845ceffa5b105e41c7504d67fade846ecd0b00e324b258e5ceecec435be\": rpc error: code = NotFound desc = could not find container \"55c8a845ceffa5b105e41c7504d67fade846ecd0b00e324b258e5ceecec435be\": container with ID starting with 55c8a845ceffa5b105e41c7504d67fade846ecd0b00e324b258e5ceecec435be not found: ID does not exist"
Mar 13 10:18:58 crc kubenswrapper[4841]: I0313 10:18:58.081111 4841 scope.go:117] "RemoveContainer" containerID="d127eb65df32f43d439864e2c7cac35f67934e6296a4ec66a19b93ba8716a309"
Mar 13 10:18:58 crc kubenswrapper[4841]: E0313 10:18:58.081423 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d127eb65df32f43d439864e2c7cac35f67934e6296a4ec66a19b93ba8716a309\": container with ID starting with d127eb65df32f43d439864e2c7cac35f67934e6296a4ec66a19b93ba8716a309 not found: ID does not exist" containerID="d127eb65df32f43d439864e2c7cac35f67934e6296a4ec66a19b93ba8716a309"
Mar 13 10:18:58 crc kubenswrapper[4841]: I0313 10:18:58.081452 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d127eb65df32f43d439864e2c7cac35f67934e6296a4ec66a19b93ba8716a309"} err="failed to get container status \"d127eb65df32f43d439864e2c7cac35f67934e6296a4ec66a19b93ba8716a309\": rpc error: code = NotFound desc = could not find container \"d127eb65df32f43d439864e2c7cac35f67934e6296a4ec66a19b93ba8716a309\": container with ID starting with d127eb65df32f43d439864e2c7cac35f67934e6296a4ec66a19b93ba8716a309 not found: ID does not exist"
Mar 13 10:19:00 crc kubenswrapper[4841]: I0313 10:19:00.004607 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8697567-4038-490d-8525-0ee6f26e6508" path="/var/lib/kubelet/pods/b8697567-4038-490d-8525-0ee6f26e6508/volumes"
Mar 13 10:19:06 crc kubenswrapper[4841]: I0313 10:19:06.955766 4841 scope.go:117] "RemoveContainer" containerID="1be61b2c7846f52068adff0f29788801ce91efa26ab211270145532850a99392"
Mar 13 10:20:00 crc kubenswrapper[4841]: I0313 10:20:00.160479 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556620-g729x"]
Mar 13 10:20:00 crc kubenswrapper[4841]: E0313 10:20:00.161464 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8697567-4038-490d-8525-0ee6f26e6508" containerName="copy"
Mar 13 10:20:00 crc kubenswrapper[4841]: I0313 10:20:00.161476 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8697567-4038-490d-8525-0ee6f26e6508" containerName="copy"
Mar 13 10:20:00 crc kubenswrapper[4841]: E0313 10:20:00.161490 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ffcf7e9-138d-4c80-971b-e4b453a68d18" containerName="extract-content"
Mar 13 10:20:00 crc kubenswrapper[4841]: I0313 10:20:00.161496 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ffcf7e9-138d-4c80-971b-e4b453a68d18" containerName="extract-content"
Mar 13 10:20:00 crc kubenswrapper[4841]: E0313 10:20:00.161507 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8697567-4038-490d-8525-0ee6f26e6508" containerName="gather"
Mar 13 10:20:00 crc kubenswrapper[4841]: I0313 10:20:00.161513 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8697567-4038-490d-8525-0ee6f26e6508" containerName="gather"
Mar 13 10:20:00 crc kubenswrapper[4841]: E0313 10:20:00.161525 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ffcf7e9-138d-4c80-971b-e4b453a68d18" containerName="extract-utilities"
Mar 13 10:20:00 crc kubenswrapper[4841]: I0313 10:20:00.161532 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ffcf7e9-138d-4c80-971b-e4b453a68d18" containerName="extract-utilities"
Mar 13 10:20:00 crc kubenswrapper[4841]: E0313 10:20:00.161546 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ffcf7e9-138d-4c80-971b-e4b453a68d18" containerName="registry-server"
Mar 13 10:20:00 crc kubenswrapper[4841]: I0313 10:20:00.161553 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ffcf7e9-138d-4c80-971b-e4b453a68d18" containerName="registry-server"
Mar 13 10:20:00 crc kubenswrapper[4841]: E0313 10:20:00.161572 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c6ef1a-2697-4a89-91d8-8e23761f388a" containerName="oc"
Mar 13 10:20:00 crc kubenswrapper[4841]: I0313 10:20:00.161580 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c6ef1a-2697-4a89-91d8-8e23761f388a" containerName="oc"
Mar 13 10:20:00 crc kubenswrapper[4841]: I0313 10:20:00.161773 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ffcf7e9-138d-4c80-971b-e4b453a68d18" containerName="registry-server"
Mar 13 10:20:00 crc kubenswrapper[4841]: I0313 10:20:00.161793 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="95c6ef1a-2697-4a89-91d8-8e23761f388a" containerName="oc"
Mar 13 10:20:00 crc kubenswrapper[4841]: I0313 10:20:00.161808 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8697567-4038-490d-8525-0ee6f26e6508" containerName="gather"
Mar 13 10:20:00 crc kubenswrapper[4841]: I0313 10:20:00.161822 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8697567-4038-490d-8525-0ee6f26e6508" containerName="copy"
Mar 13 10:20:00 crc kubenswrapper[4841]: I0313 10:20:00.162532 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556620-g729x"
Mar 13 10:20:00 crc kubenswrapper[4841]: I0313 10:20:00.165185 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7"
Mar 13 10:20:00 crc kubenswrapper[4841]: I0313 10:20:00.165400 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 10:20:00 crc kubenswrapper[4841]: I0313 10:20:00.165815 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 10:20:00 crc kubenswrapper[4841]: I0313 10:20:00.171502 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556620-g729x"]
Mar 13 10:20:00 crc kubenswrapper[4841]: I0313 10:20:00.281224 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zkkp\" (UniqueName: \"kubernetes.io/projected/a5eaa2c9-4a0a-4a53-93d3-cfabd8039cd3-kube-api-access-9zkkp\") pod \"auto-csr-approver-29556620-g729x\" (UID: \"a5eaa2c9-4a0a-4a53-93d3-cfabd8039cd3\") " pod="openshift-infra/auto-csr-approver-29556620-g729x"
Mar 13 10:20:00 crc kubenswrapper[4841]: I0313 10:20:00.383036 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zkkp\" (UniqueName: \"kubernetes.io/projected/a5eaa2c9-4a0a-4a53-93d3-cfabd8039cd3-kube-api-access-9zkkp\") pod \"auto-csr-approver-29556620-g729x\" (UID: \"a5eaa2c9-4a0a-4a53-93d3-cfabd8039cd3\") " pod="openshift-infra/auto-csr-approver-29556620-g729x"
Mar 13 10:20:00 crc kubenswrapper[4841]: I0313 10:20:00.405385 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zkkp\" (UniqueName: \"kubernetes.io/projected/a5eaa2c9-4a0a-4a53-93d3-cfabd8039cd3-kube-api-access-9zkkp\") pod \"auto-csr-approver-29556620-g729x\" (UID: \"a5eaa2c9-4a0a-4a53-93d3-cfabd8039cd3\") " pod="openshift-infra/auto-csr-approver-29556620-g729x"
Mar 13 10:20:00 crc kubenswrapper[4841]: I0313 10:20:00.510710 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556620-g729x"
Mar 13 10:20:00 crc kubenswrapper[4841]: I0313 10:20:00.944162 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556620-g729x"]
Mar 13 10:20:01 crc kubenswrapper[4841]: I0313 10:20:01.867218 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556620-g729x" event={"ID":"a5eaa2c9-4a0a-4a53-93d3-cfabd8039cd3","Type":"ContainerStarted","Data":"7085c7e319b364c645eb1fb43c46fb02409150fdcacdbc911ca66e5abfbd7142"}
Mar 13 10:20:02 crc kubenswrapper[4841]: I0313 10:20:02.877299 4841 generic.go:334] "Generic (PLEG): container finished" podID="a5eaa2c9-4a0a-4a53-93d3-cfabd8039cd3" containerID="5f755d7dae9fc35c478ecf31122c16c56fe2f6447f72fea0463d1ca63d232141" exitCode=0
Mar 13 10:20:02 crc kubenswrapper[4841]: I0313 10:20:02.877365 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556620-g729x" event={"ID":"a5eaa2c9-4a0a-4a53-93d3-cfabd8039cd3","Type":"ContainerDied","Data":"5f755d7dae9fc35c478ecf31122c16c56fe2f6447f72fea0463d1ca63d232141"}
Mar 13 10:20:04 crc kubenswrapper[4841]: I0313 10:20:04.257998 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556620-g729x"
Mar 13 10:20:04 crc kubenswrapper[4841]: I0313 10:20:04.364033 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zkkp\" (UniqueName: \"kubernetes.io/projected/a5eaa2c9-4a0a-4a53-93d3-cfabd8039cd3-kube-api-access-9zkkp\") pod \"a5eaa2c9-4a0a-4a53-93d3-cfabd8039cd3\" (UID: \"a5eaa2c9-4a0a-4a53-93d3-cfabd8039cd3\") "
Mar 13 10:20:04 crc kubenswrapper[4841]: I0313 10:20:04.369531 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5eaa2c9-4a0a-4a53-93d3-cfabd8039cd3-kube-api-access-9zkkp" (OuterVolumeSpecName: "kube-api-access-9zkkp") pod "a5eaa2c9-4a0a-4a53-93d3-cfabd8039cd3" (UID: "a5eaa2c9-4a0a-4a53-93d3-cfabd8039cd3"). InnerVolumeSpecName "kube-api-access-9zkkp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:20:04 crc kubenswrapper[4841]: I0313 10:20:04.467814 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zkkp\" (UniqueName: \"kubernetes.io/projected/a5eaa2c9-4a0a-4a53-93d3-cfabd8039cd3-kube-api-access-9zkkp\") on node \"crc\" DevicePath \"\""
Mar 13 10:20:04 crc kubenswrapper[4841]: I0313 10:20:04.911527 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556620-g729x" event={"ID":"a5eaa2c9-4a0a-4a53-93d3-cfabd8039cd3","Type":"ContainerDied","Data":"7085c7e319b364c645eb1fb43c46fb02409150fdcacdbc911ca66e5abfbd7142"}
Mar 13 10:20:04 crc kubenswrapper[4841]: I0313 10:20:04.911879 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7085c7e319b364c645eb1fb43c46fb02409150fdcacdbc911ca66e5abfbd7142"
Mar 13 10:20:04 crc kubenswrapper[4841]: I0313 10:20:04.911593 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556620-g729x"
Mar 13 10:20:05 crc kubenswrapper[4841]: I0313 10:20:05.327582 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556614-2z7sq"]
Mar 13 10:20:05 crc kubenswrapper[4841]: I0313 10:20:05.339136 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556614-2z7sq"]
Mar 13 10:20:06 crc kubenswrapper[4841]: I0313 10:20:06.006209 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca4f7dc4-cc42-48c5-8ec3-f925c6190002" path="/var/lib/kubelet/pods/ca4f7dc4-cc42-48c5-8ec3-f925c6190002/volumes"
Mar 13 10:20:07 crc kubenswrapper[4841]: I0313 10:20:07.054728 4841 scope.go:117] "RemoveContainer" containerID="9457c7084f31cda0146952e3f847dfda1d760a343f79055b2d4e48a81e85ddd9"
Mar 13 10:20:07 crc kubenswrapper[4841]: I0313 10:20:07.079170 4841 scope.go:117] "RemoveContainer" containerID="fcb890ffbb4e0759af3fe2beca03965ea4ebbe0bed4873f68f37cbcb7df821dc"
Mar 13 10:20:34 crc kubenswrapper[4841]: I0313 10:20:34.407334 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 10:20:34 crc kubenswrapper[4841]: I0313 10:20:34.407903 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 10:21:04 crc kubenswrapper[4841]: I0313 10:21:04.408161 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 10:21:04 crc kubenswrapper[4841]: I0313 10:21:04.408820 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 10:21:34 crc kubenswrapper[4841]: I0313 10:21:34.406978 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 10:21:34 crc kubenswrapper[4841]: I0313 10:21:34.407490 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 10:21:34 crc kubenswrapper[4841]: I0313 10:21:34.407535 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h227v"
Mar 13 10:21:34 crc kubenswrapper[4841]: I0313 10:21:34.408235 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0205971121b655349d1b80fae5f891f1a0ccfe47f9f479018486b9b62522d2a3"} pod="openshift-machine-config-operator/machine-config-daemon-h227v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 10:21:34 crc kubenswrapper[4841]: I0313 10:21:34.408321 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" containerID="cri-o://0205971121b655349d1b80fae5f891f1a0ccfe47f9f479018486b9b62522d2a3" gracePeriod=600
Mar 13 10:21:34 crc kubenswrapper[4841]: I0313 10:21:34.789656 4841 generic.go:334] "Generic (PLEG): container finished" podID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerID="0205971121b655349d1b80fae5f891f1a0ccfe47f9f479018486b9b62522d2a3" exitCode=0
Mar 13 10:21:34 crc kubenswrapper[4841]: I0313 10:21:34.789750 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerDied","Data":"0205971121b655349d1b80fae5f891f1a0ccfe47f9f479018486b9b62522d2a3"}
Mar 13 10:21:34 crc kubenswrapper[4841]: I0313 10:21:34.790005 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerStarted","Data":"97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1"}
Mar 13 10:21:34 crc kubenswrapper[4841]: I0313 10:21:34.790028 4841 scope.go:117] "RemoveContainer" containerID="a4ec04dd3b675e11a379900f765ef7affbaecb8db19d0e676ed4e7b625920912"
Mar 13 10:21:51 crc kubenswrapper[4841]: I0313 10:21:51.120563 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v7ws7"]
Mar 13 10:21:51 crc kubenswrapper[4841]: E0313 10:21:51.121647 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5eaa2c9-4a0a-4a53-93d3-cfabd8039cd3" containerName="oc"
Mar 13 10:21:51 crc kubenswrapper[4841]: I0313 10:21:51.121663 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5eaa2c9-4a0a-4a53-93d3-cfabd8039cd3" containerName="oc"
Mar 13 10:21:51 crc kubenswrapper[4841]: I0313 10:21:51.121941 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5eaa2c9-4a0a-4a53-93d3-cfabd8039cd3" containerName="oc"
Mar 13 10:21:51 crc kubenswrapper[4841]: I0313 10:21:51.123624 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7ws7"
Mar 13 10:21:51 crc kubenswrapper[4841]: I0313 10:21:51.161842 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v7ws7"]
Mar 13 10:21:51 crc kubenswrapper[4841]: I0313 10:21:51.266969 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79118a9f-3583-4c4f-b677-6c1a0aa7d426-utilities\") pod \"certified-operators-v7ws7\" (UID: \"79118a9f-3583-4c4f-b677-6c1a0aa7d426\") " pod="openshift-marketplace/certified-operators-v7ws7"
Mar 13 10:21:51 crc kubenswrapper[4841]: I0313 10:21:51.267041 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79118a9f-3583-4c4f-b677-6c1a0aa7d426-catalog-content\") pod \"certified-operators-v7ws7\" (UID: \"79118a9f-3583-4c4f-b677-6c1a0aa7d426\") " pod="openshift-marketplace/certified-operators-v7ws7"
Mar 13 10:21:51 crc kubenswrapper[4841]: I0313 10:21:51.267235 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw9k8\" (UniqueName: \"kubernetes.io/projected/79118a9f-3583-4c4f-b677-6c1a0aa7d426-kube-api-access-bw9k8\") pod \"certified-operators-v7ws7\" (UID: \"79118a9f-3583-4c4f-b677-6c1a0aa7d426\") " pod="openshift-marketplace/certified-operators-v7ws7"
Mar 13 10:21:51 crc kubenswrapper[4841]: I0313 10:21:51.369142 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79118a9f-3583-4c4f-b677-6c1a0aa7d426-catalog-content\") pod \"certified-operators-v7ws7\" (UID: \"79118a9f-3583-4c4f-b677-6c1a0aa7d426\") " pod="openshift-marketplace/certified-operators-v7ws7"
Mar 13 10:21:51 crc kubenswrapper[4841]: I0313 10:21:51.369325 4841 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access-bw9k8\" (UniqueName: \"kubernetes.io/projected/79118a9f-3583-4c4f-b677-6c1a0aa7d426-kube-api-access-bw9k8\") pod \"certified-operators-v7ws7\" (UID: \"79118a9f-3583-4c4f-b677-6c1a0aa7d426\") " pod="openshift-marketplace/certified-operators-v7ws7" Mar 13 10:21:51 crc kubenswrapper[4841]: I0313 10:21:51.369355 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79118a9f-3583-4c4f-b677-6c1a0aa7d426-utilities\") pod \"certified-operators-v7ws7\" (UID: \"79118a9f-3583-4c4f-b677-6c1a0aa7d426\") " pod="openshift-marketplace/certified-operators-v7ws7" Mar 13 10:21:51 crc kubenswrapper[4841]: I0313 10:21:51.369648 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79118a9f-3583-4c4f-b677-6c1a0aa7d426-catalog-content\") pod \"certified-operators-v7ws7\" (UID: \"79118a9f-3583-4c4f-b677-6c1a0aa7d426\") " pod="openshift-marketplace/certified-operators-v7ws7" Mar 13 10:21:51 crc kubenswrapper[4841]: I0313 10:21:51.369720 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79118a9f-3583-4c4f-b677-6c1a0aa7d426-utilities\") pod \"certified-operators-v7ws7\" (UID: \"79118a9f-3583-4c4f-b677-6c1a0aa7d426\") " pod="openshift-marketplace/certified-operators-v7ws7" Mar 13 10:21:51 crc kubenswrapper[4841]: I0313 10:21:51.394629 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw9k8\" (UniqueName: \"kubernetes.io/projected/79118a9f-3583-4c4f-b677-6c1a0aa7d426-kube-api-access-bw9k8\") pod \"certified-operators-v7ws7\" (UID: \"79118a9f-3583-4c4f-b677-6c1a0aa7d426\") " pod="openshift-marketplace/certified-operators-v7ws7" Mar 13 10:21:51 crc kubenswrapper[4841]: I0313 10:21:51.464655 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v7ws7" Mar 13 10:21:51 crc kubenswrapper[4841]: I0313 10:21:51.960122 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v7ws7"] Mar 13 10:21:52 crc kubenswrapper[4841]: I0313 10:21:52.504001 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pzrth"] Mar 13 10:21:52 crc kubenswrapper[4841]: I0313 10:21:52.507116 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzrth" Mar 13 10:21:52 crc kubenswrapper[4841]: I0313 10:21:52.516049 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzrth"] Mar 13 10:21:52 crc kubenswrapper[4841]: I0313 10:21:52.602729 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b6612a7-40b6-48db-bde8-95f80c4aead5-utilities\") pod \"redhat-marketplace-pzrth\" (UID: \"2b6612a7-40b6-48db-bde8-95f80c4aead5\") " pod="openshift-marketplace/redhat-marketplace-pzrth" Mar 13 10:21:52 crc kubenswrapper[4841]: I0313 10:21:52.602790 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b6612a7-40b6-48db-bde8-95f80c4aead5-catalog-content\") pod \"redhat-marketplace-pzrth\" (UID: \"2b6612a7-40b6-48db-bde8-95f80c4aead5\") " pod="openshift-marketplace/redhat-marketplace-pzrth" Mar 13 10:21:52 crc kubenswrapper[4841]: I0313 10:21:52.602953 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wztb\" (UniqueName: \"kubernetes.io/projected/2b6612a7-40b6-48db-bde8-95f80c4aead5-kube-api-access-4wztb\") pod \"redhat-marketplace-pzrth\" (UID: \"2b6612a7-40b6-48db-bde8-95f80c4aead5\") " 
pod="openshift-marketplace/redhat-marketplace-pzrth" Mar 13 10:21:52 crc kubenswrapper[4841]: I0313 10:21:52.704323 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b6612a7-40b6-48db-bde8-95f80c4aead5-utilities\") pod \"redhat-marketplace-pzrth\" (UID: \"2b6612a7-40b6-48db-bde8-95f80c4aead5\") " pod="openshift-marketplace/redhat-marketplace-pzrth" Mar 13 10:21:52 crc kubenswrapper[4841]: I0313 10:21:52.704575 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b6612a7-40b6-48db-bde8-95f80c4aead5-catalog-content\") pod \"redhat-marketplace-pzrth\" (UID: \"2b6612a7-40b6-48db-bde8-95f80c4aead5\") " pod="openshift-marketplace/redhat-marketplace-pzrth" Mar 13 10:21:52 crc kubenswrapper[4841]: I0313 10:21:52.704683 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wztb\" (UniqueName: \"kubernetes.io/projected/2b6612a7-40b6-48db-bde8-95f80c4aead5-kube-api-access-4wztb\") pod \"redhat-marketplace-pzrth\" (UID: \"2b6612a7-40b6-48db-bde8-95f80c4aead5\") " pod="openshift-marketplace/redhat-marketplace-pzrth" Mar 13 10:21:52 crc kubenswrapper[4841]: I0313 10:21:52.704829 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b6612a7-40b6-48db-bde8-95f80c4aead5-utilities\") pod \"redhat-marketplace-pzrth\" (UID: \"2b6612a7-40b6-48db-bde8-95f80c4aead5\") " pod="openshift-marketplace/redhat-marketplace-pzrth" Mar 13 10:21:52 crc kubenswrapper[4841]: I0313 10:21:52.705251 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b6612a7-40b6-48db-bde8-95f80c4aead5-catalog-content\") pod \"redhat-marketplace-pzrth\" (UID: \"2b6612a7-40b6-48db-bde8-95f80c4aead5\") " pod="openshift-marketplace/redhat-marketplace-pzrth" 
Mar 13 10:21:52 crc kubenswrapper[4841]: I0313 10:21:52.727963 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wztb\" (UniqueName: \"kubernetes.io/projected/2b6612a7-40b6-48db-bde8-95f80c4aead5-kube-api-access-4wztb\") pod \"redhat-marketplace-pzrth\" (UID: \"2b6612a7-40b6-48db-bde8-95f80c4aead5\") " pod="openshift-marketplace/redhat-marketplace-pzrth" Mar 13 10:21:52 crc kubenswrapper[4841]: I0313 10:21:52.825812 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzrth" Mar 13 10:21:52 crc kubenswrapper[4841]: I0313 10:21:52.972546 4841 generic.go:334] "Generic (PLEG): container finished" podID="79118a9f-3583-4c4f-b677-6c1a0aa7d426" containerID="9766b088764acffac82963e398e1075507f2d509478dc4dfef093617e77a2cde" exitCode=0 Mar 13 10:21:52 crc kubenswrapper[4841]: I0313 10:21:52.973075 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7ws7" event={"ID":"79118a9f-3583-4c4f-b677-6c1a0aa7d426","Type":"ContainerDied","Data":"9766b088764acffac82963e398e1075507f2d509478dc4dfef093617e77a2cde"} Mar 13 10:21:52 crc kubenswrapper[4841]: I0313 10:21:52.973108 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7ws7" event={"ID":"79118a9f-3583-4c4f-b677-6c1a0aa7d426","Type":"ContainerStarted","Data":"28a172c8aff8a08f07b7ba972d2e48f71b2c698d42762e8c19e3b3400619a072"} Mar 13 10:21:53 crc kubenswrapper[4841]: I0313 10:21:53.325801 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzrth"] Mar 13 10:21:53 crc kubenswrapper[4841]: I0313 10:21:53.984679 4841 generic.go:334] "Generic (PLEG): container finished" podID="2b6612a7-40b6-48db-bde8-95f80c4aead5" containerID="86c27384ec9ccd9a40fdd3760600d1291db54a082d7140d11f967fb686d8a935" exitCode=0 Mar 13 10:21:53 crc kubenswrapper[4841]: I0313 10:21:53.984740 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzrth" event={"ID":"2b6612a7-40b6-48db-bde8-95f80c4aead5","Type":"ContainerDied","Data":"86c27384ec9ccd9a40fdd3760600d1291db54a082d7140d11f967fb686d8a935"} Mar 13 10:21:53 crc kubenswrapper[4841]: I0313 10:21:53.985059 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzrth" event={"ID":"2b6612a7-40b6-48db-bde8-95f80c4aead5","Type":"ContainerStarted","Data":"e12a1d531798100f342f24943192b22495aaf1b81d442bef6d22e6349f494d9e"} Mar 13 10:21:54 crc kubenswrapper[4841]: I0313 10:21:54.998397 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzrth" event={"ID":"2b6612a7-40b6-48db-bde8-95f80c4aead5","Type":"ContainerStarted","Data":"a06bc05474b3b84b195f20ac4a1fdf56006731168d5e7192ef0f2debcb3064fd"} Mar 13 10:21:55 crc kubenswrapper[4841]: I0313 10:21:55.000278 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7ws7" event={"ID":"79118a9f-3583-4c4f-b677-6c1a0aa7d426","Type":"ContainerStarted","Data":"c654eaa68711914551fc7980d2a716b7757460d8e116871cf44229652d12ddfb"} Mar 13 10:21:56 crc kubenswrapper[4841]: I0313 10:21:56.009752 4841 generic.go:334] "Generic (PLEG): container finished" podID="2b6612a7-40b6-48db-bde8-95f80c4aead5" containerID="a06bc05474b3b84b195f20ac4a1fdf56006731168d5e7192ef0f2debcb3064fd" exitCode=0 Mar 13 10:21:56 crc kubenswrapper[4841]: I0313 10:21:56.009842 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzrth" event={"ID":"2b6612a7-40b6-48db-bde8-95f80c4aead5","Type":"ContainerDied","Data":"a06bc05474b3b84b195f20ac4a1fdf56006731168d5e7192ef0f2debcb3064fd"} Mar 13 10:21:56 crc kubenswrapper[4841]: I0313 10:21:56.014162 4841 generic.go:334] "Generic (PLEG): container finished" podID="79118a9f-3583-4c4f-b677-6c1a0aa7d426" 
containerID="c654eaa68711914551fc7980d2a716b7757460d8e116871cf44229652d12ddfb" exitCode=0 Mar 13 10:21:56 crc kubenswrapper[4841]: I0313 10:21:56.014198 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7ws7" event={"ID":"79118a9f-3583-4c4f-b677-6c1a0aa7d426","Type":"ContainerDied","Data":"c654eaa68711914551fc7980d2a716b7757460d8e116871cf44229652d12ddfb"} Mar 13 10:21:57 crc kubenswrapper[4841]: I0313 10:21:57.025033 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzrth" event={"ID":"2b6612a7-40b6-48db-bde8-95f80c4aead5","Type":"ContainerStarted","Data":"aecb762ca974b48e017773993953c236ee61a9c8963e46de2336815f76dc5e8c"} Mar 13 10:21:57 crc kubenswrapper[4841]: I0313 10:21:57.027129 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7ws7" event={"ID":"79118a9f-3583-4c4f-b677-6c1a0aa7d426","Type":"ContainerStarted","Data":"db7167e36449b0b082f5e777458568810e0b7309d023c108ea1fdbdefadf4d80"} Mar 13 10:21:57 crc kubenswrapper[4841]: I0313 10:21:57.045561 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pzrth" podStartSLOduration=2.613856171 podStartE2EDuration="5.045543241s" podCreationTimestamp="2026-03-13 10:21:52 +0000 UTC" firstStartedPulling="2026-03-13 10:21:53.986326946 +0000 UTC m=+4196.716227137" lastFinishedPulling="2026-03-13 10:21:56.418014016 +0000 UTC m=+4199.147914207" observedRunningTime="2026-03-13 10:21:57.041304048 +0000 UTC m=+4199.771204249" watchObservedRunningTime="2026-03-13 10:21:57.045543241 +0000 UTC m=+4199.775443432" Mar 13 10:21:57 crc kubenswrapper[4841]: I0313 10:21:57.060493 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v7ws7" podStartSLOduration=2.617468178 podStartE2EDuration="6.060468388s" podCreationTimestamp="2026-03-13 10:21:51 +0000 
UTC" firstStartedPulling="2026-03-13 10:21:52.98024257 +0000 UTC m=+4195.710142761" lastFinishedPulling="2026-03-13 10:21:56.42324278 +0000 UTC m=+4199.153142971" observedRunningTime="2026-03-13 10:21:57.055433331 +0000 UTC m=+4199.785333522" watchObservedRunningTime="2026-03-13 10:21:57.060468388 +0000 UTC m=+4199.790368589" Mar 13 10:22:00 crc kubenswrapper[4841]: I0313 10:22:00.149501 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556622-8vj8d"] Mar 13 10:22:00 crc kubenswrapper[4841]: I0313 10:22:00.151436 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556622-8vj8d" Mar 13 10:22:00 crc kubenswrapper[4841]: I0313 10:22:00.156760 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 10:22:00 crc kubenswrapper[4841]: I0313 10:22:00.156889 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 10:22:00 crc kubenswrapper[4841]: I0313 10:22:00.156986 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 10:22:00 crc kubenswrapper[4841]: I0313 10:22:00.161797 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556622-8vj8d"] Mar 13 10:22:00 crc kubenswrapper[4841]: I0313 10:22:00.254298 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqv6d\" (UniqueName: \"kubernetes.io/projected/d0ee97d0-aee2-40fc-96a7-d022734397b2-kube-api-access-cqv6d\") pod \"auto-csr-approver-29556622-8vj8d\" (UID: \"d0ee97d0-aee2-40fc-96a7-d022734397b2\") " pod="openshift-infra/auto-csr-approver-29556622-8vj8d" Mar 13 10:22:00 crc kubenswrapper[4841]: I0313 10:22:00.356680 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cqv6d\" (UniqueName: \"kubernetes.io/projected/d0ee97d0-aee2-40fc-96a7-d022734397b2-kube-api-access-cqv6d\") pod \"auto-csr-approver-29556622-8vj8d\" (UID: \"d0ee97d0-aee2-40fc-96a7-d022734397b2\") " pod="openshift-infra/auto-csr-approver-29556622-8vj8d" Mar 13 10:22:00 crc kubenswrapper[4841]: I0313 10:22:00.378287 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqv6d\" (UniqueName: \"kubernetes.io/projected/d0ee97d0-aee2-40fc-96a7-d022734397b2-kube-api-access-cqv6d\") pod \"auto-csr-approver-29556622-8vj8d\" (UID: \"d0ee97d0-aee2-40fc-96a7-d022734397b2\") " pod="openshift-infra/auto-csr-approver-29556622-8vj8d" Mar 13 10:22:00 crc kubenswrapper[4841]: I0313 10:22:00.476319 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556622-8vj8d" Mar 13 10:22:01 crc kubenswrapper[4841]: I0313 10:22:01.327000 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556622-8vj8d"] Mar 13 10:22:01 crc kubenswrapper[4841]: I0313 10:22:01.465862 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v7ws7" Mar 13 10:22:01 crc kubenswrapper[4841]: I0313 10:22:01.465904 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v7ws7" Mar 13 10:22:01 crc kubenswrapper[4841]: I0313 10:22:01.512989 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v7ws7" Mar 13 10:22:02 crc kubenswrapper[4841]: I0313 10:22:02.362354 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556622-8vj8d" event={"ID":"d0ee97d0-aee2-40fc-96a7-d022734397b2","Type":"ContainerStarted","Data":"76a0bb443f86b078da7f53ee106b4408ae281546a991c44bdda8dfd19cd4a909"} Mar 13 10:22:02 crc kubenswrapper[4841]: I0313 
10:22:02.734702 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v7ws7" Mar 13 10:22:02 crc kubenswrapper[4841]: I0313 10:22:02.788737 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v7ws7"] Mar 13 10:22:02 crc kubenswrapper[4841]: I0313 10:22:02.826583 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pzrth" Mar 13 10:22:02 crc kubenswrapper[4841]: I0313 10:22:02.826625 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pzrth" Mar 13 10:22:02 crc kubenswrapper[4841]: I0313 10:22:02.887871 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pzrth" Mar 13 10:22:03 crc kubenswrapper[4841]: I0313 10:22:03.369722 4841 generic.go:334] "Generic (PLEG): container finished" podID="d0ee97d0-aee2-40fc-96a7-d022734397b2" containerID="864c8629ae70baf711d32146407ec5996c1ef6a331d65ed00083b44609ca35de" exitCode=0 Mar 13 10:22:03 crc kubenswrapper[4841]: I0313 10:22:03.369826 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556622-8vj8d" event={"ID":"d0ee97d0-aee2-40fc-96a7-d022734397b2","Type":"ContainerDied","Data":"864c8629ae70baf711d32146407ec5996c1ef6a331d65ed00083b44609ca35de"} Mar 13 10:22:03 crc kubenswrapper[4841]: I0313 10:22:03.424955 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pzrth" Mar 13 10:22:04 crc kubenswrapper[4841]: I0313 10:22:04.383625 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v7ws7" podUID="79118a9f-3583-4c4f-b677-6c1a0aa7d426" containerName="registry-server" 
containerID="cri-o://db7167e36449b0b082f5e777458568810e0b7309d023c108ea1fdbdefadf4d80" gracePeriod=2 Mar 13 10:22:04 crc kubenswrapper[4841]: I0313 10:22:04.497916 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzrth"] Mar 13 10:22:04 crc kubenswrapper[4841]: I0313 10:22:04.799019 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556622-8vj8d" Mar 13 10:22:04 crc kubenswrapper[4841]: I0313 10:22:04.806814 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7ws7" Mar 13 10:22:04 crc kubenswrapper[4841]: I0313 10:22:04.877507 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79118a9f-3583-4c4f-b677-6c1a0aa7d426-utilities\") pod \"79118a9f-3583-4c4f-b677-6c1a0aa7d426\" (UID: \"79118a9f-3583-4c4f-b677-6c1a0aa7d426\") " Mar 13 10:22:04 crc kubenswrapper[4841]: I0313 10:22:04.877633 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqv6d\" (UniqueName: \"kubernetes.io/projected/d0ee97d0-aee2-40fc-96a7-d022734397b2-kube-api-access-cqv6d\") pod \"d0ee97d0-aee2-40fc-96a7-d022734397b2\" (UID: \"d0ee97d0-aee2-40fc-96a7-d022734397b2\") " Mar 13 10:22:04 crc kubenswrapper[4841]: I0313 10:22:04.877707 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw9k8\" (UniqueName: \"kubernetes.io/projected/79118a9f-3583-4c4f-b677-6c1a0aa7d426-kube-api-access-bw9k8\") pod \"79118a9f-3583-4c4f-b677-6c1a0aa7d426\" (UID: \"79118a9f-3583-4c4f-b677-6c1a0aa7d426\") " Mar 13 10:22:04 crc kubenswrapper[4841]: I0313 10:22:04.877779 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/79118a9f-3583-4c4f-b677-6c1a0aa7d426-catalog-content\") pod \"79118a9f-3583-4c4f-b677-6c1a0aa7d426\" (UID: \"79118a9f-3583-4c4f-b677-6c1a0aa7d426\") " Mar 13 10:22:04 crc kubenswrapper[4841]: I0313 10:22:04.878725 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79118a9f-3583-4c4f-b677-6c1a0aa7d426-utilities" (OuterVolumeSpecName: "utilities") pod "79118a9f-3583-4c4f-b677-6c1a0aa7d426" (UID: "79118a9f-3583-4c4f-b677-6c1a0aa7d426"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:22:04 crc kubenswrapper[4841]: I0313 10:22:04.887402 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79118a9f-3583-4c4f-b677-6c1a0aa7d426-kube-api-access-bw9k8" (OuterVolumeSpecName: "kube-api-access-bw9k8") pod "79118a9f-3583-4c4f-b677-6c1a0aa7d426" (UID: "79118a9f-3583-4c4f-b677-6c1a0aa7d426"). InnerVolumeSpecName "kube-api-access-bw9k8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:22:04 crc kubenswrapper[4841]: I0313 10:22:04.887547 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0ee97d0-aee2-40fc-96a7-d022734397b2-kube-api-access-cqv6d" (OuterVolumeSpecName: "kube-api-access-cqv6d") pod "d0ee97d0-aee2-40fc-96a7-d022734397b2" (UID: "d0ee97d0-aee2-40fc-96a7-d022734397b2"). InnerVolumeSpecName "kube-api-access-cqv6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:22:04 crc kubenswrapper[4841]: I0313 10:22:04.942533 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79118a9f-3583-4c4f-b677-6c1a0aa7d426-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79118a9f-3583-4c4f-b677-6c1a0aa7d426" (UID: "79118a9f-3583-4c4f-b677-6c1a0aa7d426"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:22:04 crc kubenswrapper[4841]: I0313 10:22:04.979467 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqv6d\" (UniqueName: \"kubernetes.io/projected/d0ee97d0-aee2-40fc-96a7-d022734397b2-kube-api-access-cqv6d\") on node \"crc\" DevicePath \"\"" Mar 13 10:22:04 crc kubenswrapper[4841]: I0313 10:22:04.979510 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw9k8\" (UniqueName: \"kubernetes.io/projected/79118a9f-3583-4c4f-b677-6c1a0aa7d426-kube-api-access-bw9k8\") on node \"crc\" DevicePath \"\"" Mar 13 10:22:04 crc kubenswrapper[4841]: I0313 10:22:04.979526 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79118a9f-3583-4c4f-b677-6c1a0aa7d426-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 10:22:04 crc kubenswrapper[4841]: I0313 10:22:04.979539 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79118a9f-3583-4c4f-b677-6c1a0aa7d426-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.402867 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v7ws7" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.403179 4841 generic.go:334] "Generic (PLEG): container finished" podID="79118a9f-3583-4c4f-b677-6c1a0aa7d426" containerID="db7167e36449b0b082f5e777458568810e0b7309d023c108ea1fdbdefadf4d80" exitCode=0 Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.403018 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7ws7" event={"ID":"79118a9f-3583-4c4f-b677-6c1a0aa7d426","Type":"ContainerDied","Data":"db7167e36449b0b082f5e777458568810e0b7309d023c108ea1fdbdefadf4d80"} Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.403296 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7ws7" event={"ID":"79118a9f-3583-4c4f-b677-6c1a0aa7d426","Type":"ContainerDied","Data":"28a172c8aff8a08f07b7ba972d2e48f71b2c698d42762e8c19e3b3400619a072"} Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.403319 4841 scope.go:117] "RemoveContainer" containerID="db7167e36449b0b082f5e777458568810e0b7309d023c108ea1fdbdefadf4d80" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.431160 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556622-8vj8d" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.431227 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556622-8vj8d" event={"ID":"d0ee97d0-aee2-40fc-96a7-d022734397b2","Type":"ContainerDied","Data":"76a0bb443f86b078da7f53ee106b4408ae281546a991c44bdda8dfd19cd4a909"} Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.431255 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76a0bb443f86b078da7f53ee106b4408ae281546a991c44bdda8dfd19cd4a909" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.431332 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pzrth" podUID="2b6612a7-40b6-48db-bde8-95f80c4aead5" containerName="registry-server" containerID="cri-o://aecb762ca974b48e017773993953c236ee61a9c8963e46de2336815f76dc5e8c" gracePeriod=2 Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.457487 4841 scope.go:117] "RemoveContainer" containerID="c654eaa68711914551fc7980d2a716b7757460d8e116871cf44229652d12ddfb" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.480552 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v7ws7"] Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.496437 4841 scope.go:117] "RemoveContainer" containerID="9766b088764acffac82963e398e1075507f2d509478dc4dfef093617e77a2cde" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.506523 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v7ws7"] Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.521360 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9mxpq/must-gather-s59mc"] Mar 13 10:22:05 crc kubenswrapper[4841]: E0313 10:22:05.521848 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d0ee97d0-aee2-40fc-96a7-d022734397b2" containerName="oc" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.521869 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0ee97d0-aee2-40fc-96a7-d022734397b2" containerName="oc" Mar 13 10:22:05 crc kubenswrapper[4841]: E0313 10:22:05.521891 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79118a9f-3583-4c4f-b677-6c1a0aa7d426" containerName="extract-content" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.521899 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="79118a9f-3583-4c4f-b677-6c1a0aa7d426" containerName="extract-content" Mar 13 10:22:05 crc kubenswrapper[4841]: E0313 10:22:05.521953 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79118a9f-3583-4c4f-b677-6c1a0aa7d426" containerName="registry-server" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.521963 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="79118a9f-3583-4c4f-b677-6c1a0aa7d426" containerName="registry-server" Mar 13 10:22:05 crc kubenswrapper[4841]: E0313 10:22:05.521977 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79118a9f-3583-4c4f-b677-6c1a0aa7d426" containerName="extract-utilities" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.521986 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="79118a9f-3583-4c4f-b677-6c1a0aa7d426" containerName="extract-utilities" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.522246 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="79118a9f-3583-4c4f-b677-6c1a0aa7d426" containerName="registry-server" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.522354 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0ee97d0-aee2-40fc-96a7-d022734397b2" containerName="oc" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.523756 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9mxpq/must-gather-s59mc" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.545736 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9mxpq"/"kube-root-ca.crt" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.545801 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9mxpq/must-gather-s59mc"] Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.545976 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9mxpq"/"openshift-service-ca.crt" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.590504 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psxq9\" (UniqueName: \"kubernetes.io/projected/2540fffd-8f6f-459e-a116-ce2e2c095448-kube-api-access-psxq9\") pod \"must-gather-s59mc\" (UID: \"2540fffd-8f6f-459e-a116-ce2e2c095448\") " pod="openshift-must-gather-9mxpq/must-gather-s59mc" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.590606 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2540fffd-8f6f-459e-a116-ce2e2c095448-must-gather-output\") pod \"must-gather-s59mc\" (UID: \"2540fffd-8f6f-459e-a116-ce2e2c095448\") " pod="openshift-must-gather-9mxpq/must-gather-s59mc" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.693508 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psxq9\" (UniqueName: \"kubernetes.io/projected/2540fffd-8f6f-459e-a116-ce2e2c095448-kube-api-access-psxq9\") pod \"must-gather-s59mc\" (UID: \"2540fffd-8f6f-459e-a116-ce2e2c095448\") " pod="openshift-must-gather-9mxpq/must-gather-s59mc" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.693623 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2540fffd-8f6f-459e-a116-ce2e2c095448-must-gather-output\") pod \"must-gather-s59mc\" (UID: \"2540fffd-8f6f-459e-a116-ce2e2c095448\") " pod="openshift-must-gather-9mxpq/must-gather-s59mc" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.694250 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2540fffd-8f6f-459e-a116-ce2e2c095448-must-gather-output\") pod \"must-gather-s59mc\" (UID: \"2540fffd-8f6f-459e-a116-ce2e2c095448\") " pod="openshift-must-gather-9mxpq/must-gather-s59mc" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.711417 4841 scope.go:117] "RemoveContainer" containerID="db7167e36449b0b082f5e777458568810e0b7309d023c108ea1fdbdefadf4d80" Mar 13 10:22:05 crc kubenswrapper[4841]: E0313 10:22:05.712095 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db7167e36449b0b082f5e777458568810e0b7309d023c108ea1fdbdefadf4d80\": container with ID starting with db7167e36449b0b082f5e777458568810e0b7309d023c108ea1fdbdefadf4d80 not found: ID does not exist" containerID="db7167e36449b0b082f5e777458568810e0b7309d023c108ea1fdbdefadf4d80" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.712130 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db7167e36449b0b082f5e777458568810e0b7309d023c108ea1fdbdefadf4d80"} err="failed to get container status \"db7167e36449b0b082f5e777458568810e0b7309d023c108ea1fdbdefadf4d80\": rpc error: code = NotFound desc = could not find container \"db7167e36449b0b082f5e777458568810e0b7309d023c108ea1fdbdefadf4d80\": container with ID starting with db7167e36449b0b082f5e777458568810e0b7309d023c108ea1fdbdefadf4d80 not found: ID does not exist" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.712155 4841 scope.go:117] "RemoveContainer" 
containerID="c654eaa68711914551fc7980d2a716b7757460d8e116871cf44229652d12ddfb" Mar 13 10:22:05 crc kubenswrapper[4841]: E0313 10:22:05.715306 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c654eaa68711914551fc7980d2a716b7757460d8e116871cf44229652d12ddfb\": container with ID starting with c654eaa68711914551fc7980d2a716b7757460d8e116871cf44229652d12ddfb not found: ID does not exist" containerID="c654eaa68711914551fc7980d2a716b7757460d8e116871cf44229652d12ddfb" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.715349 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c654eaa68711914551fc7980d2a716b7757460d8e116871cf44229652d12ddfb"} err="failed to get container status \"c654eaa68711914551fc7980d2a716b7757460d8e116871cf44229652d12ddfb\": rpc error: code = NotFound desc = could not find container \"c654eaa68711914551fc7980d2a716b7757460d8e116871cf44229652d12ddfb\": container with ID starting with c654eaa68711914551fc7980d2a716b7757460d8e116871cf44229652d12ddfb not found: ID does not exist" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.715380 4841 scope.go:117] "RemoveContainer" containerID="9766b088764acffac82963e398e1075507f2d509478dc4dfef093617e77a2cde" Mar 13 10:22:05 crc kubenswrapper[4841]: E0313 10:22:05.716403 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9766b088764acffac82963e398e1075507f2d509478dc4dfef093617e77a2cde\": container with ID starting with 9766b088764acffac82963e398e1075507f2d509478dc4dfef093617e77a2cde not found: ID does not exist" containerID="9766b088764acffac82963e398e1075507f2d509478dc4dfef093617e77a2cde" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.716431 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9766b088764acffac82963e398e1075507f2d509478dc4dfef093617e77a2cde"} err="failed to get container status \"9766b088764acffac82963e398e1075507f2d509478dc4dfef093617e77a2cde\": rpc error: code = NotFound desc = could not find container \"9766b088764acffac82963e398e1075507f2d509478dc4dfef093617e77a2cde\": container with ID starting with 9766b088764acffac82963e398e1075507f2d509478dc4dfef093617e77a2cde not found: ID does not exist" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.720933 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psxq9\" (UniqueName: \"kubernetes.io/projected/2540fffd-8f6f-459e-a116-ce2e2c095448-kube-api-access-psxq9\") pod \"must-gather-s59mc\" (UID: \"2540fffd-8f6f-459e-a116-ce2e2c095448\") " pod="openshift-must-gather-9mxpq/must-gather-s59mc" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.785944 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9mxpq/must-gather-s59mc" Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.891501 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556616-g99gc"] Mar 13 10:22:05 crc kubenswrapper[4841]: I0313 10:22:05.946468 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556616-g99gc"] Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.035195 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79118a9f-3583-4c4f-b677-6c1a0aa7d426" path="/var/lib/kubelet/pods/79118a9f-3583-4c4f-b677-6c1a0aa7d426/volumes" Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.036192 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ecea745-1b2d-4d7e-a791-5741c1757e51" path="/var/lib/kubelet/pods/7ecea745-1b2d-4d7e-a791-5741c1757e51/volumes" Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.286774 4841 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-must-gather-9mxpq/must-gather-s59mc"] Mar 13 10:22:06 crc kubenswrapper[4841]: W0313 10:22:06.287832 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2540fffd_8f6f_459e_a116_ce2e2c095448.slice/crio-67c055bbde63d865098592a5c61e36e83a5abdd5bbb0c562add5375dd2f6b687 WatchSource:0}: Error finding container 67c055bbde63d865098592a5c61e36e83a5abdd5bbb0c562add5375dd2f6b687: Status 404 returned error can't find the container with id 67c055bbde63d865098592a5c61e36e83a5abdd5bbb0c562add5375dd2f6b687 Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.337641 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzrth" Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.432762 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b6612a7-40b6-48db-bde8-95f80c4aead5-utilities\") pod \"2b6612a7-40b6-48db-bde8-95f80c4aead5\" (UID: \"2b6612a7-40b6-48db-bde8-95f80c4aead5\") " Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.433043 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b6612a7-40b6-48db-bde8-95f80c4aead5-catalog-content\") pod \"2b6612a7-40b6-48db-bde8-95f80c4aead5\" (UID: \"2b6612a7-40b6-48db-bde8-95f80c4aead5\") " Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.433081 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wztb\" (UniqueName: \"kubernetes.io/projected/2b6612a7-40b6-48db-bde8-95f80c4aead5-kube-api-access-4wztb\") pod \"2b6612a7-40b6-48db-bde8-95f80c4aead5\" (UID: \"2b6612a7-40b6-48db-bde8-95f80c4aead5\") " Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.434845 4841 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/2b6612a7-40b6-48db-bde8-95f80c4aead5-utilities" (OuterVolumeSpecName: "utilities") pod "2b6612a7-40b6-48db-bde8-95f80c4aead5" (UID: "2b6612a7-40b6-48db-bde8-95f80c4aead5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.445521 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b6612a7-40b6-48db-bde8-95f80c4aead5-kube-api-access-4wztb" (OuterVolumeSpecName: "kube-api-access-4wztb") pod "2b6612a7-40b6-48db-bde8-95f80c4aead5" (UID: "2b6612a7-40b6-48db-bde8-95f80c4aead5"). InnerVolumeSpecName "kube-api-access-4wztb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.460657 4841 generic.go:334] "Generic (PLEG): container finished" podID="2b6612a7-40b6-48db-bde8-95f80c4aead5" containerID="aecb762ca974b48e017773993953c236ee61a9c8963e46de2336815f76dc5e8c" exitCode=0 Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.460709 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzrth" Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.460750 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzrth" event={"ID":"2b6612a7-40b6-48db-bde8-95f80c4aead5","Type":"ContainerDied","Data":"aecb762ca974b48e017773993953c236ee61a9c8963e46de2336815f76dc5e8c"} Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.460784 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzrth" event={"ID":"2b6612a7-40b6-48db-bde8-95f80c4aead5","Type":"ContainerDied","Data":"e12a1d531798100f342f24943192b22495aaf1b81d442bef6d22e6349f494d9e"} Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.460808 4841 scope.go:117] "RemoveContainer" containerID="aecb762ca974b48e017773993953c236ee61a9c8963e46de2336815f76dc5e8c" Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.468814 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9mxpq/must-gather-s59mc" event={"ID":"2540fffd-8f6f-459e-a116-ce2e2c095448","Type":"ContainerStarted","Data":"67c055bbde63d865098592a5c61e36e83a5abdd5bbb0c562add5375dd2f6b687"} Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.471841 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b6612a7-40b6-48db-bde8-95f80c4aead5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b6612a7-40b6-48db-bde8-95f80c4aead5" (UID: "2b6612a7-40b6-48db-bde8-95f80c4aead5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.502096 4841 scope.go:117] "RemoveContainer" containerID="a06bc05474b3b84b195f20ac4a1fdf56006731168d5e7192ef0f2debcb3064fd" Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.535788 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b6612a7-40b6-48db-bde8-95f80c4aead5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.535818 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wztb\" (UniqueName: \"kubernetes.io/projected/2b6612a7-40b6-48db-bde8-95f80c4aead5-kube-api-access-4wztb\") on node \"crc\" DevicePath \"\"" Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.535828 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b6612a7-40b6-48db-bde8-95f80c4aead5-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.578584 4841 scope.go:117] "RemoveContainer" containerID="86c27384ec9ccd9a40fdd3760600d1291db54a082d7140d11f967fb686d8a935" Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.691793 4841 scope.go:117] "RemoveContainer" containerID="aecb762ca974b48e017773993953c236ee61a9c8963e46de2336815f76dc5e8c" Mar 13 10:22:06 crc kubenswrapper[4841]: E0313 10:22:06.692278 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aecb762ca974b48e017773993953c236ee61a9c8963e46de2336815f76dc5e8c\": container with ID starting with aecb762ca974b48e017773993953c236ee61a9c8963e46de2336815f76dc5e8c not found: ID does not exist" containerID="aecb762ca974b48e017773993953c236ee61a9c8963e46de2336815f76dc5e8c" Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.692333 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aecb762ca974b48e017773993953c236ee61a9c8963e46de2336815f76dc5e8c"} err="failed to get container status \"aecb762ca974b48e017773993953c236ee61a9c8963e46de2336815f76dc5e8c\": rpc error: code = NotFound desc = could not find container \"aecb762ca974b48e017773993953c236ee61a9c8963e46de2336815f76dc5e8c\": container with ID starting with aecb762ca974b48e017773993953c236ee61a9c8963e46de2336815f76dc5e8c not found: ID does not exist" Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.692373 4841 scope.go:117] "RemoveContainer" containerID="a06bc05474b3b84b195f20ac4a1fdf56006731168d5e7192ef0f2debcb3064fd" Mar 13 10:22:06 crc kubenswrapper[4841]: E0313 10:22:06.692682 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a06bc05474b3b84b195f20ac4a1fdf56006731168d5e7192ef0f2debcb3064fd\": container with ID starting with a06bc05474b3b84b195f20ac4a1fdf56006731168d5e7192ef0f2debcb3064fd not found: ID does not exist" containerID="a06bc05474b3b84b195f20ac4a1fdf56006731168d5e7192ef0f2debcb3064fd" Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.692715 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a06bc05474b3b84b195f20ac4a1fdf56006731168d5e7192ef0f2debcb3064fd"} err="failed to get container status \"a06bc05474b3b84b195f20ac4a1fdf56006731168d5e7192ef0f2debcb3064fd\": rpc error: code = NotFound desc = could not find container \"a06bc05474b3b84b195f20ac4a1fdf56006731168d5e7192ef0f2debcb3064fd\": container with ID starting with a06bc05474b3b84b195f20ac4a1fdf56006731168d5e7192ef0f2debcb3064fd not found: ID does not exist" Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.692736 4841 scope.go:117] "RemoveContainer" containerID="86c27384ec9ccd9a40fdd3760600d1291db54a082d7140d11f967fb686d8a935" Mar 13 10:22:06 crc kubenswrapper[4841]: E0313 10:22:06.693002 4841 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"86c27384ec9ccd9a40fdd3760600d1291db54a082d7140d11f967fb686d8a935\": container with ID starting with 86c27384ec9ccd9a40fdd3760600d1291db54a082d7140d11f967fb686d8a935 not found: ID does not exist" containerID="86c27384ec9ccd9a40fdd3760600d1291db54a082d7140d11f967fb686d8a935" Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.693035 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c27384ec9ccd9a40fdd3760600d1291db54a082d7140d11f967fb686d8a935"} err="failed to get container status \"86c27384ec9ccd9a40fdd3760600d1291db54a082d7140d11f967fb686d8a935\": rpc error: code = NotFound desc = could not find container \"86c27384ec9ccd9a40fdd3760600d1291db54a082d7140d11f967fb686d8a935\": container with ID starting with 86c27384ec9ccd9a40fdd3760600d1291db54a082d7140d11f967fb686d8a935 not found: ID does not exist" Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.797254 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzrth"] Mar 13 10:22:06 crc kubenswrapper[4841]: I0313 10:22:06.807361 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzrth"] Mar 13 10:22:07 crc kubenswrapper[4841]: I0313 10:22:07.198324 4841 scope.go:117] "RemoveContainer" containerID="cfcd4490f65a3b6445ca3f33b47c66ce5e1da712532cd323e6ffbe128043f05f" Mar 13 10:22:07 crc kubenswrapper[4841]: I0313 10:22:07.478758 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9mxpq/must-gather-s59mc" event={"ID":"2540fffd-8f6f-459e-a116-ce2e2c095448","Type":"ContainerStarted","Data":"5f0def84d86f11057b964adcd9ae4195af218b44c28ab2a99fba345cac48660d"} Mar 13 10:22:07 crc kubenswrapper[4841]: I0313 10:22:07.478807 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9mxpq/must-gather-s59mc" 
event={"ID":"2540fffd-8f6f-459e-a116-ce2e2c095448","Type":"ContainerStarted","Data":"8e5dc5a9153b3abefa2b8e46fba2306dc3c0f0825dc8ba90730a784b3ff8c265"} Mar 13 10:22:07 crc kubenswrapper[4841]: I0313 10:22:07.503057 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9mxpq/must-gather-s59mc" podStartSLOduration=2.503036635 podStartE2EDuration="2.503036635s" podCreationTimestamp="2026-03-13 10:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:22:07.494821797 +0000 UTC m=+4210.224722018" watchObservedRunningTime="2026-03-13 10:22:07.503036635 +0000 UTC m=+4210.232936816" Mar 13 10:22:08 crc kubenswrapper[4841]: I0313 10:22:08.004510 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b6612a7-40b6-48db-bde8-95f80c4aead5" path="/var/lib/kubelet/pods/2b6612a7-40b6-48db-bde8-95f80c4aead5/volumes" Mar 13 10:22:10 crc kubenswrapper[4841]: I0313 10:22:10.480545 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9mxpq/crc-debug-zpxxm"] Mar 13 10:22:10 crc kubenswrapper[4841]: E0313 10:22:10.481425 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6612a7-40b6-48db-bde8-95f80c4aead5" containerName="registry-server" Mar 13 10:22:10 crc kubenswrapper[4841]: I0313 10:22:10.481440 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6612a7-40b6-48db-bde8-95f80c4aead5" containerName="registry-server" Mar 13 10:22:10 crc kubenswrapper[4841]: E0313 10:22:10.481468 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6612a7-40b6-48db-bde8-95f80c4aead5" containerName="extract-utilities" Mar 13 10:22:10 crc kubenswrapper[4841]: I0313 10:22:10.481474 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6612a7-40b6-48db-bde8-95f80c4aead5" containerName="extract-utilities" Mar 13 10:22:10 crc kubenswrapper[4841]: E0313 
10:22:10.481495 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6612a7-40b6-48db-bde8-95f80c4aead5" containerName="extract-content" Mar 13 10:22:10 crc kubenswrapper[4841]: I0313 10:22:10.481501 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6612a7-40b6-48db-bde8-95f80c4aead5" containerName="extract-content" Mar 13 10:22:10 crc kubenswrapper[4841]: I0313 10:22:10.481672 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b6612a7-40b6-48db-bde8-95f80c4aead5" containerName="registry-server" Mar 13 10:22:10 crc kubenswrapper[4841]: I0313 10:22:10.482324 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9mxpq/crc-debug-zpxxm" Mar 13 10:22:10 crc kubenswrapper[4841]: I0313 10:22:10.484568 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9mxpq"/"default-dockercfg-5kk6z" Mar 13 10:22:10 crc kubenswrapper[4841]: I0313 10:22:10.631963 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8693b0a4-1501-4ba6-a3e9-c8c118be1eb1-host\") pod \"crc-debug-zpxxm\" (UID: \"8693b0a4-1501-4ba6-a3e9-c8c118be1eb1\") " pod="openshift-must-gather-9mxpq/crc-debug-zpxxm" Mar 13 10:22:10 crc kubenswrapper[4841]: I0313 10:22:10.632403 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wsrl\" (UniqueName: \"kubernetes.io/projected/8693b0a4-1501-4ba6-a3e9-c8c118be1eb1-kube-api-access-8wsrl\") pod \"crc-debug-zpxxm\" (UID: \"8693b0a4-1501-4ba6-a3e9-c8c118be1eb1\") " pod="openshift-must-gather-9mxpq/crc-debug-zpxxm" Mar 13 10:22:10 crc kubenswrapper[4841]: I0313 10:22:10.734028 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wsrl\" (UniqueName: \"kubernetes.io/projected/8693b0a4-1501-4ba6-a3e9-c8c118be1eb1-kube-api-access-8wsrl\") pod 
\"crc-debug-zpxxm\" (UID: \"8693b0a4-1501-4ba6-a3e9-c8c118be1eb1\") " pod="openshift-must-gather-9mxpq/crc-debug-zpxxm" Mar 13 10:22:10 crc kubenswrapper[4841]: I0313 10:22:10.734182 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8693b0a4-1501-4ba6-a3e9-c8c118be1eb1-host\") pod \"crc-debug-zpxxm\" (UID: \"8693b0a4-1501-4ba6-a3e9-c8c118be1eb1\") " pod="openshift-must-gather-9mxpq/crc-debug-zpxxm" Mar 13 10:22:10 crc kubenswrapper[4841]: I0313 10:22:10.734368 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8693b0a4-1501-4ba6-a3e9-c8c118be1eb1-host\") pod \"crc-debug-zpxxm\" (UID: \"8693b0a4-1501-4ba6-a3e9-c8c118be1eb1\") " pod="openshift-must-gather-9mxpq/crc-debug-zpxxm" Mar 13 10:22:10 crc kubenswrapper[4841]: I0313 10:22:10.994177 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wsrl\" (UniqueName: \"kubernetes.io/projected/8693b0a4-1501-4ba6-a3e9-c8c118be1eb1-kube-api-access-8wsrl\") pod \"crc-debug-zpxxm\" (UID: \"8693b0a4-1501-4ba6-a3e9-c8c118be1eb1\") " pod="openshift-must-gather-9mxpq/crc-debug-zpxxm" Mar 13 10:22:11 crc kubenswrapper[4841]: I0313 10:22:11.102582 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9mxpq/crc-debug-zpxxm" Mar 13 10:22:11 crc kubenswrapper[4841]: W0313 10:22:11.136857 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8693b0a4_1501_4ba6_a3e9_c8c118be1eb1.slice/crio-af7a84e74d5fb0cd07109e79b147c6e4d7243873ef5bab811717a0045f9f56e7 WatchSource:0}: Error finding container af7a84e74d5fb0cd07109e79b147c6e4d7243873ef5bab811717a0045f9f56e7: Status 404 returned error can't find the container with id af7a84e74d5fb0cd07109e79b147c6e4d7243873ef5bab811717a0045f9f56e7 Mar 13 10:22:11 crc kubenswrapper[4841]: I0313 10:22:11.514622 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9mxpq/crc-debug-zpxxm" event={"ID":"8693b0a4-1501-4ba6-a3e9-c8c118be1eb1","Type":"ContainerStarted","Data":"11f3aff671610bb424282d1b8a02085796b950d11c01749889bce84622324af9"} Mar 13 10:22:11 crc kubenswrapper[4841]: I0313 10:22:11.515241 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9mxpq/crc-debug-zpxxm" event={"ID":"8693b0a4-1501-4ba6-a3e9-c8c118be1eb1","Type":"ContainerStarted","Data":"af7a84e74d5fb0cd07109e79b147c6e4d7243873ef5bab811717a0045f9f56e7"} Mar 13 10:22:11 crc kubenswrapper[4841]: I0313 10:22:11.538831 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9mxpq/crc-debug-zpxxm" podStartSLOduration=1.538796863 podStartE2EDuration="1.538796863s" podCreationTimestamp="2026-03-13 10:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:22:11.526860209 +0000 UTC m=+4214.256760400" watchObservedRunningTime="2026-03-13 10:22:11.538796863 +0000 UTC m=+4214.268697044" Mar 13 10:22:23 crc kubenswrapper[4841]: I0313 10:22:23.645863 4841 generic.go:334] "Generic (PLEG): container finished" podID="8693b0a4-1501-4ba6-a3e9-c8c118be1eb1" 
containerID="11f3aff671610bb424282d1b8a02085796b950d11c01749889bce84622324af9" exitCode=0 Mar 13 10:22:23 crc kubenswrapper[4841]: I0313 10:22:23.645949 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9mxpq/crc-debug-zpxxm" event={"ID":"8693b0a4-1501-4ba6-a3e9-c8c118be1eb1","Type":"ContainerDied","Data":"11f3aff671610bb424282d1b8a02085796b950d11c01749889bce84622324af9"} Mar 13 10:22:24 crc kubenswrapper[4841]: I0313 10:22:24.814889 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9mxpq/crc-debug-zpxxm" Mar 13 10:22:24 crc kubenswrapper[4841]: I0313 10:22:24.855932 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9mxpq/crc-debug-zpxxm"] Mar 13 10:22:24 crc kubenswrapper[4841]: I0313 10:22:24.864797 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9mxpq/crc-debug-zpxxm"] Mar 13 10:22:24 crc kubenswrapper[4841]: I0313 10:22:24.929124 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8693b0a4-1501-4ba6-a3e9-c8c118be1eb1-host\") pod \"8693b0a4-1501-4ba6-a3e9-c8c118be1eb1\" (UID: \"8693b0a4-1501-4ba6-a3e9-c8c118be1eb1\") " Mar 13 10:22:24 crc kubenswrapper[4841]: I0313 10:22:24.929228 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wsrl\" (UniqueName: \"kubernetes.io/projected/8693b0a4-1501-4ba6-a3e9-c8c118be1eb1-kube-api-access-8wsrl\") pod \"8693b0a4-1501-4ba6-a3e9-c8c118be1eb1\" (UID: \"8693b0a4-1501-4ba6-a3e9-c8c118be1eb1\") " Mar 13 10:22:24 crc kubenswrapper[4841]: I0313 10:22:24.929249 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8693b0a4-1501-4ba6-a3e9-c8c118be1eb1-host" (OuterVolumeSpecName: "host") pod "8693b0a4-1501-4ba6-a3e9-c8c118be1eb1" (UID: "8693b0a4-1501-4ba6-a3e9-c8c118be1eb1"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:22:24 crc kubenswrapper[4841]: I0313 10:22:24.929949 4841 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8693b0a4-1501-4ba6-a3e9-c8c118be1eb1-host\") on node \"crc\" DevicePath \"\"" Mar 13 10:22:24 crc kubenswrapper[4841]: I0313 10:22:24.950589 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8693b0a4-1501-4ba6-a3e9-c8c118be1eb1-kube-api-access-8wsrl" (OuterVolumeSpecName: "kube-api-access-8wsrl") pod "8693b0a4-1501-4ba6-a3e9-c8c118be1eb1" (UID: "8693b0a4-1501-4ba6-a3e9-c8c118be1eb1"). InnerVolumeSpecName "kube-api-access-8wsrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:22:25 crc kubenswrapper[4841]: I0313 10:22:25.031902 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wsrl\" (UniqueName: \"kubernetes.io/projected/8693b0a4-1501-4ba6-a3e9-c8c118be1eb1-kube-api-access-8wsrl\") on node \"crc\" DevicePath \"\"" Mar 13 10:22:25 crc kubenswrapper[4841]: I0313 10:22:25.706765 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af7a84e74d5fb0cd07109e79b147c6e4d7243873ef5bab811717a0045f9f56e7" Mar 13 10:22:25 crc kubenswrapper[4841]: I0313 10:22:25.706834 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9mxpq/crc-debug-zpxxm" Mar 13 10:22:26 crc kubenswrapper[4841]: I0313 10:22:26.008683 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8693b0a4-1501-4ba6-a3e9-c8c118be1eb1" path="/var/lib/kubelet/pods/8693b0a4-1501-4ba6-a3e9-c8c118be1eb1/volumes" Mar 13 10:22:26 crc kubenswrapper[4841]: I0313 10:22:26.049984 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9mxpq/crc-debug-6nzrx"] Mar 13 10:22:26 crc kubenswrapper[4841]: E0313 10:22:26.050417 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8693b0a4-1501-4ba6-a3e9-c8c118be1eb1" containerName="container-00" Mar 13 10:22:26 crc kubenswrapper[4841]: I0313 10:22:26.050434 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8693b0a4-1501-4ba6-a3e9-c8c118be1eb1" containerName="container-00" Mar 13 10:22:26 crc kubenswrapper[4841]: I0313 10:22:26.050620 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8693b0a4-1501-4ba6-a3e9-c8c118be1eb1" containerName="container-00" Mar 13 10:22:26 crc kubenswrapper[4841]: I0313 10:22:26.051375 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9mxpq/crc-debug-6nzrx" Mar 13 10:22:26 crc kubenswrapper[4841]: I0313 10:22:26.053059 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9mxpq"/"default-dockercfg-5kk6z" Mar 13 10:22:26 crc kubenswrapper[4841]: I0313 10:22:26.154252 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dh64\" (UniqueName: \"kubernetes.io/projected/34a21c54-2659-4c9a-93bb-ee9c129677ed-kube-api-access-4dh64\") pod \"crc-debug-6nzrx\" (UID: \"34a21c54-2659-4c9a-93bb-ee9c129677ed\") " pod="openshift-must-gather-9mxpq/crc-debug-6nzrx" Mar 13 10:22:26 crc kubenswrapper[4841]: I0313 10:22:26.154380 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34a21c54-2659-4c9a-93bb-ee9c129677ed-host\") pod \"crc-debug-6nzrx\" (UID: \"34a21c54-2659-4c9a-93bb-ee9c129677ed\") " pod="openshift-must-gather-9mxpq/crc-debug-6nzrx" Mar 13 10:22:26 crc kubenswrapper[4841]: I0313 10:22:26.256693 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dh64\" (UniqueName: \"kubernetes.io/projected/34a21c54-2659-4c9a-93bb-ee9c129677ed-kube-api-access-4dh64\") pod \"crc-debug-6nzrx\" (UID: \"34a21c54-2659-4c9a-93bb-ee9c129677ed\") " pod="openshift-must-gather-9mxpq/crc-debug-6nzrx" Mar 13 10:22:26 crc kubenswrapper[4841]: I0313 10:22:26.256781 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34a21c54-2659-4c9a-93bb-ee9c129677ed-host\") pod \"crc-debug-6nzrx\" (UID: \"34a21c54-2659-4c9a-93bb-ee9c129677ed\") " pod="openshift-must-gather-9mxpq/crc-debug-6nzrx" Mar 13 10:22:26 crc kubenswrapper[4841]: I0313 10:22:26.256886 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/34a21c54-2659-4c9a-93bb-ee9c129677ed-host\") pod \"crc-debug-6nzrx\" (UID: \"34a21c54-2659-4c9a-93bb-ee9c129677ed\") " pod="openshift-must-gather-9mxpq/crc-debug-6nzrx" Mar 13 10:22:26 crc kubenswrapper[4841]: I0313 10:22:26.283724 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dh64\" (UniqueName: \"kubernetes.io/projected/34a21c54-2659-4c9a-93bb-ee9c129677ed-kube-api-access-4dh64\") pod \"crc-debug-6nzrx\" (UID: \"34a21c54-2659-4c9a-93bb-ee9c129677ed\") " pod="openshift-must-gather-9mxpq/crc-debug-6nzrx" Mar 13 10:22:26 crc kubenswrapper[4841]: I0313 10:22:26.366915 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9mxpq/crc-debug-6nzrx" Mar 13 10:22:26 crc kubenswrapper[4841]: W0313 10:22:26.415529 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34a21c54_2659_4c9a_93bb_ee9c129677ed.slice/crio-5933149c2c7ada5f2c0ea9371fa65762806df6cf1c54d6e575187ff602678a5b WatchSource:0}: Error finding container 5933149c2c7ada5f2c0ea9371fa65762806df6cf1c54d6e575187ff602678a5b: Status 404 returned error can't find the container with id 5933149c2c7ada5f2c0ea9371fa65762806df6cf1c54d6e575187ff602678a5b Mar 13 10:22:26 crc kubenswrapper[4841]: I0313 10:22:26.719679 4841 generic.go:334] "Generic (PLEG): container finished" podID="34a21c54-2659-4c9a-93bb-ee9c129677ed" containerID="7d4f1111c20666d9ac207f5c6221147d8b2363234fadfdac3eaa5dd95460f2a4" exitCode=1 Mar 13 10:22:26 crc kubenswrapper[4841]: I0313 10:22:26.719770 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9mxpq/crc-debug-6nzrx" event={"ID":"34a21c54-2659-4c9a-93bb-ee9c129677ed","Type":"ContainerDied","Data":"7d4f1111c20666d9ac207f5c6221147d8b2363234fadfdac3eaa5dd95460f2a4"} Mar 13 10:22:26 crc kubenswrapper[4841]: I0313 10:22:26.720100 4841 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-must-gather-9mxpq/crc-debug-6nzrx" event={"ID":"34a21c54-2659-4c9a-93bb-ee9c129677ed","Type":"ContainerStarted","Data":"5933149c2c7ada5f2c0ea9371fa65762806df6cf1c54d6e575187ff602678a5b"} Mar 13 10:22:26 crc kubenswrapper[4841]: I0313 10:22:26.765422 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9mxpq/crc-debug-6nzrx"] Mar 13 10:22:26 crc kubenswrapper[4841]: I0313 10:22:26.775046 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9mxpq/crc-debug-6nzrx"] Mar 13 10:22:27 crc kubenswrapper[4841]: I0313 10:22:27.824212 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9mxpq/crc-debug-6nzrx" Mar 13 10:22:28 crc kubenswrapper[4841]: I0313 10:22:28.002657 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dh64\" (UniqueName: \"kubernetes.io/projected/34a21c54-2659-4c9a-93bb-ee9c129677ed-kube-api-access-4dh64\") pod \"34a21c54-2659-4c9a-93bb-ee9c129677ed\" (UID: \"34a21c54-2659-4c9a-93bb-ee9c129677ed\") " Mar 13 10:22:28 crc kubenswrapper[4841]: I0313 10:22:28.003044 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34a21c54-2659-4c9a-93bb-ee9c129677ed-host\") pod \"34a21c54-2659-4c9a-93bb-ee9c129677ed\" (UID: \"34a21c54-2659-4c9a-93bb-ee9c129677ed\") " Mar 13 10:22:28 crc kubenswrapper[4841]: I0313 10:22:28.003278 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34a21c54-2659-4c9a-93bb-ee9c129677ed-host" (OuterVolumeSpecName: "host") pod "34a21c54-2659-4c9a-93bb-ee9c129677ed" (UID: "34a21c54-2659-4c9a-93bb-ee9c129677ed"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:22:28 crc kubenswrapper[4841]: I0313 10:22:28.004591 4841 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34a21c54-2659-4c9a-93bb-ee9c129677ed-host\") on node \"crc\" DevicePath \"\"" Mar 13 10:22:28 crc kubenswrapper[4841]: I0313 10:22:28.011462 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34a21c54-2659-4c9a-93bb-ee9c129677ed-kube-api-access-4dh64" (OuterVolumeSpecName: "kube-api-access-4dh64") pod "34a21c54-2659-4c9a-93bb-ee9c129677ed" (UID: "34a21c54-2659-4c9a-93bb-ee9c129677ed"). InnerVolumeSpecName "kube-api-access-4dh64". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:22:28 crc kubenswrapper[4841]: I0313 10:22:28.105677 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dh64\" (UniqueName: \"kubernetes.io/projected/34a21c54-2659-4c9a-93bb-ee9c129677ed-kube-api-access-4dh64\") on node \"crc\" DevicePath \"\"" Mar 13 10:22:28 crc kubenswrapper[4841]: I0313 10:22:28.738575 4841 scope.go:117] "RemoveContainer" containerID="7d4f1111c20666d9ac207f5c6221147d8b2363234fadfdac3eaa5dd95460f2a4" Mar 13 10:22:28 crc kubenswrapper[4841]: I0313 10:22:28.738635 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9mxpq/crc-debug-6nzrx" Mar 13 10:22:30 crc kubenswrapper[4841]: I0313 10:22:30.006537 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34a21c54-2659-4c9a-93bb-ee9c129677ed" path="/var/lib/kubelet/pods/34a21c54-2659-4c9a-93bb-ee9c129677ed/volumes" Mar 13 10:23:34 crc kubenswrapper[4841]: I0313 10:23:34.407044 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 10:23:34 crc kubenswrapper[4841]: I0313 10:23:34.407659 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 10:23:44 crc kubenswrapper[4841]: I0313 10:23:44.881179 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0812602d-3596-4cda-b90a-d2f76f67bf52/init-config-reloader/0.log" Mar 13 10:23:45 crc kubenswrapper[4841]: I0313 10:23:45.129570 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0812602d-3596-4cda-b90a-d2f76f67bf52/init-config-reloader/0.log" Mar 13 10:23:45 crc kubenswrapper[4841]: I0313 10:23:45.181490 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0812602d-3596-4cda-b90a-d2f76f67bf52/alertmanager/0.log" Mar 13 10:23:45 crc kubenswrapper[4841]: I0313 10:23:45.237000 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0812602d-3596-4cda-b90a-d2f76f67bf52/config-reloader/0.log" Mar 13 
10:23:45 crc kubenswrapper[4841]: I0313 10:23:45.318817 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_46839f95-04c1-47d3-b63c-c9e2d80b681a/aodh-api/0.log" Mar 13 10:23:45 crc kubenswrapper[4841]: I0313 10:23:45.367818 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_46839f95-04c1-47d3-b63c-c9e2d80b681a/aodh-evaluator/0.log" Mar 13 10:23:45 crc kubenswrapper[4841]: I0313 10:23:45.421971 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_46839f95-04c1-47d3-b63c-c9e2d80b681a/aodh-listener/0.log" Mar 13 10:23:45 crc kubenswrapper[4841]: I0313 10:23:45.440319 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_46839f95-04c1-47d3-b63c-c9e2d80b681a/aodh-notifier/0.log" Mar 13 10:23:45 crc kubenswrapper[4841]: I0313 10:23:45.564383 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5dbfbd46f8-tjjrf_728289d9-1ed1-449a-99e7-85da0a025366/barbican-api/0.log" Mar 13 10:23:45 crc kubenswrapper[4841]: I0313 10:23:45.609492 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5dbfbd46f8-tjjrf_728289d9-1ed1-449a-99e7-85da0a025366/barbican-api-log/0.log" Mar 13 10:23:45 crc kubenswrapper[4841]: I0313 10:23:45.794023 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-d94989454-4npv2_0a624af3-f727-4d7e-8b59-6c45863bfcea/barbican-keystone-listener/0.log" Mar 13 10:23:45 crc kubenswrapper[4841]: I0313 10:23:45.844339 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-d94989454-4npv2_0a624af3-f727-4d7e-8b59-6c45863bfcea/barbican-keystone-listener-log/0.log" Mar 13 10:23:45 crc kubenswrapper[4841]: I0313 10:23:45.947950 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-fdcb67bff-tvnvp_6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd/barbican-worker/0.log" Mar 13 10:23:45 
crc kubenswrapper[4841]: I0313 10:23:45.995651 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-fdcb67bff-tvnvp_6a1295b1-dcb3-4a4b-9d53-9c202cc78dcd/barbican-worker-log/0.log" Mar 13 10:23:46 crc kubenswrapper[4841]: I0313 10:23:46.148744 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-qfrz6_c3c0bc1a-b192-44f6-a237-9242d36513ce/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:23:46 crc kubenswrapper[4841]: I0313 10:23:46.241604 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_661cdaa4-34e4-47df-9bdb-95d67c012cff/ceilometer-central-agent/0.log" Mar 13 10:23:46 crc kubenswrapper[4841]: I0313 10:23:46.354013 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_661cdaa4-34e4-47df-9bdb-95d67c012cff/ceilometer-notification-agent/0.log" Mar 13 10:23:46 crc kubenswrapper[4841]: I0313 10:23:46.386936 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_661cdaa4-34e4-47df-9bdb-95d67c012cff/proxy-httpd/0.log" Mar 13 10:23:46 crc kubenswrapper[4841]: I0313 10:23:46.393385 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_661cdaa4-34e4-47df-9bdb-95d67c012cff/sg-core/0.log" Mar 13 10:23:46 crc kubenswrapper[4841]: I0313 10:23:46.628738 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_88dbe267-3d86-4bcd-8654-79392e0c502d/cinder-api-log/0.log" Mar 13 10:23:46 crc kubenswrapper[4841]: I0313 10:23:46.658295 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_88dbe267-3d86-4bcd-8654-79392e0c502d/cinder-api/0.log" Mar 13 10:23:46 crc kubenswrapper[4841]: I0313 10:23:46.751233 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_626bc701-ba99-4de7-a2f9-b42eb150a783/cinder-scheduler/0.log" Mar 13 10:23:46 crc 
kubenswrapper[4841]: I0313 10:23:46.851578 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_626bc701-ba99-4de7-a2f9-b42eb150a783/probe/0.log" Mar 13 10:23:46 crc kubenswrapper[4841]: I0313 10:23:46.907638 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-8smcd_59271a3d-6406-4e1f-a783-ba324ef8dece/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:23:47 crc kubenswrapper[4841]: I0313 10:23:47.120355 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-lwz9p_f3c90f3c-6382-4a13-b4cd-515cfe68538e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:23:47 crc kubenswrapper[4841]: I0313 10:23:47.135118 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-6rf6l_2c632011-0a35-4eaa-a7f5-8d86466858ca/init/0.log" Mar 13 10:23:47 crc kubenswrapper[4841]: I0313 10:23:47.359993 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-gb7dk_511f249d-a5ba-4a19-a5b6-16b5c75fe538/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:23:47 crc kubenswrapper[4841]: I0313 10:23:47.375100 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-6rf6l_2c632011-0a35-4eaa-a7f5-8d86466858ca/init/0.log" Mar 13 10:23:47 crc kubenswrapper[4841]: I0313 10:23:47.375214 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-6rf6l_2c632011-0a35-4eaa-a7f5-8d86466858ca/dnsmasq-dns/0.log" Mar 13 10:23:47 crc kubenswrapper[4841]: I0313 10:23:47.814643 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9/glance-log/0.log" Mar 13 10:23:47 crc kubenswrapper[4841]: I0313 10:23:47.829200 4841 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3ea1c7f2-f09a-4f0a-bead-e767aa9c05c9/glance-httpd/0.log" Mar 13 10:23:47 crc kubenswrapper[4841]: I0313 10:23:47.993005 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c0a786f1-b124-492e-80a6-6b7df2ad7bd3/glance-httpd/0.log" Mar 13 10:23:48 crc kubenswrapper[4841]: I0313 10:23:48.025958 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c0a786f1-b124-492e-80a6-6b7df2ad7bd3/glance-log/0.log" Mar 13 10:23:48 crc kubenswrapper[4841]: I0313 10:23:48.512557 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-74f4d87c9f-bw7dr_f1ddc522-5255-4785-8d33-85a3d0e86af2/heat-engine/0.log" Mar 13 10:23:48 crc kubenswrapper[4841]: I0313 10:23:48.544835 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-bb2dv_9904117c-604f-48b6-9f6b-ef60210b0a94/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:23:48 crc kubenswrapper[4841]: I0313 10:23:48.583057 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-5ff95dc669-rzhtr_595c0935-7197-4c48-be0d-8a3ad4d6442d/heat-api/0.log" Mar 13 10:23:48 crc kubenswrapper[4841]: I0313 10:23:48.685107 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-fb97f87fc-9tb45_6b447cf0-3120-4329-9dbf-534fd45e70bf/heat-cfnapi/0.log" Mar 13 10:23:48 crc kubenswrapper[4841]: I0313 10:23:48.794210 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-szndw_6e611c0f-aa46-4280-ae2d-bdff4bf61b60/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:23:48 crc kubenswrapper[4841]: I0313 10:23:48.897827 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-7b87d5dbb8-7ppv5_9ef13028-1aeb-4a08-b241-fa033413b353/keystone-api/0.log" Mar 13 10:23:48 crc kubenswrapper[4841]: I0313 10:23:48.918556 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29556601-kkh2k_d8a76cc0-6588-4160-8580-766a47f207e6/keystone-cron/0.log" Mar 13 10:23:49 crc kubenswrapper[4841]: I0313 10:23:49.067976 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_8b082eb0-dc81-49f6-a313-07507e296c71/kube-state-metrics/0.log" Mar 13 10:23:49 crc kubenswrapper[4841]: I0313 10:23:49.160365 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-xcmkq_4695a6ba-f70e-43c5-9f8f-b8fcbba9fd77/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:23:49 crc kubenswrapper[4841]: I0313 10:23:49.411397 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d768747c7-2ssnn_d209e4b8-27eb-4fea-ad65-807001e8638c/neutron-api/0.log" Mar 13 10:23:49 crc kubenswrapper[4841]: I0313 10:23:49.498120 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d768747c7-2ssnn_d209e4b8-27eb-4fea-ad65-807001e8638c/neutron-httpd/0.log" Mar 13 10:23:49 crc kubenswrapper[4841]: I0313 10:23:49.696137 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-zt7rv_dc97298b-a706-488a-9ea1-e90de447c754/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:23:50 crc kubenswrapper[4841]: I0313 10:23:50.016796 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_7666c55c-f424-4bfd-a143-e768e534b721/nova-api-log/0.log" Mar 13 10:23:50 crc kubenswrapper[4841]: I0313 10:23:50.076873 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_96d3458e-2994-4e4a-97b9-738366b67d8e/nova-cell0-conductor-conductor/0.log" Mar 13 10:23:50 crc kubenswrapper[4841]: I0313 10:23:50.203309 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_7666c55c-f424-4bfd-a143-e768e534b721/nova-api-api/0.log" Mar 13 10:23:50 crc kubenswrapper[4841]: I0313 10:23:50.480726 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_8f70f1b2-8e4f-4738-9b35-2a5e75f92988/nova-cell1-conductor-conductor/0.log" Mar 13 10:23:50 crc kubenswrapper[4841]: I0313 10:23:50.573106 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_c1840df1-8c0f-4038-9389-eaf2bcc61705/nova-cell1-novncproxy-novncproxy/0.log" Mar 13 10:23:50 crc kubenswrapper[4841]: I0313 10:23:50.591010 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-v22pp_7f7ae341-a1c6-49f6-825c-4c47b14141f4/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:23:50 crc kubenswrapper[4841]: I0313 10:23:50.787509 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ede0b909-963d-4110-8d86-b09095cbd08c/nova-metadata-log/0.log" Mar 13 10:23:51 crc kubenswrapper[4841]: I0313 10:23:51.586567 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_77aa8bf5-4386-4d85-8cca-75c90d5b2593/mysql-bootstrap/0.log" Mar 13 10:23:51 crc kubenswrapper[4841]: I0313 10:23:51.615338 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d3fdad05-7998-4fe3-a774-61cdaa01e27f/nova-scheduler-scheduler/0.log" Mar 13 10:23:51 crc kubenswrapper[4841]: I0313 10:23:51.860386 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_77aa8bf5-4386-4d85-8cca-75c90d5b2593/mysql-bootstrap/0.log" Mar 13 10:23:51 crc kubenswrapper[4841]: 
I0313 10:23:51.869395 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_77aa8bf5-4386-4d85-8cca-75c90d5b2593/galera/0.log" Mar 13 10:23:52 crc kubenswrapper[4841]: I0313 10:23:52.100611 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_125cd366-c483-4efa-a55f-85b888bf6266/mysql-bootstrap/0.log" Mar 13 10:23:52 crc kubenswrapper[4841]: I0313 10:23:52.233639 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ede0b909-963d-4110-8d86-b09095cbd08c/nova-metadata-metadata/0.log" Mar 13 10:23:52 crc kubenswrapper[4841]: I0313 10:23:52.246075 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_125cd366-c483-4efa-a55f-85b888bf6266/mysql-bootstrap/0.log" Mar 13 10:23:52 crc kubenswrapper[4841]: I0313 10:23:52.300166 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_125cd366-c483-4efa-a55f-85b888bf6266/galera/0.log" Mar 13 10:23:52 crc kubenswrapper[4841]: I0313 10:23:52.414451 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_2d802466-0b65-4820-865c-8ae969af527f/openstackclient/0.log" Mar 13 10:23:52 crc kubenswrapper[4841]: I0313 10:23:52.564931 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-bqlfl_b2bf634d-aa4f-4773-91ee-99616e217c82/ovn-controller/0.log" Mar 13 10:23:52 crc kubenswrapper[4841]: I0313 10:23:52.617629 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-66hkf_fb25dd49-bdd9-46c0-816f-5de963506142/openstack-network-exporter/0.log" Mar 13 10:23:53 crc kubenswrapper[4841]: I0313 10:23:53.241511 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-b2w62_84f12283-3c15-408e-a1a2-691c257434ca/ovsdb-server-init/0.log" Mar 13 10:23:53 crc kubenswrapper[4841]: I0313 10:23:53.450992 4841 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-b2w62_84f12283-3c15-408e-a1a2-691c257434ca/ovs-vswitchd/0.log" Mar 13 10:23:53 crc kubenswrapper[4841]: I0313 10:23:53.463130 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-b2w62_84f12283-3c15-408e-a1a2-691c257434ca/ovsdb-server-init/0.log" Mar 13 10:23:53 crc kubenswrapper[4841]: I0313 10:23:53.504905 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-b2w62_84f12283-3c15-408e-a1a2-691c257434ca/ovsdb-server/0.log" Mar 13 10:23:53 crc kubenswrapper[4841]: I0313 10:23:53.717805 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_32dcacb9-78d3-4dd5-95e4-6d069bddc9e3/openstack-network-exporter/0.log" Mar 13 10:23:53 crc kubenswrapper[4841]: I0313 10:23:53.760730 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-59f4q_de039f4c-0550-4464-b901-a624fac40281/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:23:53 crc kubenswrapper[4841]: I0313 10:23:53.852534 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_32dcacb9-78d3-4dd5-95e4-6d069bddc9e3/ovn-northd/0.log" Mar 13 10:23:54 crc kubenswrapper[4841]: I0313 10:23:54.000963 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_984ac552-8ac1-4cbf-ada9-10a9dc02acd9/openstack-network-exporter/0.log" Mar 13 10:23:54 crc kubenswrapper[4841]: I0313 10:23:54.027603 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_984ac552-8ac1-4cbf-ada9-10a9dc02acd9/ovsdbserver-nb/0.log" Mar 13 10:23:54 crc kubenswrapper[4841]: I0313 10:23:54.207256 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5f3149e4-fc32-4773-ac07-785c8d11888e/openstack-network-exporter/0.log" Mar 13 10:23:54 crc kubenswrapper[4841]: I0313 
10:23:54.209404 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5f3149e4-fc32-4773-ac07-785c8d11888e/ovsdbserver-sb/0.log" Mar 13 10:23:54 crc kubenswrapper[4841]: I0313 10:23:54.462415 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-794cb978db-w646s_b4e4f623-6788-4651-95dd-d4fdab2d2b37/placement-api/0.log" Mar 13 10:23:54 crc kubenswrapper[4841]: I0313 10:23:54.473705 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-794cb978db-w646s_b4e4f623-6788-4651-95dd-d4fdab2d2b37/placement-log/0.log" Mar 13 10:23:54 crc kubenswrapper[4841]: I0313 10:23:54.481114 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7/init-config-reloader/0.log" Mar 13 10:23:54 crc kubenswrapper[4841]: I0313 10:23:54.746777 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7/config-reloader/0.log" Mar 13 10:23:54 crc kubenswrapper[4841]: I0313 10:23:54.758511 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7/init-config-reloader/0.log" Mar 13 10:23:54 crc kubenswrapper[4841]: I0313 10:23:54.811821 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7/thanos-sidecar/0.log" Mar 13 10:23:54 crc kubenswrapper[4841]: I0313 10:23:54.813978 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_cb557a31-102e-4dea-b5b9-0fc1f3aaa5c7/prometheus/0.log" Mar 13 10:23:54 crc kubenswrapper[4841]: I0313 10:23:54.955747 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_469aec79-a7a3-4ae1-b00a-94f47a6d4df9/setup-container/0.log" Mar 13 10:23:55 crc 
kubenswrapper[4841]: I0313 10:23:55.365441 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_469aec79-a7a3-4ae1-b00a-94f47a6d4df9/setup-container/0.log" Mar 13 10:23:55 crc kubenswrapper[4841]: I0313 10:23:55.421338 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_469aec79-a7a3-4ae1-b00a-94f47a6d4df9/rabbitmq/0.log" Mar 13 10:23:55 crc kubenswrapper[4841]: I0313 10:23:55.504599 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b5efe6ff-d5eb-4fa9-9496-1838d05f625a/setup-container/0.log" Mar 13 10:23:55 crc kubenswrapper[4841]: I0313 10:23:55.679607 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b5efe6ff-d5eb-4fa9-9496-1838d05f625a/setup-container/0.log" Mar 13 10:23:55 crc kubenswrapper[4841]: I0313 10:23:55.771787 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-kw7xn_23e2fc94-fce0-4eeb-8a78-15e934c02371/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:23:55 crc kubenswrapper[4841]: I0313 10:23:55.977184 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-t97xr_b538d385-dcf3-477e-b014-4b304c0be557/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:23:56 crc kubenswrapper[4841]: I0313 10:23:56.149058 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-sdg96_413c3ede-4bdb-444c-b90d-5b07c5507a52/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:23:56 crc kubenswrapper[4841]: I0313 10:23:56.334057 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-c9rjj_95faf019-c6d4-4016-87ac-66c7762e56c4/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:23:56 crc kubenswrapper[4841]: I0313 
10:23:56.458921 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-9wxsh_eb824d89-fddc-4746-8b53-a0f3d5e42082/ssh-known-hosts-edpm-deployment/0.log" Mar 13 10:23:56 crc kubenswrapper[4841]: I0313 10:23:56.717525 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7bbc77c95-pfg84_c7862f13-896f-480f-add9-376c2a96fdd7/proxy-server/0.log" Mar 13 10:23:56 crc kubenswrapper[4841]: I0313 10:23:56.871763 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7bbc77c95-pfg84_c7862f13-896f-480f-add9-376c2a96fdd7/proxy-httpd/0.log" Mar 13 10:23:56 crc kubenswrapper[4841]: I0313 10:23:56.921930 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-sqrfd_f66d8c2c-71a2-4927-a708-4b1412d0243c/swift-ring-rebalance/0.log" Mar 13 10:23:57 crc kubenswrapper[4841]: I0313 10:23:57.141152 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/account-auditor/0.log" Mar 13 10:23:57 crc kubenswrapper[4841]: I0313 10:23:57.198419 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/account-reaper/0.log" Mar 13 10:23:57 crc kubenswrapper[4841]: I0313 10:23:57.367175 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/account-replicator/0.log" Mar 13 10:23:57 crc kubenswrapper[4841]: I0313 10:23:57.409341 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/container-auditor/0.log" Mar 13 10:23:57 crc kubenswrapper[4841]: I0313 10:23:57.429394 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/account-server/0.log" Mar 13 10:23:57 crc kubenswrapper[4841]: I0313 
10:23:57.547574 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b5efe6ff-d5eb-4fa9-9496-1838d05f625a/rabbitmq/0.log" Mar 13 10:23:57 crc kubenswrapper[4841]: I0313 10:23:57.593682 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/container-replicator/0.log" Mar 13 10:23:57 crc kubenswrapper[4841]: I0313 10:23:57.612132 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/container-server/0.log" Mar 13 10:23:57 crc kubenswrapper[4841]: I0313 10:23:57.691469 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/container-updater/0.log" Mar 13 10:23:57 crc kubenswrapper[4841]: I0313 10:23:57.791100 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/object-expirer/0.log" Mar 13 10:23:57 crc kubenswrapper[4841]: I0313 10:23:57.800594 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/object-auditor/0.log" Mar 13 10:23:57 crc kubenswrapper[4841]: I0313 10:23:57.854554 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/object-replicator/0.log" Mar 13 10:23:58 crc kubenswrapper[4841]: I0313 10:23:58.029905 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/rsync/0.log" Mar 13 10:23:58 crc kubenswrapper[4841]: I0313 10:23:58.042439 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/object-server/0.log" Mar 13 10:23:58 crc kubenswrapper[4841]: I0313 10:23:58.047160 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/object-updater/0.log" Mar 13 10:23:58 crc kubenswrapper[4841]: I0313 10:23:58.143657 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64f62ec7-7a91-458c-86cb-7658544e4a51/swift-recon-cron/0.log" Mar 13 10:23:58 crc kubenswrapper[4841]: I0313 10:23:58.336019 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-nw4lh_2bdd30cd-856d-44bd-8a1f-b68c7291b0ac/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:23:58 crc kubenswrapper[4841]: I0313 10:23:58.349357 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-sgqnx_a14a214a-62da-44fc-b3d3-749fff9b3645/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 10:24:00 crc kubenswrapper[4841]: I0313 10:24:00.152682 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556624-5x2jt"] Mar 13 10:24:00 crc kubenswrapper[4841]: E0313 10:24:00.153546 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34a21c54-2659-4c9a-93bb-ee9c129677ed" containerName="container-00" Mar 13 10:24:00 crc kubenswrapper[4841]: I0313 10:24:00.153563 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="34a21c54-2659-4c9a-93bb-ee9c129677ed" containerName="container-00" Mar 13 10:24:00 crc kubenswrapper[4841]: I0313 10:24:00.153891 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="34a21c54-2659-4c9a-93bb-ee9c129677ed" containerName="container-00" Mar 13 10:24:00 crc kubenswrapper[4841]: I0313 10:24:00.154790 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556624-5x2jt" Mar 13 10:24:00 crc kubenswrapper[4841]: I0313 10:24:00.157022 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 10:24:00 crc kubenswrapper[4841]: I0313 10:24:00.157293 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 10:24:00 crc kubenswrapper[4841]: I0313 10:24:00.157501 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 10:24:00 crc kubenswrapper[4841]: I0313 10:24:00.164528 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556624-5x2jt"] Mar 13 10:24:00 crc kubenswrapper[4841]: I0313 10:24:00.326050 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct6z8\" (UniqueName: \"kubernetes.io/projected/e65a6577-0c06-4d17-8399-8847b7c795cc-kube-api-access-ct6z8\") pod \"auto-csr-approver-29556624-5x2jt\" (UID: \"e65a6577-0c06-4d17-8399-8847b7c795cc\") " pod="openshift-infra/auto-csr-approver-29556624-5x2jt" Mar 13 10:24:00 crc kubenswrapper[4841]: I0313 10:24:00.428333 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct6z8\" (UniqueName: \"kubernetes.io/projected/e65a6577-0c06-4d17-8399-8847b7c795cc-kube-api-access-ct6z8\") pod \"auto-csr-approver-29556624-5x2jt\" (UID: \"e65a6577-0c06-4d17-8399-8847b7c795cc\") " pod="openshift-infra/auto-csr-approver-29556624-5x2jt" Mar 13 10:24:00 crc kubenswrapper[4841]: I0313 10:24:00.448769 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct6z8\" (UniqueName: \"kubernetes.io/projected/e65a6577-0c06-4d17-8399-8847b7c795cc-kube-api-access-ct6z8\") pod \"auto-csr-approver-29556624-5x2jt\" (UID: \"e65a6577-0c06-4d17-8399-8847b7c795cc\") " 
pod="openshift-infra/auto-csr-approver-29556624-5x2jt" Mar 13 10:24:00 crc kubenswrapper[4841]: I0313 10:24:00.474428 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556624-5x2jt" Mar 13 10:24:00 crc kubenswrapper[4841]: I0313 10:24:00.990551 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556624-5x2jt"] Mar 13 10:24:01 crc kubenswrapper[4841]: I0313 10:24:01.002630 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 10:24:01 crc kubenswrapper[4841]: I0313 10:24:01.599377 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556624-5x2jt" event={"ID":"e65a6577-0c06-4d17-8399-8847b7c795cc","Type":"ContainerStarted","Data":"4b49efbfd96581fd525924c409c53504730f8288d8bb6b8113fe60bc27a8b005"} Mar 13 10:24:02 crc kubenswrapper[4841]: I0313 10:24:02.647827 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556624-5x2jt" event={"ID":"e65a6577-0c06-4d17-8399-8847b7c795cc","Type":"ContainerStarted","Data":"c1690b1c1975618d5688398180eb92ef3c17d06179cd7f59a8fa40daed01fc7c"} Mar 13 10:24:02 crc kubenswrapper[4841]: I0313 10:24:02.682107 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556624-5x2jt" podStartSLOduration=1.624483305 podStartE2EDuration="2.682073676s" podCreationTimestamp="2026-03-13 10:24:00 +0000 UTC" firstStartedPulling="2026-03-13 10:24:01.002355989 +0000 UTC m=+4323.732256180" lastFinishedPulling="2026-03-13 10:24:02.05994636 +0000 UTC m=+4324.789846551" observedRunningTime="2026-03-13 10:24:02.680465295 +0000 UTC m=+4325.410365496" watchObservedRunningTime="2026-03-13 10:24:02.682073676 +0000 UTC m=+4325.411973867" Mar 13 10:24:03 crc kubenswrapper[4841]: I0313 10:24:03.686038 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="e65a6577-0c06-4d17-8399-8847b7c795cc" containerID="c1690b1c1975618d5688398180eb92ef3c17d06179cd7f59a8fa40daed01fc7c" exitCode=0 Mar 13 10:24:03 crc kubenswrapper[4841]: I0313 10:24:03.686530 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556624-5x2jt" event={"ID":"e65a6577-0c06-4d17-8399-8847b7c795cc","Type":"ContainerDied","Data":"c1690b1c1975618d5688398180eb92ef3c17d06179cd7f59a8fa40daed01fc7c"} Mar 13 10:24:04 crc kubenswrapper[4841]: I0313 10:24:04.407545 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 10:24:04 crc kubenswrapper[4841]: I0313 10:24:04.407598 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 10:24:05 crc kubenswrapper[4841]: I0313 10:24:05.111502 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556624-5x2jt" Mar 13 10:24:05 crc kubenswrapper[4841]: I0313 10:24:05.236354 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct6z8\" (UniqueName: \"kubernetes.io/projected/e65a6577-0c06-4d17-8399-8847b7c795cc-kube-api-access-ct6z8\") pod \"e65a6577-0c06-4d17-8399-8847b7c795cc\" (UID: \"e65a6577-0c06-4d17-8399-8847b7c795cc\") " Mar 13 10:24:05 crc kubenswrapper[4841]: I0313 10:24:05.252512 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e65a6577-0c06-4d17-8399-8847b7c795cc-kube-api-access-ct6z8" (OuterVolumeSpecName: "kube-api-access-ct6z8") pod "e65a6577-0c06-4d17-8399-8847b7c795cc" (UID: "e65a6577-0c06-4d17-8399-8847b7c795cc"). InnerVolumeSpecName "kube-api-access-ct6z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:24:05 crc kubenswrapper[4841]: I0313 10:24:05.338368 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct6z8\" (UniqueName: \"kubernetes.io/projected/e65a6577-0c06-4d17-8399-8847b7c795cc-kube-api-access-ct6z8\") on node \"crc\" DevicePath \"\"" Mar 13 10:24:05 crc kubenswrapper[4841]: I0313 10:24:05.707766 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556624-5x2jt" event={"ID":"e65a6577-0c06-4d17-8399-8847b7c795cc","Type":"ContainerDied","Data":"4b49efbfd96581fd525924c409c53504730f8288d8bb6b8113fe60bc27a8b005"} Mar 13 10:24:05 crc kubenswrapper[4841]: I0313 10:24:05.707800 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b49efbfd96581fd525924c409c53504730f8288d8bb6b8113fe60bc27a8b005" Mar 13 10:24:05 crc kubenswrapper[4841]: I0313 10:24:05.707852 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556624-5x2jt" Mar 13 10:24:05 crc kubenswrapper[4841]: I0313 10:24:05.755624 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556618-6gln8"] Mar 13 10:24:05 crc kubenswrapper[4841]: I0313 10:24:05.766279 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556618-6gln8"] Mar 13 10:24:06 crc kubenswrapper[4841]: I0313 10:24:06.010590 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95c6ef1a-2697-4a89-91d8-8e23761f388a" path="/var/lib/kubelet/pods/95c6ef1a-2697-4a89-91d8-8e23761f388a/volumes" Mar 13 10:24:07 crc kubenswrapper[4841]: I0313 10:24:07.451969 4841 scope.go:117] "RemoveContainer" containerID="cc433097b925f45b6e7d151abe992c9429f0a171df09d9aa5269aa6d3e2bcb60" Mar 13 10:24:07 crc kubenswrapper[4841]: I0313 10:24:07.759060 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_831e87d6-8c27-4e98-8b3e-e6be93a93e51/memcached/0.log" Mar 13 10:24:30 crc kubenswrapper[4841]: I0313 10:24:30.073224 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq_a8b2962c-7f7d-4d5b-9982-1668d185c680/util/0.log" Mar 13 10:24:30 crc kubenswrapper[4841]: I0313 10:24:30.315393 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq_a8b2962c-7f7d-4d5b-9982-1668d185c680/pull/0.log" Mar 13 10:24:30 crc kubenswrapper[4841]: I0313 10:24:30.327220 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq_a8b2962c-7f7d-4d5b-9982-1668d185c680/util/0.log" Mar 13 10:24:30 crc kubenswrapper[4841]: I0313 10:24:30.382075 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq_a8b2962c-7f7d-4d5b-9982-1668d185c680/pull/0.log" Mar 13 10:24:30 crc kubenswrapper[4841]: I0313 10:24:30.468045 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq_a8b2962c-7f7d-4d5b-9982-1668d185c680/util/0.log" Mar 13 10:24:30 crc kubenswrapper[4841]: I0313 10:24:30.497581 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq_a8b2962c-7f7d-4d5b-9982-1668d185c680/pull/0.log" Mar 13 10:24:30 crc kubenswrapper[4841]: I0313 10:24:30.514575 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_230f8a5cf963be60d60969973ba2480149b70bdfbb1e6b279e83774d63hrnjq_a8b2962c-7f7d-4d5b-9982-1668d185c680/extract/0.log" Mar 13 10:24:31 crc kubenswrapper[4841]: I0313 10:24:31.500545 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-kc2zl_c6c9dfcd-5298-468b-9de2-0280bf525b61/manager/0.log" Mar 13 10:24:31 crc kubenswrapper[4841]: I0313 10:24:31.823698 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-7lhjg_46ca7c55-bd68-4454-a014-85f81f1b5a60/manager/0.log" Mar 13 10:24:32 crc kubenswrapper[4841]: I0313 10:24:32.162111 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-hm824_e6b9c8a5-3093-4d94-ad46-cd682158fdf8/manager/0.log" Mar 13 10:24:32 crc kubenswrapper[4841]: I0313 10:24:32.583710 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-grn97_bad08b57-dde0-496d-8ea1-5845a52d517a/manager/0.log" Mar 13 10:24:33 crc kubenswrapper[4841]: I0313 
10:24:33.054663 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-cgkfz_7bd9dd2f-b4fd-4078-b463-4e970fa6791d/manager/0.log" Mar 13 10:24:33 crc kubenswrapper[4841]: I0313 10:24:33.064245 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-gb4dz_98cec4ba-d672-4627-8d37-46a0684fc284/manager/0.log" Mar 13 10:24:33 crc kubenswrapper[4841]: I0313 10:24:33.397573 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-56nwc_4a76403b-081b-4222-a707-4cd00dd440a0/manager/0.log" Mar 13 10:24:33 crc kubenswrapper[4841]: I0313 10:24:33.408765 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-vtz8l_26236923-39c0-4b46-be0d-61f453533891/manager/0.log" Mar 13 10:24:33 crc kubenswrapper[4841]: I0313 10:24:33.700837 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-9gcr6_44d006b0-b13e-49ce-8ff8-592f3d8798c1/manager/0.log" Mar 13 10:24:33 crc kubenswrapper[4841]: I0313 10:24:33.772512 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-rx4wh_90c1dec3-4daa-4ac6-b95e-209cb8bd9b55/manager/0.log" Mar 13 10:24:34 crc kubenswrapper[4841]: I0313 10:24:34.073307 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-8f2sj_8b77ae90-8ef1-4e98-9d32-319dfdd55a6d/manager/0.log" Mar 13 10:24:34 crc kubenswrapper[4841]: I0313 10:24:34.116292 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-chf9m_f34e0b2d-5c3c-4725-ae0c-760bf98e90d3/manager/0.log" Mar 13 10:24:34 crc kubenswrapper[4841]: 
I0313 10:24:34.238478 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-8tg7w_b877a309-e752-4f24-90cd-6901973263e3/manager/0.log" Mar 13 10:24:34 crc kubenswrapper[4841]: I0313 10:24:34.316747 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b7lccpq_06f0c42d-1674-4913-8a86-1d1749d8d601/manager/0.log" Mar 13 10:24:34 crc kubenswrapper[4841]: I0313 10:24:34.406718 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 10:24:34 crc kubenswrapper[4841]: I0313 10:24:34.406780 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 10:24:34 crc kubenswrapper[4841]: I0313 10:24:34.406838 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-h227v" Mar 13 10:24:34 crc kubenswrapper[4841]: I0313 10:24:34.407695 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1"} pod="openshift-machine-config-operator/machine-config-daemon-h227v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 10:24:34 crc kubenswrapper[4841]: I0313 10:24:34.407763 4841 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" containerID="cri-o://97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" gracePeriod=600 Mar 13 10:24:34 crc kubenswrapper[4841]: E0313 10:24:34.530503 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:24:34 crc kubenswrapper[4841]: I0313 10:24:34.630991 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6bc596d67-h66cj_08c8df77-ecb6-4f32-b9a1-b31bf7a0d1c4/operator/0.log" Mar 13 10:24:34 crc kubenswrapper[4841]: I0313 10:24:34.770092 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-n25n5_7290f225-3489-4643-916d-39a67a36acb2/registry-server/0.log" Mar 13 10:24:34 crc kubenswrapper[4841]: I0313 10:24:34.984855 4841 generic.go:334] "Generic (PLEG): container finished" podID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" exitCode=0 Mar 13 10:24:34 crc kubenswrapper[4841]: I0313 10:24:34.984902 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerDied","Data":"97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1"} Mar 13 10:24:34 crc kubenswrapper[4841]: I0313 10:24:34.984967 4841 scope.go:117] "RemoveContainer" 
containerID="0205971121b655349d1b80fae5f891f1a0ccfe47f9f479018486b9b62522d2a3" Mar 13 10:24:34 crc kubenswrapper[4841]: I0313 10:24:34.985666 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:24:34 crc kubenswrapper[4841]: E0313 10:24:34.985913 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:24:35 crc kubenswrapper[4841]: I0313 10:24:35.056880 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-j7v7h_3a7d5a0b-0bd7-4735-b182-8a78870050cf/manager/0.log" Mar 13 10:24:35 crc kubenswrapper[4841]: I0313 10:24:35.197491 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-bwvx6_49a6b3bc-4db3-4006-b033-cc9cfa0cb5fc/manager/0.log" Mar 13 10:24:35 crc kubenswrapper[4841]: I0313 10:24:35.269430 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-nd9d9_daeb73dc-4973-4a0b-906d-4afc7f61717c/operator/0.log" Mar 13 10:24:35 crc kubenswrapper[4841]: I0313 10:24:35.527957 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-r42dd_10b15182-cc2b-420b-9fc2-fe3ca6ea38d7/manager/0.log" Mar 13 10:24:35 crc kubenswrapper[4841]: I0313 10:24:35.775530 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-8v6pq_35cf5dc3-b1c0-4481-8f0f-8bca19ecadd1/manager/0.log" Mar 13 10:24:35 crc kubenswrapper[4841]: I0313 10:24:35.998364 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-pvtvx_f5ae27e8-47b9-437c-9506-f51da1b6c9f8/manager/0.log" Mar 13 10:24:36 crc kubenswrapper[4841]: I0313 10:24:36.002531 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-8485bdb9db-mf5lp_17824e5f-18b3-46c0-910a-56e5529e09c3/manager/0.log" Mar 13 10:24:36 crc kubenswrapper[4841]: I0313 10:24:36.810721 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-57ddc6f479-h7khw_2c86df2d-15dc-45f2-aca7-4200fdf36a53/manager/0.log" Mar 13 10:24:41 crc kubenswrapper[4841]: I0313 10:24:41.514133 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-fkfjh_9db8c27e-023c-4e28-a381-24f4438a6add/manager/0.log" Mar 13 10:24:48 crc kubenswrapper[4841]: I0313 10:24:48.003734 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:24:48 crc kubenswrapper[4841]: E0313 10:24:48.004464 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:24:58 crc kubenswrapper[4841]: I0313 10:24:58.636124 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-wh6kv_adeab6d7-21b6-4ef2-afdb-75854f0914c5/control-plane-machine-set-operator/0.log" Mar 13 10:24:58 crc kubenswrapper[4841]: I0313 10:24:58.812135 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qlg79_747a24f9-e654-421a-8da1-0be0aa6ccd9b/kube-rbac-proxy/0.log" Mar 13 10:24:58 crc kubenswrapper[4841]: I0313 10:24:58.847232 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qlg79_747a24f9-e654-421a-8da1-0be0aa6ccd9b/machine-api-operator/0.log" Mar 13 10:25:02 crc kubenswrapper[4841]: I0313 10:25:02.996325 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:25:02 crc kubenswrapper[4841]: E0313 10:25:02.997546 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:25:12 crc kubenswrapper[4841]: I0313 10:25:12.514000 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-crfbh_0f26e065-6f9d-4f61-a645-ea11d7f0eb85/cert-manager-controller/0.log" Mar 13 10:25:12 crc kubenswrapper[4841]: I0313 10:25:12.667721 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-dqzmg_b71327b9-8538-404b-b37d-cfb16da13ce4/cert-manager-cainjector/0.log" Mar 13 10:25:12 crc kubenswrapper[4841]: I0313 10:25:12.752350 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-vqq7j_318e486a-97f3-45fb-84b7-816009810d33/cert-manager-webhook/0.log" Mar 13 10:25:14 crc kubenswrapper[4841]: I0313 10:25:14.364383 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:25:14 crc kubenswrapper[4841]: E0313 10:25:14.365154 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:25:28 crc kubenswrapper[4841]: I0313 10:25:28.005211 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:25:28 crc kubenswrapper[4841]: E0313 10:25:28.006112 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:25:29 crc kubenswrapper[4841]: I0313 10:25:29.904402 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-c69x9_c5cc85e5-25ab-4bc7-ae34-86e1b455a8c2/nmstate-handler/0.log" Mar 13 10:25:30 crc kubenswrapper[4841]: I0313 10:25:30.055614 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-4tq4s_80edf45c-fbb9-4761-995a-010a15e0b1dc/nmstate-console-plugin/0.log" Mar 13 10:25:30 crc kubenswrapper[4841]: 
I0313 10:25:30.202086 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-c2r92_4b560358-f566-41b2-a5da-89b9b3c173f3/kube-rbac-proxy/0.log" Mar 13 10:25:30 crc kubenswrapper[4841]: I0313 10:25:30.222521 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-c2r92_4b560358-f566-41b2-a5da-89b9b3c173f3/nmstate-metrics/0.log" Mar 13 10:25:30 crc kubenswrapper[4841]: I0313 10:25:30.319582 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-w54ng_1422b359-9a6d-430e-8cb6-5cf498e32422/nmstate-operator/0.log" Mar 13 10:25:30 crc kubenswrapper[4841]: I0313 10:25:30.986681 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-nhjzn_1080bc76-f294-4c2b-8a4b-165d657a4057/nmstate-webhook/0.log" Mar 13 10:25:42 crc kubenswrapper[4841]: I0313 10:25:42.994474 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:25:42 crc kubenswrapper[4841]: E0313 10:25:42.995363 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:25:51 crc kubenswrapper[4841]: I0313 10:25:51.292914 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-z7wl5_08be6515-b41c-481b-ba89-b939e4cfa067/prometheus-operator/0.log" Mar 13 10:25:51 crc kubenswrapper[4841]: I0313 10:25:51.474427 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5586657968-646dp_3304bfd0-8191-45c7-8c50-f16e137a6de8/prometheus-operator-admission-webhook/0.log" Mar 13 10:25:51 crc kubenswrapper[4841]: I0313 10:25:51.572976 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5586657968-njcw5_3fde31d7-89e1-4aa5-a848-2b018eae16b1/prometheus-operator-admission-webhook/0.log" Mar 13 10:25:51 crc kubenswrapper[4841]: I0313 10:25:51.650948 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-5vnkd_64eb3c86-385d-45d5-8dee-df851d8c3a74/operator/0.log" Mar 13 10:25:51 crc kubenswrapper[4841]: I0313 10:25:51.765548 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-r5b4d_b49125a7-562a-421b-b5eb-126312e6e85d/perses-operator/0.log" Mar 13 10:25:56 crc kubenswrapper[4841]: I0313 10:25:56.110152 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:25:56 crc kubenswrapper[4841]: E0313 10:25:56.120451 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:26:00 crc kubenswrapper[4841]: I0313 10:26:00.143977 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556626-525sx"] Mar 13 10:26:00 crc kubenswrapper[4841]: E0313 10:26:00.145347 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e65a6577-0c06-4d17-8399-8847b7c795cc" containerName="oc" Mar 13 10:26:00 
crc kubenswrapper[4841]: I0313 10:26:00.145367 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e65a6577-0c06-4d17-8399-8847b7c795cc" containerName="oc" Mar 13 10:26:00 crc kubenswrapper[4841]: I0313 10:26:00.145652 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e65a6577-0c06-4d17-8399-8847b7c795cc" containerName="oc" Mar 13 10:26:00 crc kubenswrapper[4841]: I0313 10:26:00.146697 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556626-525sx" Mar 13 10:26:00 crc kubenswrapper[4841]: I0313 10:26:00.149222 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 10:26:00 crc kubenswrapper[4841]: I0313 10:26:00.149249 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 10:26:00 crc kubenswrapper[4841]: I0313 10:26:00.149276 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 10:26:00 crc kubenswrapper[4841]: I0313 10:26:00.155885 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556626-525sx"] Mar 13 10:26:00 crc kubenswrapper[4841]: I0313 10:26:00.275501 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtbdd\" (UniqueName: \"kubernetes.io/projected/0869bd1f-0261-4f36-8bb2-de2338d1d6fc-kube-api-access-dtbdd\") pod \"auto-csr-approver-29556626-525sx\" (UID: \"0869bd1f-0261-4f36-8bb2-de2338d1d6fc\") " pod="openshift-infra/auto-csr-approver-29556626-525sx" Mar 13 10:26:00 crc kubenswrapper[4841]: I0313 10:26:00.377153 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtbdd\" (UniqueName: \"kubernetes.io/projected/0869bd1f-0261-4f36-8bb2-de2338d1d6fc-kube-api-access-dtbdd\") pod \"auto-csr-approver-29556626-525sx\" 
(UID: \"0869bd1f-0261-4f36-8bb2-de2338d1d6fc\") " pod="openshift-infra/auto-csr-approver-29556626-525sx" Mar 13 10:26:00 crc kubenswrapper[4841]: I0313 10:26:00.793589 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtbdd\" (UniqueName: \"kubernetes.io/projected/0869bd1f-0261-4f36-8bb2-de2338d1d6fc-kube-api-access-dtbdd\") pod \"auto-csr-approver-29556626-525sx\" (UID: \"0869bd1f-0261-4f36-8bb2-de2338d1d6fc\") " pod="openshift-infra/auto-csr-approver-29556626-525sx" Mar 13 10:26:01 crc kubenswrapper[4841]: I0313 10:26:01.068757 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556626-525sx" Mar 13 10:26:01 crc kubenswrapper[4841]: I0313 10:26:01.518462 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556626-525sx"] Mar 13 10:26:02 crc kubenswrapper[4841]: I0313 10:26:02.212820 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556626-525sx" event={"ID":"0869bd1f-0261-4f36-8bb2-de2338d1d6fc","Type":"ContainerStarted","Data":"e388d37d047bd8c67c643ca53b71459ab274ed8ab8b00c6b55ca0126aac3c1da"} Mar 13 10:26:04 crc kubenswrapper[4841]: I0313 10:26:04.932929 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="77aa8bf5-4386-4d85-8cca-75c90d5b2593" containerName="galera" probeResult="failure" output="command timed out" Mar 13 10:26:04 crc kubenswrapper[4841]: I0313 10:26:04.966627 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="77aa8bf5-4386-4d85-8cca-75c90d5b2593" containerName="galera" probeResult="failure" output="command timed out" Mar 13 10:26:07 crc kubenswrapper[4841]: I0313 10:26:07.775388 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-6x8lt_69db2e0c-e892-4c3c-909b-3f7ba4d650bb/kube-rbac-proxy/0.log" Mar 
13 10:26:07 crc kubenswrapper[4841]: I0313 10:26:07.846054 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-6x8lt_69db2e0c-e892-4c3c-909b-3f7ba4d650bb/controller/0.log" Mar 13 10:26:08 crc kubenswrapper[4841]: I0313 10:26:08.024716 4841 generic.go:334] "Generic (PLEG): container finished" podID="0869bd1f-0261-4f36-8bb2-de2338d1d6fc" containerID="65558eab5d03a53db73913970f1fe54d498c44b6c62f18ec741a928af57043c6" exitCode=0 Mar 13 10:26:08 crc kubenswrapper[4841]: I0313 10:26:08.024975 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556626-525sx" event={"ID":"0869bd1f-0261-4f36-8bb2-de2338d1d6fc","Type":"ContainerDied","Data":"65558eab5d03a53db73913970f1fe54d498c44b6c62f18ec741a928af57043c6"} Mar 13 10:26:08 crc kubenswrapper[4841]: I0313 10:26:08.052174 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-frr-files/0.log" Mar 13 10:26:08 crc kubenswrapper[4841]: I0313 10:26:08.235710 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-frr-files/0.log" Mar 13 10:26:08 crc kubenswrapper[4841]: I0313 10:26:08.250896 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-reloader/0.log" Mar 13 10:26:08 crc kubenswrapper[4841]: I0313 10:26:08.263202 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-reloader/0.log" Mar 13 10:26:08 crc kubenswrapper[4841]: I0313 10:26:08.267324 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-metrics/0.log" Mar 13 10:26:08 crc kubenswrapper[4841]: I0313 10:26:08.444495 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-reloader/0.log" Mar 13 10:26:08 crc kubenswrapper[4841]: I0313 10:26:08.455330 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-metrics/0.log" Mar 13 10:26:08 crc kubenswrapper[4841]: I0313 10:26:08.474441 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-frr-files/0.log" Mar 13 10:26:08 crc kubenswrapper[4841]: I0313 10:26:08.502204 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-metrics/0.log" Mar 13 10:26:08 crc kubenswrapper[4841]: I0313 10:26:08.667456 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-reloader/0.log" Mar 13 10:26:08 crc kubenswrapper[4841]: I0313 10:26:08.680979 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/controller/0.log" Mar 13 10:26:08 crc kubenswrapper[4841]: I0313 10:26:08.688313 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-metrics/0.log" Mar 13 10:26:08 crc kubenswrapper[4841]: I0313 10:26:08.705615 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/cp-frr-files/0.log" Mar 13 10:26:08 crc kubenswrapper[4841]: I0313 10:26:08.886928 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/kube-rbac-proxy/0.log" Mar 13 10:26:08 crc kubenswrapper[4841]: I0313 10:26:08.887713 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/frr-metrics/0.log" Mar 13 10:26:08 crc kubenswrapper[4841]: I0313 10:26:08.944007 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/kube-rbac-proxy-frr/0.log" Mar 13 10:26:09 crc kubenswrapper[4841]: I0313 10:26:09.202034 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-qnhkg_1a96f06d-396f-44a0-a357-f8b615676b3f/frr-k8s-webhook-server/0.log" Mar 13 10:26:09 crc kubenswrapper[4841]: I0313 10:26:09.317718 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/reloader/0.log" Mar 13 10:26:09 crc kubenswrapper[4841]: I0313 10:26:09.426950 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556626-525sx" Mar 13 10:26:09 crc kubenswrapper[4841]: I0313 10:26:09.519092 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6d6c4d5946-gtbzk_683811db-740f-4604-b93b-c8134590a46a/manager/0.log" Mar 13 10:26:09 crc kubenswrapper[4841]: I0313 10:26:09.526131 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtbdd\" (UniqueName: \"kubernetes.io/projected/0869bd1f-0261-4f36-8bb2-de2338d1d6fc-kube-api-access-dtbdd\") pod \"0869bd1f-0261-4f36-8bb2-de2338d1d6fc\" (UID: \"0869bd1f-0261-4f36-8bb2-de2338d1d6fc\") " Mar 13 10:26:09 crc kubenswrapper[4841]: I0313 10:26:09.533189 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0869bd1f-0261-4f36-8bb2-de2338d1d6fc-kube-api-access-dtbdd" (OuterVolumeSpecName: "kube-api-access-dtbdd") pod "0869bd1f-0261-4f36-8bb2-de2338d1d6fc" (UID: "0869bd1f-0261-4f36-8bb2-de2338d1d6fc"). 
InnerVolumeSpecName "kube-api-access-dtbdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:26:09 crc kubenswrapper[4841]: I0313 10:26:09.628192 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtbdd\" (UniqueName: \"kubernetes.io/projected/0869bd1f-0261-4f36-8bb2-de2338d1d6fc-kube-api-access-dtbdd\") on node \"crc\" DevicePath \"\"" Mar 13 10:26:09 crc kubenswrapper[4841]: I0313 10:26:09.676648 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-744cf67d4f-ldddk_2735aa21-2a11-4909-988a-f2add6dae771/webhook-server/0.log" Mar 13 10:26:09 crc kubenswrapper[4841]: I0313 10:26:09.821919 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lpmg6_62891ab6-67e5-4c9e-83b6-aec814f74ca6/kube-rbac-proxy/0.log" Mar 13 10:26:10 crc kubenswrapper[4841]: I0313 10:26:10.061447 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556626-525sx" event={"ID":"0869bd1f-0261-4f36-8bb2-de2338d1d6fc","Type":"ContainerDied","Data":"e388d37d047bd8c67c643ca53b71459ab274ed8ab8b00c6b55ca0126aac3c1da"} Mar 13 10:26:10 crc kubenswrapper[4841]: I0313 10:26:10.061788 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e388d37d047bd8c67c643ca53b71459ab274ed8ab8b00c6b55ca0126aac3c1da" Mar 13 10:26:10 crc kubenswrapper[4841]: I0313 10:26:10.061553 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556626-525sx" Mar 13 10:26:10 crc kubenswrapper[4841]: I0313 10:26:10.431865 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lpmg6_62891ab6-67e5-4c9e-83b6-aec814f74ca6/speaker/0.log" Mar 13 10:26:10 crc kubenswrapper[4841]: I0313 10:26:10.510954 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556620-g729x"] Mar 13 10:26:10 crc kubenswrapper[4841]: I0313 10:26:10.522795 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556620-g729x"] Mar 13 10:26:10 crc kubenswrapper[4841]: I0313 10:26:10.817821 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lpxn6_8a43b722-1514-4a29-8935-2f1444488222/frr/0.log" Mar 13 10:26:10 crc kubenswrapper[4841]: I0313 10:26:10.995049 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:26:10 crc kubenswrapper[4841]: E0313 10:26:10.995433 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:26:12 crc kubenswrapper[4841]: I0313 10:26:12.006090 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5eaa2c9-4a0a-4a53-93d3-cfabd8039cd3" path="/var/lib/kubelet/pods/a5eaa2c9-4a0a-4a53-93d3-cfabd8039cd3/volumes" Mar 13 10:26:22 crc kubenswrapper[4841]: I0313 10:26:22.995486 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:26:22 crc kubenswrapper[4841]: E0313 
10:26:22.996318 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:26:23 crc kubenswrapper[4841]: I0313 10:26:23.613473 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc_b11048a2-12d1-437e-80b5-05e10ccc4b50/util/0.log" Mar 13 10:26:23 crc kubenswrapper[4841]: I0313 10:26:23.806074 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc_b11048a2-12d1-437e-80b5-05e10ccc4b50/pull/0.log" Mar 13 10:26:23 crc kubenswrapper[4841]: I0313 10:26:23.825785 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc_b11048a2-12d1-437e-80b5-05e10ccc4b50/util/0.log" Mar 13 10:26:23 crc kubenswrapper[4841]: I0313 10:26:23.846744 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc_b11048a2-12d1-437e-80b5-05e10ccc4b50/pull/0.log" Mar 13 10:26:24 crc kubenswrapper[4841]: I0313 10:26:24.031922 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc_b11048a2-12d1-437e-80b5-05e10ccc4b50/util/0.log" Mar 13 10:26:24 crc kubenswrapper[4841]: I0313 10:26:24.032236 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc_b11048a2-12d1-437e-80b5-05e10ccc4b50/extract/0.log" Mar 13 10:26:24 crc kubenswrapper[4841]: I0313 10:26:24.047364 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8742v2dc_b11048a2-12d1-437e-80b5-05e10ccc4b50/pull/0.log" Mar 13 10:26:24 crc kubenswrapper[4841]: I0313 10:26:24.174207 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s_8b79260a-a276-45fa-abfd-5d471f82142a/util/0.log" Mar 13 10:26:24 crc kubenswrapper[4841]: I0313 10:26:24.358467 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s_8b79260a-a276-45fa-abfd-5d471f82142a/util/0.log" Mar 13 10:26:24 crc kubenswrapper[4841]: I0313 10:26:24.378648 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s_8b79260a-a276-45fa-abfd-5d471f82142a/pull/0.log" Mar 13 10:26:24 crc kubenswrapper[4841]: I0313 10:26:24.383453 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s_8b79260a-a276-45fa-abfd-5d471f82142a/pull/0.log" Mar 13 10:26:24 crc kubenswrapper[4841]: I0313 10:26:24.547938 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s_8b79260a-a276-45fa-abfd-5d471f82142a/pull/0.log" Mar 13 10:26:24 crc kubenswrapper[4841]: I0313 10:26:24.553735 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s_8b79260a-a276-45fa-abfd-5d471f82142a/util/0.log" Mar 13 
10:26:24 crc kubenswrapper[4841]: I0313 10:26:24.572626 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1p2h6s_8b79260a-a276-45fa-abfd-5d471f82142a/extract/0.log" Mar 13 10:26:24 crc kubenswrapper[4841]: I0313 10:26:24.702018 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6_646b8b76-d9eb-4e25-bbe5-b6d42b9f0961/util/0.log" Mar 13 10:26:24 crc kubenswrapper[4841]: I0313 10:26:24.891080 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6_646b8b76-d9eb-4e25-bbe5-b6d42b9f0961/pull/0.log" Mar 13 10:26:24 crc kubenswrapper[4841]: I0313 10:26:24.908883 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6_646b8b76-d9eb-4e25-bbe5-b6d42b9f0961/util/0.log" Mar 13 10:26:24 crc kubenswrapper[4841]: I0313 10:26:24.943528 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6_646b8b76-d9eb-4e25-bbe5-b6d42b9f0961/pull/0.log" Mar 13 10:26:25 crc kubenswrapper[4841]: I0313 10:26:25.118086 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6_646b8b76-d9eb-4e25-bbe5-b6d42b9f0961/pull/0.log" Mar 13 10:26:25 crc kubenswrapper[4841]: I0313 10:26:25.132777 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6_646b8b76-d9eb-4e25-bbe5-b6d42b9f0961/util/0.log" Mar 13 10:26:25 crc kubenswrapper[4841]: I0313 10:26:25.153015 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f082c2x6_646b8b76-d9eb-4e25-bbe5-b6d42b9f0961/extract/0.log" Mar 13 10:26:25 crc kubenswrapper[4841]: I0313 10:26:25.283501 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-96j6p_44d8c242-fbc8-4c6a-93b1-146498533256/extract-utilities/0.log" Mar 13 10:26:25 crc kubenswrapper[4841]: I0313 10:26:25.489048 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-96j6p_44d8c242-fbc8-4c6a-93b1-146498533256/extract-utilities/0.log" Mar 13 10:26:25 crc kubenswrapper[4841]: I0313 10:26:25.505530 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-96j6p_44d8c242-fbc8-4c6a-93b1-146498533256/extract-content/0.log" Mar 13 10:26:25 crc kubenswrapper[4841]: I0313 10:26:25.525368 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-96j6p_44d8c242-fbc8-4c6a-93b1-146498533256/extract-content/0.log" Mar 13 10:26:25 crc kubenswrapper[4841]: I0313 10:26:25.671389 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-96j6p_44d8c242-fbc8-4c6a-93b1-146498533256/extract-utilities/0.log" Mar 13 10:26:25 crc kubenswrapper[4841]: I0313 10:26:25.676045 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-96j6p_44d8c242-fbc8-4c6a-93b1-146498533256/extract-content/0.log" Mar 13 10:26:25 crc kubenswrapper[4841]: I0313 10:26:25.879276 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-szx4f_eb0bb4de-58c3-4d6e-a22d-735e9346f228/extract-utilities/0.log" Mar 13 10:26:26 crc kubenswrapper[4841]: I0313 10:26:26.119807 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-szx4f_eb0bb4de-58c3-4d6e-a22d-735e9346f228/extract-utilities/0.log" Mar 13 10:26:26 crc kubenswrapper[4841]: I0313 10:26:26.218569 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-szx4f_eb0bb4de-58c3-4d6e-a22d-735e9346f228/extract-content/0.log" Mar 13 10:26:26 crc kubenswrapper[4841]: I0313 10:26:26.262928 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-szx4f_eb0bb4de-58c3-4d6e-a22d-735e9346f228/extract-content/0.log" Mar 13 10:26:26 crc kubenswrapper[4841]: I0313 10:26:26.296654 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-96j6p_44d8c242-fbc8-4c6a-93b1-146498533256/registry-server/0.log" Mar 13 10:26:26 crc kubenswrapper[4841]: I0313 10:26:26.358702 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-szx4f_eb0bb4de-58c3-4d6e-a22d-735e9346f228/extract-utilities/0.log" Mar 13 10:26:26 crc kubenswrapper[4841]: I0313 10:26:26.432187 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-szx4f_eb0bb4de-58c3-4d6e-a22d-735e9346f228/extract-content/0.log" Mar 13 10:26:26 crc kubenswrapper[4841]: I0313 10:26:26.633451 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-4n6sl_f4c8d5c8-ff84-4340-9dc3-4c7a675f1a45/marketplace-operator/0.log" Mar 13 10:26:26 crc kubenswrapper[4841]: I0313 10:26:26.774446 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgpqr_6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c/extract-utilities/0.log" Mar 13 10:26:26 crc kubenswrapper[4841]: I0313 10:26:26.808362 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-szx4f_eb0bb4de-58c3-4d6e-a22d-735e9346f228/registry-server/0.log" Mar 13 10:26:26 crc kubenswrapper[4841]: I0313 10:26:26.987532 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgpqr_6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c/extract-utilities/0.log" Mar 13 10:26:27 crc kubenswrapper[4841]: I0313 10:26:27.006587 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgpqr_6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c/extract-content/0.log" Mar 13 10:26:27 crc kubenswrapper[4841]: I0313 10:26:27.026111 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgpqr_6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c/extract-content/0.log" Mar 13 10:26:27 crc kubenswrapper[4841]: I0313 10:26:27.214105 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgpqr_6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c/extract-utilities/0.log" Mar 13 10:26:27 crc kubenswrapper[4841]: I0313 10:26:27.242334 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pqqf_5099aa18-a4a1-40d1-b8c2-dc8a5a26e912/extract-utilities/0.log" Mar 13 10:26:27 crc kubenswrapper[4841]: I0313 10:26:27.290104 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgpqr_6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c/extract-content/0.log" Mar 13 10:26:27 crc kubenswrapper[4841]: I0313 10:26:27.362069 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mgpqr_6a2d33d3-6ede-4c64-a1bd-19cda08f6f9c/registry-server/0.log" Mar 13 10:26:27 crc kubenswrapper[4841]: I0313 10:26:27.678904 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-4pqqf_5099aa18-a4a1-40d1-b8c2-dc8a5a26e912/extract-content/0.log" Mar 13 10:26:27 crc kubenswrapper[4841]: I0313 10:26:27.678983 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pqqf_5099aa18-a4a1-40d1-b8c2-dc8a5a26e912/extract-content/0.log" Mar 13 10:26:27 crc kubenswrapper[4841]: I0313 10:26:27.680165 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pqqf_5099aa18-a4a1-40d1-b8c2-dc8a5a26e912/extract-utilities/0.log" Mar 13 10:26:27 crc kubenswrapper[4841]: I0313 10:26:27.848547 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pqqf_5099aa18-a4a1-40d1-b8c2-dc8a5a26e912/extract-utilities/0.log" Mar 13 10:26:27 crc kubenswrapper[4841]: I0313 10:26:27.874524 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pqqf_5099aa18-a4a1-40d1-b8c2-dc8a5a26e912/extract-content/0.log" Mar 13 10:26:28 crc kubenswrapper[4841]: I0313 10:26:28.536452 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pqqf_5099aa18-a4a1-40d1-b8c2-dc8a5a26e912/registry-server/0.log" Mar 13 10:26:35 crc kubenswrapper[4841]: I0313 10:26:35.995575 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:26:35 crc kubenswrapper[4841]: E0313 10:26:35.996161 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:26:41 crc 
kubenswrapper[4841]: I0313 10:26:41.735617 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5586657968-njcw5_3fde31d7-89e1-4aa5-a848-2b018eae16b1/prometheus-operator-admission-webhook/0.log" Mar 13 10:26:41 crc kubenswrapper[4841]: I0313 10:26:41.780945 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5586657968-646dp_3304bfd0-8191-45c7-8c50-f16e137a6de8/prometheus-operator-admission-webhook/0.log" Mar 13 10:26:41 crc kubenswrapper[4841]: I0313 10:26:41.798394 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-z7wl5_08be6515-b41c-481b-ba89-b939e4cfa067/prometheus-operator/0.log" Mar 13 10:26:41 crc kubenswrapper[4841]: I0313 10:26:41.955420 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-5vnkd_64eb3c86-385d-45d5-8dee-df851d8c3a74/operator/0.log" Mar 13 10:26:41 crc kubenswrapper[4841]: I0313 10:26:41.957488 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-r5b4d_b49125a7-562a-421b-b5eb-126312e6e85d/perses-operator/0.log" Mar 13 10:26:48 crc kubenswrapper[4841]: I0313 10:26:48.013675 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:26:48 crc kubenswrapper[4841]: E0313 10:26:48.014380 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:26:59 crc kubenswrapper[4841]: 
I0313 10:26:59.994747 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:26:59 crc kubenswrapper[4841]: E0313 10:26:59.995601 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:27:07 crc kubenswrapper[4841]: I0313 10:27:07.571603 4841 scope.go:117] "RemoveContainer" containerID="5f755d7dae9fc35c478ecf31122c16c56fe2f6447f72fea0463d1ca63d232141" Mar 13 10:27:14 crc kubenswrapper[4841]: I0313 10:27:14.995717 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:27:14 crc kubenswrapper[4841]: E0313 10:27:14.996504 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:27:28 crc kubenswrapper[4841]: I0313 10:27:28.007650 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:27:28 crc kubenswrapper[4841]: E0313 10:27:28.008444 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:27:39 crc kubenswrapper[4841]: I0313 10:27:39.994753 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:27:39 crc kubenswrapper[4841]: E0313 10:27:39.995486 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:27:51 crc kubenswrapper[4841]: I0313 10:27:51.995080 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:27:51 crc kubenswrapper[4841]: E0313 10:27:51.995962 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:28:00 crc kubenswrapper[4841]: I0313 10:28:00.161433 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556628-ddrkg"] Mar 13 10:28:00 crc kubenswrapper[4841]: E0313 10:28:00.162529 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0869bd1f-0261-4f36-8bb2-de2338d1d6fc" containerName="oc" Mar 13 10:28:00 crc kubenswrapper[4841]: I0313 10:28:00.162545 4841 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0869bd1f-0261-4f36-8bb2-de2338d1d6fc" containerName="oc" Mar 13 10:28:00 crc kubenswrapper[4841]: I0313 10:28:00.162781 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0869bd1f-0261-4f36-8bb2-de2338d1d6fc" containerName="oc" Mar 13 10:28:00 crc kubenswrapper[4841]: I0313 10:28:00.163662 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556628-ddrkg" Mar 13 10:28:00 crc kubenswrapper[4841]: I0313 10:28:00.166068 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 10:28:00 crc kubenswrapper[4841]: I0313 10:28:00.166347 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 10:28:00 crc kubenswrapper[4841]: I0313 10:28:00.173851 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556628-ddrkg"] Mar 13 10:28:00 crc kubenswrapper[4841]: I0313 10:28:00.176812 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 10:28:00 crc kubenswrapper[4841]: I0313 10:28:00.273153 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvpk5\" (UniqueName: \"kubernetes.io/projected/f29a8efb-ff9a-4db7-b7db-0b18d3c46533-kube-api-access-pvpk5\") pod \"auto-csr-approver-29556628-ddrkg\" (UID: \"f29a8efb-ff9a-4db7-b7db-0b18d3c46533\") " pod="openshift-infra/auto-csr-approver-29556628-ddrkg" Mar 13 10:28:00 crc kubenswrapper[4841]: I0313 10:28:00.375550 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvpk5\" (UniqueName: \"kubernetes.io/projected/f29a8efb-ff9a-4db7-b7db-0b18d3c46533-kube-api-access-pvpk5\") pod \"auto-csr-approver-29556628-ddrkg\" (UID: \"f29a8efb-ff9a-4db7-b7db-0b18d3c46533\") " 
pod="openshift-infra/auto-csr-approver-29556628-ddrkg" Mar 13 10:28:00 crc kubenswrapper[4841]: I0313 10:28:00.409195 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvpk5\" (UniqueName: \"kubernetes.io/projected/f29a8efb-ff9a-4db7-b7db-0b18d3c46533-kube-api-access-pvpk5\") pod \"auto-csr-approver-29556628-ddrkg\" (UID: \"f29a8efb-ff9a-4db7-b7db-0b18d3c46533\") " pod="openshift-infra/auto-csr-approver-29556628-ddrkg" Mar 13 10:28:00 crc kubenswrapper[4841]: I0313 10:28:00.488674 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556628-ddrkg" Mar 13 10:28:00 crc kubenswrapper[4841]: I0313 10:28:00.955801 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556628-ddrkg"] Mar 13 10:28:01 crc kubenswrapper[4841]: I0313 10:28:01.054655 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556628-ddrkg" event={"ID":"f29a8efb-ff9a-4db7-b7db-0b18d3c46533","Type":"ContainerStarted","Data":"598743f6b926d2205c128c2b9455daf1384c49284a72fae363a5c4f23da932d6"} Mar 13 10:28:03 crc kubenswrapper[4841]: I0313 10:28:03.072792 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556628-ddrkg" event={"ID":"f29a8efb-ff9a-4db7-b7db-0b18d3c46533","Type":"ContainerStarted","Data":"708445476e76ff8a6a84f18e6ca879a523d18db5e8a4f9efca77a7176af8c4cf"} Mar 13 10:28:03 crc kubenswrapper[4841]: I0313 10:28:03.287533 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556628-ddrkg" podStartSLOduration=2.298389035 podStartE2EDuration="3.287511921s" podCreationTimestamp="2026-03-13 10:28:00 +0000 UTC" firstStartedPulling="2026-03-13 10:28:00.963884196 +0000 UTC m=+4563.693784387" lastFinishedPulling="2026-03-13 10:28:01.953007082 +0000 UTC m=+4564.682907273" observedRunningTime="2026-03-13 
10:28:03.283642929 +0000 UTC m=+4566.013543140" watchObservedRunningTime="2026-03-13 10:28:03.287511921 +0000 UTC m=+4566.017412112" Mar 13 10:28:04 crc kubenswrapper[4841]: I0313 10:28:04.083189 4841 generic.go:334] "Generic (PLEG): container finished" podID="f29a8efb-ff9a-4db7-b7db-0b18d3c46533" containerID="708445476e76ff8a6a84f18e6ca879a523d18db5e8a4f9efca77a7176af8c4cf" exitCode=0 Mar 13 10:28:04 crc kubenswrapper[4841]: I0313 10:28:04.083236 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556628-ddrkg" event={"ID":"f29a8efb-ff9a-4db7-b7db-0b18d3c46533","Type":"ContainerDied","Data":"708445476e76ff8a6a84f18e6ca879a523d18db5e8a4f9efca77a7176af8c4cf"} Mar 13 10:28:04 crc kubenswrapper[4841]: I0313 10:28:04.995909 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:28:04 crc kubenswrapper[4841]: E0313 10:28:04.996568 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:28:05 crc kubenswrapper[4841]: I0313 10:28:05.473063 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556628-ddrkg" Mar 13 10:28:05 crc kubenswrapper[4841]: I0313 10:28:05.634839 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvpk5\" (UniqueName: \"kubernetes.io/projected/f29a8efb-ff9a-4db7-b7db-0b18d3c46533-kube-api-access-pvpk5\") pod \"f29a8efb-ff9a-4db7-b7db-0b18d3c46533\" (UID: \"f29a8efb-ff9a-4db7-b7db-0b18d3c46533\") " Mar 13 10:28:05 crc kubenswrapper[4841]: I0313 10:28:05.640968 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f29a8efb-ff9a-4db7-b7db-0b18d3c46533-kube-api-access-pvpk5" (OuterVolumeSpecName: "kube-api-access-pvpk5") pod "f29a8efb-ff9a-4db7-b7db-0b18d3c46533" (UID: "f29a8efb-ff9a-4db7-b7db-0b18d3c46533"). InnerVolumeSpecName "kube-api-access-pvpk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:28:05 crc kubenswrapper[4841]: I0313 10:28:05.737447 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvpk5\" (UniqueName: \"kubernetes.io/projected/f29a8efb-ff9a-4db7-b7db-0b18d3c46533-kube-api-access-pvpk5\") on node \"crc\" DevicePath \"\"" Mar 13 10:28:06 crc kubenswrapper[4841]: I0313 10:28:06.136836 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556628-ddrkg" event={"ID":"f29a8efb-ff9a-4db7-b7db-0b18d3c46533","Type":"ContainerDied","Data":"598743f6b926d2205c128c2b9455daf1384c49284a72fae363a5c4f23da932d6"} Mar 13 10:28:06 crc kubenswrapper[4841]: I0313 10:28:06.136872 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="598743f6b926d2205c128c2b9455daf1384c49284a72fae363a5c4f23da932d6" Mar 13 10:28:06 crc kubenswrapper[4841]: I0313 10:28:06.136875 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556628-ddrkg" Mar 13 10:28:06 crc kubenswrapper[4841]: I0313 10:28:06.562561 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556622-8vj8d"] Mar 13 10:28:06 crc kubenswrapper[4841]: I0313 10:28:06.572080 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556622-8vj8d"] Mar 13 10:28:08 crc kubenswrapper[4841]: I0313 10:28:08.014204 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0ee97d0-aee2-40fc-96a7-d022734397b2" path="/var/lib/kubelet/pods/d0ee97d0-aee2-40fc-96a7-d022734397b2/volumes" Mar 13 10:28:18 crc kubenswrapper[4841]: I0313 10:28:18.997398 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:28:18 crc kubenswrapper[4841]: E0313 10:28:18.998107 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:28:22 crc kubenswrapper[4841]: I0313 10:28:22.293099 4841 generic.go:334] "Generic (PLEG): container finished" podID="2540fffd-8f6f-459e-a116-ce2e2c095448" containerID="8e5dc5a9153b3abefa2b8e46fba2306dc3c0f0825dc8ba90730a784b3ff8c265" exitCode=0 Mar 13 10:28:22 crc kubenswrapper[4841]: I0313 10:28:22.293192 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9mxpq/must-gather-s59mc" event={"ID":"2540fffd-8f6f-459e-a116-ce2e2c095448","Type":"ContainerDied","Data":"8e5dc5a9153b3abefa2b8e46fba2306dc3c0f0825dc8ba90730a784b3ff8c265"} Mar 13 10:28:22 crc kubenswrapper[4841]: I0313 10:28:22.294712 4841 
scope.go:117] "RemoveContainer" containerID="8e5dc5a9153b3abefa2b8e46fba2306dc3c0f0825dc8ba90730a784b3ff8c265" Mar 13 10:28:22 crc kubenswrapper[4841]: I0313 10:28:22.684197 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9mxpq_must-gather-s59mc_2540fffd-8f6f-459e-a116-ce2e2c095448/gather/0.log" Mar 13 10:28:29 crc kubenswrapper[4841]: I0313 10:28:29.994668 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:28:29 crc kubenswrapper[4841]: E0313 10:28:29.995595 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:28:34 crc kubenswrapper[4841]: I0313 10:28:34.778653 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9mxpq/must-gather-s59mc"] Mar 13 10:28:34 crc kubenswrapper[4841]: I0313 10:28:34.779455 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-9mxpq/must-gather-s59mc" podUID="2540fffd-8f6f-459e-a116-ce2e2c095448" containerName="copy" containerID="cri-o://5f0def84d86f11057b964adcd9ae4195af218b44c28ab2a99fba345cac48660d" gracePeriod=2 Mar 13 10:28:34 crc kubenswrapper[4841]: I0313 10:28:34.789278 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9mxpq/must-gather-s59mc"] Mar 13 10:28:35 crc kubenswrapper[4841]: I0313 10:28:35.150849 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9mxpq_must-gather-s59mc_2540fffd-8f6f-459e-a116-ce2e2c095448/copy/0.log" Mar 13 10:28:35 crc kubenswrapper[4841]: I0313 10:28:35.151731 4841 
generic.go:334] "Generic (PLEG): container finished" podID="2540fffd-8f6f-459e-a116-ce2e2c095448" containerID="5f0def84d86f11057b964adcd9ae4195af218b44c28ab2a99fba345cac48660d" exitCode=143 Mar 13 10:28:35 crc kubenswrapper[4841]: I0313 10:28:35.251195 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9mxpq_must-gather-s59mc_2540fffd-8f6f-459e-a116-ce2e2c095448/copy/0.log" Mar 13 10:28:35 crc kubenswrapper[4841]: I0313 10:28:35.251562 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9mxpq/must-gather-s59mc" Mar 13 10:28:35 crc kubenswrapper[4841]: I0313 10:28:35.351131 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psxq9\" (UniqueName: \"kubernetes.io/projected/2540fffd-8f6f-459e-a116-ce2e2c095448-kube-api-access-psxq9\") pod \"2540fffd-8f6f-459e-a116-ce2e2c095448\" (UID: \"2540fffd-8f6f-459e-a116-ce2e2c095448\") " Mar 13 10:28:35 crc kubenswrapper[4841]: I0313 10:28:35.351286 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2540fffd-8f6f-459e-a116-ce2e2c095448-must-gather-output\") pod \"2540fffd-8f6f-459e-a116-ce2e2c095448\" (UID: \"2540fffd-8f6f-459e-a116-ce2e2c095448\") " Mar 13 10:28:35 crc kubenswrapper[4841]: I0313 10:28:35.357342 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2540fffd-8f6f-459e-a116-ce2e2c095448-kube-api-access-psxq9" (OuterVolumeSpecName: "kube-api-access-psxq9") pod "2540fffd-8f6f-459e-a116-ce2e2c095448" (UID: "2540fffd-8f6f-459e-a116-ce2e2c095448"). InnerVolumeSpecName "kube-api-access-psxq9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:28:35 crc kubenswrapper[4841]: I0313 10:28:35.453508 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psxq9\" (UniqueName: \"kubernetes.io/projected/2540fffd-8f6f-459e-a116-ce2e2c095448-kube-api-access-psxq9\") on node \"crc\" DevicePath \"\"" Mar 13 10:28:35 crc kubenswrapper[4841]: I0313 10:28:35.519314 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2540fffd-8f6f-459e-a116-ce2e2c095448-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2540fffd-8f6f-459e-a116-ce2e2c095448" (UID: "2540fffd-8f6f-459e-a116-ce2e2c095448"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:28:35 crc kubenswrapper[4841]: I0313 10:28:35.554846 4841 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2540fffd-8f6f-459e-a116-ce2e2c095448-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 13 10:28:36 crc kubenswrapper[4841]: I0313 10:28:36.005963 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2540fffd-8f6f-459e-a116-ce2e2c095448" path="/var/lib/kubelet/pods/2540fffd-8f6f-459e-a116-ce2e2c095448/volumes" Mar 13 10:28:36 crc kubenswrapper[4841]: I0313 10:28:36.168357 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9mxpq_must-gather-s59mc_2540fffd-8f6f-459e-a116-ce2e2c095448/copy/0.log" Mar 13 10:28:36 crc kubenswrapper[4841]: I0313 10:28:36.168791 4841 scope.go:117] "RemoveContainer" containerID="5f0def84d86f11057b964adcd9ae4195af218b44c28ab2a99fba345cac48660d" Mar 13 10:28:36 crc kubenswrapper[4841]: I0313 10:28:36.169028 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9mxpq/must-gather-s59mc" Mar 13 10:28:36 crc kubenswrapper[4841]: I0313 10:28:36.190864 4841 scope.go:117] "RemoveContainer" containerID="8e5dc5a9153b3abefa2b8e46fba2306dc3c0f0825dc8ba90730a784b3ff8c265" Mar 13 10:28:42 crc kubenswrapper[4841]: I0313 10:28:42.995345 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:28:42 crc kubenswrapper[4841]: E0313 10:28:42.996796 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:28:53 crc kubenswrapper[4841]: I0313 10:28:53.997073 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:28:53 crc kubenswrapper[4841]: E0313 10:28:53.997792 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:29:07 crc kubenswrapper[4841]: I0313 10:29:07.676563 4841 scope.go:117] "RemoveContainer" containerID="11f3aff671610bb424282d1b8a02085796b950d11c01749889bce84622324af9" Mar 13 10:29:07 crc kubenswrapper[4841]: I0313 10:29:07.700639 4841 scope.go:117] "RemoveContainer" containerID="864c8629ae70baf711d32146407ec5996c1ef6a331d65ed00083b44609ca35de" Mar 13 10:29:08 crc kubenswrapper[4841]: 
I0313 10:29:08.995586 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:29:08 crc kubenswrapper[4841]: E0313 10:29:08.996093 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:29:23 crc kubenswrapper[4841]: I0313 10:29:23.258397 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rrv7t"] Mar 13 10:29:23 crc kubenswrapper[4841]: E0313 10:29:23.260159 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2540fffd-8f6f-459e-a116-ce2e2c095448" containerName="copy" Mar 13 10:29:23 crc kubenswrapper[4841]: I0313 10:29:23.260183 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="2540fffd-8f6f-459e-a116-ce2e2c095448" containerName="copy" Mar 13 10:29:23 crc kubenswrapper[4841]: E0313 10:29:23.260214 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2540fffd-8f6f-459e-a116-ce2e2c095448" containerName="gather" Mar 13 10:29:23 crc kubenswrapper[4841]: I0313 10:29:23.260223 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="2540fffd-8f6f-459e-a116-ce2e2c095448" containerName="gather" Mar 13 10:29:23 crc kubenswrapper[4841]: E0313 10:29:23.260284 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f29a8efb-ff9a-4db7-b7db-0b18d3c46533" containerName="oc" Mar 13 10:29:23 crc kubenswrapper[4841]: I0313 10:29:23.260295 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f29a8efb-ff9a-4db7-b7db-0b18d3c46533" containerName="oc" Mar 13 10:29:23 crc kubenswrapper[4841]: I0313 10:29:23.260590 4841 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2540fffd-8f6f-459e-a116-ce2e2c095448" containerName="gather" Mar 13 10:29:23 crc kubenswrapper[4841]: I0313 10:29:23.260625 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="2540fffd-8f6f-459e-a116-ce2e2c095448" containerName="copy" Mar 13 10:29:23 crc kubenswrapper[4841]: I0313 10:29:23.260641 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f29a8efb-ff9a-4db7-b7db-0b18d3c46533" containerName="oc" Mar 13 10:29:23 crc kubenswrapper[4841]: I0313 10:29:23.265701 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rrv7t" Mar 13 10:29:23 crc kubenswrapper[4841]: I0313 10:29:23.276280 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rrv7t"] Mar 13 10:29:23 crc kubenswrapper[4841]: I0313 10:29:23.361314 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk9fz\" (UniqueName: \"kubernetes.io/projected/a8c8a4ca-c69a-4050-b880-e8fd08c61866-kube-api-access-tk9fz\") pod \"community-operators-rrv7t\" (UID: \"a8c8a4ca-c69a-4050-b880-e8fd08c61866\") " pod="openshift-marketplace/community-operators-rrv7t" Mar 13 10:29:23 crc kubenswrapper[4841]: I0313 10:29:23.361387 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8c8a4ca-c69a-4050-b880-e8fd08c61866-catalog-content\") pod \"community-operators-rrv7t\" (UID: \"a8c8a4ca-c69a-4050-b880-e8fd08c61866\") " pod="openshift-marketplace/community-operators-rrv7t" Mar 13 10:29:23 crc kubenswrapper[4841]: I0313 10:29:23.361837 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8c8a4ca-c69a-4050-b880-e8fd08c61866-utilities\") pod 
\"community-operators-rrv7t\" (UID: \"a8c8a4ca-c69a-4050-b880-e8fd08c61866\") " pod="openshift-marketplace/community-operators-rrv7t" Mar 13 10:29:23 crc kubenswrapper[4841]: I0313 10:29:23.463865 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8c8a4ca-c69a-4050-b880-e8fd08c61866-catalog-content\") pod \"community-operators-rrv7t\" (UID: \"a8c8a4ca-c69a-4050-b880-e8fd08c61866\") " pod="openshift-marketplace/community-operators-rrv7t" Mar 13 10:29:23 crc kubenswrapper[4841]: I0313 10:29:23.464039 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8c8a4ca-c69a-4050-b880-e8fd08c61866-utilities\") pod \"community-operators-rrv7t\" (UID: \"a8c8a4ca-c69a-4050-b880-e8fd08c61866\") " pod="openshift-marketplace/community-operators-rrv7t" Mar 13 10:29:23 crc kubenswrapper[4841]: I0313 10:29:23.464177 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk9fz\" (UniqueName: \"kubernetes.io/projected/a8c8a4ca-c69a-4050-b880-e8fd08c61866-kube-api-access-tk9fz\") pod \"community-operators-rrv7t\" (UID: \"a8c8a4ca-c69a-4050-b880-e8fd08c61866\") " pod="openshift-marketplace/community-operators-rrv7t" Mar 13 10:29:23 crc kubenswrapper[4841]: I0313 10:29:23.464435 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8c8a4ca-c69a-4050-b880-e8fd08c61866-catalog-content\") pod \"community-operators-rrv7t\" (UID: \"a8c8a4ca-c69a-4050-b880-e8fd08c61866\") " pod="openshift-marketplace/community-operators-rrv7t" Mar 13 10:29:23 crc kubenswrapper[4841]: I0313 10:29:23.464527 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8c8a4ca-c69a-4050-b880-e8fd08c61866-utilities\") pod \"community-operators-rrv7t\" (UID: 
\"a8c8a4ca-c69a-4050-b880-e8fd08c61866\") " pod="openshift-marketplace/community-operators-rrv7t" Mar 13 10:29:23 crc kubenswrapper[4841]: I0313 10:29:23.485188 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk9fz\" (UniqueName: \"kubernetes.io/projected/a8c8a4ca-c69a-4050-b880-e8fd08c61866-kube-api-access-tk9fz\") pod \"community-operators-rrv7t\" (UID: \"a8c8a4ca-c69a-4050-b880-e8fd08c61866\") " pod="openshift-marketplace/community-operators-rrv7t" Mar 13 10:29:23 crc kubenswrapper[4841]: I0313 10:29:23.586488 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rrv7t" Mar 13 10:29:23 crc kubenswrapper[4841]: I0313 10:29:23.996851 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:29:23 crc kubenswrapper[4841]: E0313 10:29:23.997423 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-h227v_openshift-machine-config-operator(e49b836b-f6cf-4cee-b1be-6bd7864fb7f2)\"" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" Mar 13 10:29:24 crc kubenswrapper[4841]: I0313 10:29:24.107945 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rrv7t"] Mar 13 10:29:24 crc kubenswrapper[4841]: W0313 10:29:24.111820 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8c8a4ca_c69a_4050_b880_e8fd08c61866.slice/crio-90957bf088ecadc50e80cdc74aad6005706a84b89762f03b94eb65fe3d2ed020 WatchSource:0}: Error finding container 90957bf088ecadc50e80cdc74aad6005706a84b89762f03b94eb65fe3d2ed020: Status 404 returned error can't find the 
container with id 90957bf088ecadc50e80cdc74aad6005706a84b89762f03b94eb65fe3d2ed020 Mar 13 10:29:24 crc kubenswrapper[4841]: I0313 10:29:24.978808 4841 generic.go:334] "Generic (PLEG): container finished" podID="a8c8a4ca-c69a-4050-b880-e8fd08c61866" containerID="5aca3c8417be874abc1f2315a277aa01294ae0ed0124a89c74daa7fb09eed57f" exitCode=0 Mar 13 10:29:24 crc kubenswrapper[4841]: I0313 10:29:24.978926 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrv7t" event={"ID":"a8c8a4ca-c69a-4050-b880-e8fd08c61866","Type":"ContainerDied","Data":"5aca3c8417be874abc1f2315a277aa01294ae0ed0124a89c74daa7fb09eed57f"} Mar 13 10:29:24 crc kubenswrapper[4841]: I0313 10:29:24.979092 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrv7t" event={"ID":"a8c8a4ca-c69a-4050-b880-e8fd08c61866","Type":"ContainerStarted","Data":"90957bf088ecadc50e80cdc74aad6005706a84b89762f03b94eb65fe3d2ed020"} Mar 13 10:29:24 crc kubenswrapper[4841]: I0313 10:29:24.981022 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 10:29:25 crc kubenswrapper[4841]: I0313 10:29:25.862082 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rfl2f"] Mar 13 10:29:25 crc kubenswrapper[4841]: I0313 10:29:25.865169 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rfl2f" Mar 13 10:29:25 crc kubenswrapper[4841]: I0313 10:29:25.908976 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rfl2f"] Mar 13 10:29:25 crc kubenswrapper[4841]: I0313 10:29:25.919827 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d3f707-5363-4006-927c-a1317b59914f-utilities\") pod \"redhat-operators-rfl2f\" (UID: \"55d3f707-5363-4006-927c-a1317b59914f\") " pod="openshift-marketplace/redhat-operators-rfl2f" Mar 13 10:29:25 crc kubenswrapper[4841]: I0313 10:29:25.919934 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9w5b\" (UniqueName: \"kubernetes.io/projected/55d3f707-5363-4006-927c-a1317b59914f-kube-api-access-h9w5b\") pod \"redhat-operators-rfl2f\" (UID: \"55d3f707-5363-4006-927c-a1317b59914f\") " pod="openshift-marketplace/redhat-operators-rfl2f" Mar 13 10:29:25 crc kubenswrapper[4841]: I0313 10:29:25.920257 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d3f707-5363-4006-927c-a1317b59914f-catalog-content\") pod \"redhat-operators-rfl2f\" (UID: \"55d3f707-5363-4006-927c-a1317b59914f\") " pod="openshift-marketplace/redhat-operators-rfl2f" Mar 13 10:29:26 crc kubenswrapper[4841]: I0313 10:29:26.021853 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d3f707-5363-4006-927c-a1317b59914f-utilities\") pod \"redhat-operators-rfl2f\" (UID: \"55d3f707-5363-4006-927c-a1317b59914f\") " pod="openshift-marketplace/redhat-operators-rfl2f" Mar 13 10:29:26 crc kubenswrapper[4841]: I0313 10:29:26.021936 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-h9w5b\" (UniqueName: \"kubernetes.io/projected/55d3f707-5363-4006-927c-a1317b59914f-kube-api-access-h9w5b\") pod \"redhat-operators-rfl2f\" (UID: \"55d3f707-5363-4006-927c-a1317b59914f\") " pod="openshift-marketplace/redhat-operators-rfl2f" Mar 13 10:29:26 crc kubenswrapper[4841]: I0313 10:29:26.022110 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d3f707-5363-4006-927c-a1317b59914f-catalog-content\") pod \"redhat-operators-rfl2f\" (UID: \"55d3f707-5363-4006-927c-a1317b59914f\") " pod="openshift-marketplace/redhat-operators-rfl2f" Mar 13 10:29:26 crc kubenswrapper[4841]: I0313 10:29:26.022638 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d3f707-5363-4006-927c-a1317b59914f-catalog-content\") pod \"redhat-operators-rfl2f\" (UID: \"55d3f707-5363-4006-927c-a1317b59914f\") " pod="openshift-marketplace/redhat-operators-rfl2f" Mar 13 10:29:26 crc kubenswrapper[4841]: I0313 10:29:26.022701 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d3f707-5363-4006-927c-a1317b59914f-utilities\") pod \"redhat-operators-rfl2f\" (UID: \"55d3f707-5363-4006-927c-a1317b59914f\") " pod="openshift-marketplace/redhat-operators-rfl2f" Mar 13 10:29:26 crc kubenswrapper[4841]: I0313 10:29:26.047577 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9w5b\" (UniqueName: \"kubernetes.io/projected/55d3f707-5363-4006-927c-a1317b59914f-kube-api-access-h9w5b\") pod \"redhat-operators-rfl2f\" (UID: \"55d3f707-5363-4006-927c-a1317b59914f\") " pod="openshift-marketplace/redhat-operators-rfl2f" Mar 13 10:29:26 crc kubenswrapper[4841]: I0313 10:29:26.202330 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rfl2f" Mar 13 10:29:26 crc kubenswrapper[4841]: W0313 10:29:26.500600 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55d3f707_5363_4006_927c_a1317b59914f.slice/crio-c4e7b7111a806aa843402b2886e09a7b4aa835bb7ee7be0de0d5e2db43333d0c WatchSource:0}: Error finding container c4e7b7111a806aa843402b2886e09a7b4aa835bb7ee7be0de0d5e2db43333d0c: Status 404 returned error can't find the container with id c4e7b7111a806aa843402b2886e09a7b4aa835bb7ee7be0de0d5e2db43333d0c Mar 13 10:29:26 crc kubenswrapper[4841]: I0313 10:29:26.507628 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rfl2f"] Mar 13 10:29:26 crc kubenswrapper[4841]: I0313 10:29:26.998377 4841 generic.go:334] "Generic (PLEG): container finished" podID="55d3f707-5363-4006-927c-a1317b59914f" containerID="c920c2b04dbf15d2e2a07fb0594b1f836ce0b137837d3c7fe6af1ffa39234216" exitCode=0 Mar 13 10:29:26 crc kubenswrapper[4841]: I0313 10:29:26.998418 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rfl2f" event={"ID":"55d3f707-5363-4006-927c-a1317b59914f","Type":"ContainerDied","Data":"c920c2b04dbf15d2e2a07fb0594b1f836ce0b137837d3c7fe6af1ffa39234216"} Mar 13 10:29:26 crc kubenswrapper[4841]: I0313 10:29:26.998441 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rfl2f" event={"ID":"55d3f707-5363-4006-927c-a1317b59914f","Type":"ContainerStarted","Data":"c4e7b7111a806aa843402b2886e09a7b4aa835bb7ee7be0de0d5e2db43333d0c"} Mar 13 10:29:28 crc kubenswrapper[4841]: I0313 10:29:28.024862 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rfl2f" 
event={"ID":"55d3f707-5363-4006-927c-a1317b59914f","Type":"ContainerStarted","Data":"dcdf33058480ad2dac7f91863c10a575eac1b141671e36a068bdd16f9d016b28"} Mar 13 10:29:31 crc kubenswrapper[4841]: I0313 10:29:31.067404 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrv7t" event={"ID":"a8c8a4ca-c69a-4050-b880-e8fd08c61866","Type":"ContainerStarted","Data":"f6f02812b22ba28f409639c83beacc9af83ac4e9b14dc398aeeb64c39445f7eb"} Mar 13 10:29:33 crc kubenswrapper[4841]: I0313 10:29:33.086776 4841 generic.go:334] "Generic (PLEG): container finished" podID="a8c8a4ca-c69a-4050-b880-e8fd08c61866" containerID="f6f02812b22ba28f409639c83beacc9af83ac4e9b14dc398aeeb64c39445f7eb" exitCode=0 Mar 13 10:29:33 crc kubenswrapper[4841]: I0313 10:29:33.086923 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrv7t" event={"ID":"a8c8a4ca-c69a-4050-b880-e8fd08c61866","Type":"ContainerDied","Data":"f6f02812b22ba28f409639c83beacc9af83ac4e9b14dc398aeeb64c39445f7eb"} Mar 13 10:29:34 crc kubenswrapper[4841]: I0313 10:29:34.103367 4841 generic.go:334] "Generic (PLEG): container finished" podID="55d3f707-5363-4006-927c-a1317b59914f" containerID="dcdf33058480ad2dac7f91863c10a575eac1b141671e36a068bdd16f9d016b28" exitCode=0 Mar 13 10:29:34 crc kubenswrapper[4841]: I0313 10:29:34.103460 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rfl2f" event={"ID":"55d3f707-5363-4006-927c-a1317b59914f","Type":"ContainerDied","Data":"dcdf33058480ad2dac7f91863c10a575eac1b141671e36a068bdd16f9d016b28"} Mar 13 10:29:34 crc kubenswrapper[4841]: I0313 10:29:34.106613 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rrv7t" event={"ID":"a8c8a4ca-c69a-4050-b880-e8fd08c61866","Type":"ContainerStarted","Data":"e65ffebedb09b811e85c174d6d21119090905be09b86c33ee07fb32e2a1b491e"} Mar 13 10:29:34 crc kubenswrapper[4841]: I0313 
10:29:34.149951 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rrv7t" podStartSLOduration=2.629943356 podStartE2EDuration="11.149933825s" podCreationTimestamp="2026-03-13 10:29:23 +0000 UTC" firstStartedPulling="2026-03-13 10:29:24.980729411 +0000 UTC m=+4647.710629602" lastFinishedPulling="2026-03-13 10:29:33.50071988 +0000 UTC m=+4656.230620071" observedRunningTime="2026-03-13 10:29:34.142518333 +0000 UTC m=+4656.872418524" watchObservedRunningTime="2026-03-13 10:29:34.149933825 +0000 UTC m=+4656.879834016" Mar 13 10:29:36 crc kubenswrapper[4841]: I0313 10:29:36.127574 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rfl2f" event={"ID":"55d3f707-5363-4006-927c-a1317b59914f","Type":"ContainerStarted","Data":"b052d13e57fcecc74389700852d2e4eed94c14601715a0adced8c33c9935de29"} Mar 13 10:29:36 crc kubenswrapper[4841]: I0313 10:29:36.159893 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rfl2f" podStartSLOduration=3.157522342 podStartE2EDuration="11.15987512s" podCreationTimestamp="2026-03-13 10:29:25 +0000 UTC" firstStartedPulling="2026-03-13 10:29:27.002178747 +0000 UTC m=+4649.732078938" lastFinishedPulling="2026-03-13 10:29:35.004531515 +0000 UTC m=+4657.734431716" observedRunningTime="2026-03-13 10:29:36.151848008 +0000 UTC m=+4658.881748199" watchObservedRunningTime="2026-03-13 10:29:36.15987512 +0000 UTC m=+4658.889775311" Mar 13 10:29:36 crc kubenswrapper[4841]: I0313 10:29:36.203055 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rfl2f" Mar 13 10:29:36 crc kubenswrapper[4841]: I0313 10:29:36.203125 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rfl2f" Mar 13 10:29:37 crc kubenswrapper[4841]: I0313 10:29:37.248032 4841 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rfl2f" podUID="55d3f707-5363-4006-927c-a1317b59914f" containerName="registry-server" probeResult="failure" output=< Mar 13 10:29:37 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s Mar 13 10:29:37 crc kubenswrapper[4841]: > Mar 13 10:29:38 crc kubenswrapper[4841]: I0313 10:29:38.001202 4841 scope.go:117] "RemoveContainer" containerID="97b2543947995b5a4bd4d82bd1ec4b1b0c34ff8a855d760224d5b9b3fbc817f1" Mar 13 10:29:39 crc kubenswrapper[4841]: I0313 10:29:39.281548 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-h227v" event={"ID":"e49b836b-f6cf-4cee-b1be-6bd7864fb7f2","Type":"ContainerStarted","Data":"842efddd0b5b48d0d966404c835817ec112c538bdfeacbf44f4438713550f942"} Mar 13 10:29:43 crc kubenswrapper[4841]: I0313 10:29:43.587661 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rrv7t" Mar 13 10:29:43 crc kubenswrapper[4841]: I0313 10:29:43.588113 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rrv7t" Mar 13 10:29:43 crc kubenswrapper[4841]: I0313 10:29:43.644706 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rrv7t" Mar 13 10:29:44 crc kubenswrapper[4841]: I0313 10:29:44.407318 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rrv7t" Mar 13 10:29:45 crc kubenswrapper[4841]: I0313 10:29:45.474528 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rrv7t"] Mar 13 10:29:45 crc kubenswrapper[4841]: I0313 10:29:45.659407 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-szx4f"] Mar 13 10:29:45 crc kubenswrapper[4841]: 
I0313 10:29:45.659637 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-szx4f" podUID="eb0bb4de-58c3-4d6e-a22d-735e9346f228" containerName="registry-server" containerID="cri-o://00fe213aa62b55436a947788531e56f05fa6dd546dc7b5c1fe7d2b1125134c0c" gracePeriod=2 Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.175674 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-szx4f" Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.282891 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d58lh\" (UniqueName: \"kubernetes.io/projected/eb0bb4de-58c3-4d6e-a22d-735e9346f228-kube-api-access-d58lh\") pod \"eb0bb4de-58c3-4d6e-a22d-735e9346f228\" (UID: \"eb0bb4de-58c3-4d6e-a22d-735e9346f228\") " Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.283143 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb0bb4de-58c3-4d6e-a22d-735e9346f228-utilities\") pod \"eb0bb4de-58c3-4d6e-a22d-735e9346f228\" (UID: \"eb0bb4de-58c3-4d6e-a22d-735e9346f228\") " Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.283179 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb0bb4de-58c3-4d6e-a22d-735e9346f228-catalog-content\") pod \"eb0bb4de-58c3-4d6e-a22d-735e9346f228\" (UID: \"eb0bb4de-58c3-4d6e-a22d-735e9346f228\") " Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.286016 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb0bb4de-58c3-4d6e-a22d-735e9346f228-utilities" (OuterVolumeSpecName: "utilities") pod "eb0bb4de-58c3-4d6e-a22d-735e9346f228" (UID: "eb0bb4de-58c3-4d6e-a22d-735e9346f228"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.303252 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb0bb4de-58c3-4d6e-a22d-735e9346f228-kube-api-access-d58lh" (OuterVolumeSpecName: "kube-api-access-d58lh") pod "eb0bb4de-58c3-4d6e-a22d-735e9346f228" (UID: "eb0bb4de-58c3-4d6e-a22d-735e9346f228"). InnerVolumeSpecName "kube-api-access-d58lh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.361713 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb0bb4de-58c3-4d6e-a22d-735e9346f228-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb0bb4de-58c3-4d6e-a22d-735e9346f228" (UID: "eb0bb4de-58c3-4d6e-a22d-735e9346f228"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.375810 4841 generic.go:334] "Generic (PLEG): container finished" podID="eb0bb4de-58c3-4d6e-a22d-735e9346f228" containerID="00fe213aa62b55436a947788531e56f05fa6dd546dc7b5c1fe7d2b1125134c0c" exitCode=0 Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.376767 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-szx4f" Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.377007 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szx4f" event={"ID":"eb0bb4de-58c3-4d6e-a22d-735e9346f228","Type":"ContainerDied","Data":"00fe213aa62b55436a947788531e56f05fa6dd546dc7b5c1fe7d2b1125134c0c"} Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.377069 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szx4f" event={"ID":"eb0bb4de-58c3-4d6e-a22d-735e9346f228","Type":"ContainerDied","Data":"ad8a7a3bf11d07a9d1a1980319066539a9c6b69541fb774d01d829563d7558d1"} Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.377089 4841 scope.go:117] "RemoveContainer" containerID="00fe213aa62b55436a947788531e56f05fa6dd546dc7b5c1fe7d2b1125134c0c" Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.385115 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb0bb4de-58c3-4d6e-a22d-735e9346f228-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.385149 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb0bb4de-58c3-4d6e-a22d-735e9346f228-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.385162 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d58lh\" (UniqueName: \"kubernetes.io/projected/eb0bb4de-58c3-4d6e-a22d-735e9346f228-kube-api-access-d58lh\") on node \"crc\" DevicePath \"\"" Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.400537 4841 scope.go:117] "RemoveContainer" containerID="9d257d9a5d34d469cbe7efda8689038a1fdec0cccdf59561b3d81d79619f26bf" Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.418438 4841 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-szx4f"] Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.429876 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-szx4f"] Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.440905 4841 scope.go:117] "RemoveContainer" containerID="d1eca9c57821c0287578b519933f0c25cfdc913044c7867265ea77f670ea12f9" Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.475082 4841 scope.go:117] "RemoveContainer" containerID="00fe213aa62b55436a947788531e56f05fa6dd546dc7b5c1fe7d2b1125134c0c" Mar 13 10:29:46 crc kubenswrapper[4841]: E0313 10:29:46.475627 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00fe213aa62b55436a947788531e56f05fa6dd546dc7b5c1fe7d2b1125134c0c\": container with ID starting with 00fe213aa62b55436a947788531e56f05fa6dd546dc7b5c1fe7d2b1125134c0c not found: ID does not exist" containerID="00fe213aa62b55436a947788531e56f05fa6dd546dc7b5c1fe7d2b1125134c0c" Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.475667 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00fe213aa62b55436a947788531e56f05fa6dd546dc7b5c1fe7d2b1125134c0c"} err="failed to get container status \"00fe213aa62b55436a947788531e56f05fa6dd546dc7b5c1fe7d2b1125134c0c\": rpc error: code = NotFound desc = could not find container \"00fe213aa62b55436a947788531e56f05fa6dd546dc7b5c1fe7d2b1125134c0c\": container with ID starting with 00fe213aa62b55436a947788531e56f05fa6dd546dc7b5c1fe7d2b1125134c0c not found: ID does not exist" Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.475692 4841 scope.go:117] "RemoveContainer" containerID="9d257d9a5d34d469cbe7efda8689038a1fdec0cccdf59561b3d81d79619f26bf" Mar 13 10:29:46 crc kubenswrapper[4841]: E0313 10:29:46.476026 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"9d257d9a5d34d469cbe7efda8689038a1fdec0cccdf59561b3d81d79619f26bf\": container with ID starting with 9d257d9a5d34d469cbe7efda8689038a1fdec0cccdf59561b3d81d79619f26bf not found: ID does not exist" containerID="9d257d9a5d34d469cbe7efda8689038a1fdec0cccdf59561b3d81d79619f26bf" Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.476046 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d257d9a5d34d469cbe7efda8689038a1fdec0cccdf59561b3d81d79619f26bf"} err="failed to get container status \"9d257d9a5d34d469cbe7efda8689038a1fdec0cccdf59561b3d81d79619f26bf\": rpc error: code = NotFound desc = could not find container \"9d257d9a5d34d469cbe7efda8689038a1fdec0cccdf59561b3d81d79619f26bf\": container with ID starting with 9d257d9a5d34d469cbe7efda8689038a1fdec0cccdf59561b3d81d79619f26bf not found: ID does not exist" Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.476062 4841 scope.go:117] "RemoveContainer" containerID="d1eca9c57821c0287578b519933f0c25cfdc913044c7867265ea77f670ea12f9" Mar 13 10:29:46 crc kubenswrapper[4841]: E0313 10:29:46.476292 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1eca9c57821c0287578b519933f0c25cfdc913044c7867265ea77f670ea12f9\": container with ID starting with d1eca9c57821c0287578b519933f0c25cfdc913044c7867265ea77f670ea12f9 not found: ID does not exist" containerID="d1eca9c57821c0287578b519933f0c25cfdc913044c7867265ea77f670ea12f9" Mar 13 10:29:46 crc kubenswrapper[4841]: I0313 10:29:46.476312 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1eca9c57821c0287578b519933f0c25cfdc913044c7867265ea77f670ea12f9"} err="failed to get container status \"d1eca9c57821c0287578b519933f0c25cfdc913044c7867265ea77f670ea12f9\": rpc error: code = NotFound desc = could not find container 
\"d1eca9c57821c0287578b519933f0c25cfdc913044c7867265ea77f670ea12f9\": container with ID starting with d1eca9c57821c0287578b519933f0c25cfdc913044c7867265ea77f670ea12f9 not found: ID does not exist" Mar 13 10:29:47 crc kubenswrapper[4841]: I0313 10:29:47.282852 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rfl2f" podUID="55d3f707-5363-4006-927c-a1317b59914f" containerName="registry-server" probeResult="failure" output=< Mar 13 10:29:47 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s Mar 13 10:29:47 crc kubenswrapper[4841]: > Mar 13 10:29:48 crc kubenswrapper[4841]: I0313 10:29:48.027389 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb0bb4de-58c3-4d6e-a22d-735e9346f228" path="/var/lib/kubelet/pods/eb0bb4de-58c3-4d6e-a22d-735e9346f228/volumes" Mar 13 10:29:56 crc kubenswrapper[4841]: I0313 10:29:56.247494 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rfl2f" Mar 13 10:29:56 crc kubenswrapper[4841]: I0313 10:29:56.299986 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rfl2f" Mar 13 10:29:57 crc kubenswrapper[4841]: I0313 10:29:57.053790 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rfl2f"] Mar 13 10:29:57 crc kubenswrapper[4841]: I0313 10:29:57.477978 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rfl2f" podUID="55d3f707-5363-4006-927c-a1317b59914f" containerName="registry-server" containerID="cri-o://b052d13e57fcecc74389700852d2e4eed94c14601715a0adced8c33c9935de29" gracePeriod=2 Mar 13 10:29:58 crc kubenswrapper[4841]: I0313 10:29:58.490627 4841 generic.go:334] "Generic (PLEG): container finished" podID="55d3f707-5363-4006-927c-a1317b59914f" 
containerID="b052d13e57fcecc74389700852d2e4eed94c14601715a0adced8c33c9935de29" exitCode=0 Mar 13 10:29:58 crc kubenswrapper[4841]: I0313 10:29:58.490935 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rfl2f" event={"ID":"55d3f707-5363-4006-927c-a1317b59914f","Type":"ContainerDied","Data":"b052d13e57fcecc74389700852d2e4eed94c14601715a0adced8c33c9935de29"} Mar 13 10:29:58 crc kubenswrapper[4841]: I0313 10:29:58.490989 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rfl2f" event={"ID":"55d3f707-5363-4006-927c-a1317b59914f","Type":"ContainerDied","Data":"c4e7b7111a806aa843402b2886e09a7b4aa835bb7ee7be0de0d5e2db43333d0c"} Mar 13 10:29:58 crc kubenswrapper[4841]: I0313 10:29:58.491005 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4e7b7111a806aa843402b2886e09a7b4aa835bb7ee7be0de0d5e2db43333d0c" Mar 13 10:29:59 crc kubenswrapper[4841]: I0313 10:29:59.057463 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rfl2f" Mar 13 10:29:59 crc kubenswrapper[4841]: I0313 10:29:59.138781 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d3f707-5363-4006-927c-a1317b59914f-utilities\") pod \"55d3f707-5363-4006-927c-a1317b59914f\" (UID: \"55d3f707-5363-4006-927c-a1317b59914f\") " Mar 13 10:29:59 crc kubenswrapper[4841]: I0313 10:29:59.139110 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9w5b\" (UniqueName: \"kubernetes.io/projected/55d3f707-5363-4006-927c-a1317b59914f-kube-api-access-h9w5b\") pod \"55d3f707-5363-4006-927c-a1317b59914f\" (UID: \"55d3f707-5363-4006-927c-a1317b59914f\") " Mar 13 10:29:59 crc kubenswrapper[4841]: I0313 10:29:59.139398 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d3f707-5363-4006-927c-a1317b59914f-catalog-content\") pod \"55d3f707-5363-4006-927c-a1317b59914f\" (UID: \"55d3f707-5363-4006-927c-a1317b59914f\") " Mar 13 10:29:59 crc kubenswrapper[4841]: I0313 10:29:59.139482 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55d3f707-5363-4006-927c-a1317b59914f-utilities" (OuterVolumeSpecName: "utilities") pod "55d3f707-5363-4006-927c-a1317b59914f" (UID: "55d3f707-5363-4006-927c-a1317b59914f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:29:59 crc kubenswrapper[4841]: I0313 10:29:59.140090 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d3f707-5363-4006-927c-a1317b59914f-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 10:29:59 crc kubenswrapper[4841]: I0313 10:29:59.144114 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55d3f707-5363-4006-927c-a1317b59914f-kube-api-access-h9w5b" (OuterVolumeSpecName: "kube-api-access-h9w5b") pod "55d3f707-5363-4006-927c-a1317b59914f" (UID: "55d3f707-5363-4006-927c-a1317b59914f"). InnerVolumeSpecName "kube-api-access-h9w5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:29:59 crc kubenswrapper[4841]: I0313 10:29:59.241776 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9w5b\" (UniqueName: \"kubernetes.io/projected/55d3f707-5363-4006-927c-a1317b59914f-kube-api-access-h9w5b\") on node \"crc\" DevicePath \"\"" Mar 13 10:29:59 crc kubenswrapper[4841]: I0313 10:29:59.263512 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55d3f707-5363-4006-927c-a1317b59914f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55d3f707-5363-4006-927c-a1317b59914f" (UID: "55d3f707-5363-4006-927c-a1317b59914f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:29:59 crc kubenswrapper[4841]: I0313 10:29:59.343710 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d3f707-5363-4006-927c-a1317b59914f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 10:29:59 crc kubenswrapper[4841]: I0313 10:29:59.499524 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rfl2f" Mar 13 10:29:59 crc kubenswrapper[4841]: I0313 10:29:59.532848 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rfl2f"] Mar 13 10:29:59 crc kubenswrapper[4841]: I0313 10:29:59.541568 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rfl2f"] Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.006835 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55d3f707-5363-4006-927c-a1317b59914f" path="/var/lib/kubelet/pods/55d3f707-5363-4006-927c-a1317b59914f/volumes" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.149620 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556630-rgfvl"] Mar 13 10:30:00 crc kubenswrapper[4841]: E0313 10:30:00.150025 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb0bb4de-58c3-4d6e-a22d-735e9346f228" containerName="extract-content" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.150042 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb0bb4de-58c3-4d6e-a22d-735e9346f228" containerName="extract-content" Mar 13 10:30:00 crc kubenswrapper[4841]: E0313 10:30:00.150058 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d3f707-5363-4006-927c-a1317b59914f" containerName="registry-server" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.150065 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d3f707-5363-4006-927c-a1317b59914f" containerName="registry-server" Mar 13 10:30:00 crc kubenswrapper[4841]: E0313 10:30:00.150074 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb0bb4de-58c3-4d6e-a22d-735e9346f228" containerName="extract-utilities" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.150080 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb0bb4de-58c3-4d6e-a22d-735e9346f228" 
containerName="extract-utilities" Mar 13 10:30:00 crc kubenswrapper[4841]: E0313 10:30:00.150103 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb0bb4de-58c3-4d6e-a22d-735e9346f228" containerName="registry-server" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.150109 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb0bb4de-58c3-4d6e-a22d-735e9346f228" containerName="registry-server" Mar 13 10:30:00 crc kubenswrapper[4841]: E0313 10:30:00.150119 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d3f707-5363-4006-927c-a1317b59914f" containerName="extract-utilities" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.150125 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d3f707-5363-4006-927c-a1317b59914f" containerName="extract-utilities" Mar 13 10:30:00 crc kubenswrapper[4841]: E0313 10:30:00.150147 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d3f707-5363-4006-927c-a1317b59914f" containerName="extract-content" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.150152 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d3f707-5363-4006-927c-a1317b59914f" containerName="extract-content" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.150378 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb0bb4de-58c3-4d6e-a22d-735e9346f228" containerName="registry-server" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.150393 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="55d3f707-5363-4006-927c-a1317b59914f" containerName="registry-server" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.151063 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556630-rgfvl" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.155188 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.155443 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.160153 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556630-rgfvl"] Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.161681 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.250477 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556630-cr5bl"] Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.252090 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556630-cr5bl" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.256741 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.256755 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.259207 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhpgq\" (UniqueName: \"kubernetes.io/projected/cebc6ee1-a46a-499d-8bc3-816777c3b848-kube-api-access-rhpgq\") pod \"auto-csr-approver-29556630-rgfvl\" (UID: \"cebc6ee1-a46a-499d-8bc3-816777c3b848\") " pod="openshift-infra/auto-csr-approver-29556630-rgfvl" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.262670 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556630-cr5bl"] Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.361121 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhpgq\" (UniqueName: \"kubernetes.io/projected/cebc6ee1-a46a-499d-8bc3-816777c3b848-kube-api-access-rhpgq\") pod \"auto-csr-approver-29556630-rgfvl\" (UID: \"cebc6ee1-a46a-499d-8bc3-816777c3b848\") " pod="openshift-infra/auto-csr-approver-29556630-rgfvl" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.361744 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74933d7c-b49e-4179-8dde-97586e8d7017-secret-volume\") pod \"collect-profiles-29556630-cr5bl\" (UID: \"74933d7c-b49e-4179-8dde-97586e8d7017\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556630-cr5bl" Mar 13 
10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.361865 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cmq9\" (UniqueName: \"kubernetes.io/projected/74933d7c-b49e-4179-8dde-97586e8d7017-kube-api-access-4cmq9\") pod \"collect-profiles-29556630-cr5bl\" (UID: \"74933d7c-b49e-4179-8dde-97586e8d7017\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556630-cr5bl" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.362051 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74933d7c-b49e-4179-8dde-97586e8d7017-config-volume\") pod \"collect-profiles-29556630-cr5bl\" (UID: \"74933d7c-b49e-4179-8dde-97586e8d7017\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556630-cr5bl" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.463861 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74933d7c-b49e-4179-8dde-97586e8d7017-secret-volume\") pod \"collect-profiles-29556630-cr5bl\" (UID: \"74933d7c-b49e-4179-8dde-97586e8d7017\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556630-cr5bl" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.464094 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cmq9\" (UniqueName: \"kubernetes.io/projected/74933d7c-b49e-4179-8dde-97586e8d7017-kube-api-access-4cmq9\") pod \"collect-profiles-29556630-cr5bl\" (UID: \"74933d7c-b49e-4179-8dde-97586e8d7017\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556630-cr5bl" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.464245 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74933d7c-b49e-4179-8dde-97586e8d7017-config-volume\") pod 
\"collect-profiles-29556630-cr5bl\" (UID: \"74933d7c-b49e-4179-8dde-97586e8d7017\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556630-cr5bl" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.465106 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74933d7c-b49e-4179-8dde-97586e8d7017-config-volume\") pod \"collect-profiles-29556630-cr5bl\" (UID: \"74933d7c-b49e-4179-8dde-97586e8d7017\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556630-cr5bl" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.794668 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhpgq\" (UniqueName: \"kubernetes.io/projected/cebc6ee1-a46a-499d-8bc3-816777c3b848-kube-api-access-rhpgq\") pod \"auto-csr-approver-29556630-rgfvl\" (UID: \"cebc6ee1-a46a-499d-8bc3-816777c3b848\") " pod="openshift-infra/auto-csr-approver-29556630-rgfvl" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.795330 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cmq9\" (UniqueName: \"kubernetes.io/projected/74933d7c-b49e-4179-8dde-97586e8d7017-kube-api-access-4cmq9\") pod \"collect-profiles-29556630-cr5bl\" (UID: \"74933d7c-b49e-4179-8dde-97586e8d7017\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556630-cr5bl" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.803108 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74933d7c-b49e-4179-8dde-97586e8d7017-secret-volume\") pod \"collect-profiles-29556630-cr5bl\" (UID: \"74933d7c-b49e-4179-8dde-97586e8d7017\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556630-cr5bl" Mar 13 10:30:00 crc kubenswrapper[4841]: I0313 10:30:00.870889 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556630-cr5bl" Mar 13 10:30:01 crc kubenswrapper[4841]: I0313 10:30:01.071658 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556630-rgfvl" Mar 13 10:30:01 crc kubenswrapper[4841]: I0313 10:30:01.317574 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556630-cr5bl"] Mar 13 10:30:01 crc kubenswrapper[4841]: I0313 10:30:01.518063 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556630-cr5bl" event={"ID":"74933d7c-b49e-4179-8dde-97586e8d7017","Type":"ContainerStarted","Data":"d0ee5c041931558d208abd564a839013c20f6f1226aacdf9be4c3b649fe7cb22"} Mar 13 10:30:01 crc kubenswrapper[4841]: I0313 10:30:01.518117 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556630-cr5bl" event={"ID":"74933d7c-b49e-4179-8dde-97586e8d7017","Type":"ContainerStarted","Data":"bde7314ee32f27ed975d5d2bebda84ae1abb8011857b8ef752edaaec9b2a4376"} Mar 13 10:30:01 crc kubenswrapper[4841]: W0313 10:30:01.519279 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcebc6ee1_a46a_499d_8bc3_816777c3b848.slice/crio-791c40030c3e13046828ef2e3009c2ac08603c08b4c1737626c0462da73ed158 WatchSource:0}: Error finding container 791c40030c3e13046828ef2e3009c2ac08603c08b4c1737626c0462da73ed158: Status 404 returned error can't find the container with id 791c40030c3e13046828ef2e3009c2ac08603c08b4c1737626c0462da73ed158 Mar 13 10:30:01 crc kubenswrapper[4841]: I0313 10:30:01.522129 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556630-rgfvl"] Mar 13 10:30:02 crc kubenswrapper[4841]: I0313 10:30:02.528745 4841 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-infra/auto-csr-approver-29556630-rgfvl" event={"ID":"cebc6ee1-a46a-499d-8bc3-816777c3b848","Type":"ContainerStarted","Data":"791c40030c3e13046828ef2e3009c2ac08603c08b4c1737626c0462da73ed158"} Mar 13 10:30:02 crc kubenswrapper[4841]: I0313 10:30:02.533804 4841 generic.go:334] "Generic (PLEG): container finished" podID="74933d7c-b49e-4179-8dde-97586e8d7017" containerID="d0ee5c041931558d208abd564a839013c20f6f1226aacdf9be4c3b649fe7cb22" exitCode=0 Mar 13 10:30:02 crc kubenswrapper[4841]: I0313 10:30:02.533864 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556630-cr5bl" event={"ID":"74933d7c-b49e-4179-8dde-97586e8d7017","Type":"ContainerDied","Data":"d0ee5c041931558d208abd564a839013c20f6f1226aacdf9be4c3b649fe7cb22"} Mar 13 10:30:03 crc kubenswrapper[4841]: I0313 10:30:03.948620 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556630-cr5bl" Mar 13 10:30:04 crc kubenswrapper[4841]: I0313 10:30:04.032721 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cmq9\" (UniqueName: \"kubernetes.io/projected/74933d7c-b49e-4179-8dde-97586e8d7017-kube-api-access-4cmq9\") pod \"74933d7c-b49e-4179-8dde-97586e8d7017\" (UID: \"74933d7c-b49e-4179-8dde-97586e8d7017\") " Mar 13 10:30:04 crc kubenswrapper[4841]: I0313 10:30:04.033463 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74933d7c-b49e-4179-8dde-97586e8d7017-config-volume\") pod \"74933d7c-b49e-4179-8dde-97586e8d7017\" (UID: \"74933d7c-b49e-4179-8dde-97586e8d7017\") " Mar 13 10:30:04 crc kubenswrapper[4841]: I0313 10:30:04.033558 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/74933d7c-b49e-4179-8dde-97586e8d7017-secret-volume\") pod \"74933d7c-b49e-4179-8dde-97586e8d7017\" (UID: \"74933d7c-b49e-4179-8dde-97586e8d7017\") " Mar 13 10:30:04 crc kubenswrapper[4841]: I0313 10:30:04.033904 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74933d7c-b49e-4179-8dde-97586e8d7017-config-volume" (OuterVolumeSpecName: "config-volume") pod "74933d7c-b49e-4179-8dde-97586e8d7017" (UID: "74933d7c-b49e-4179-8dde-97586e8d7017"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:30:04 crc kubenswrapper[4841]: I0313 10:30:04.034062 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74933d7c-b49e-4179-8dde-97586e8d7017-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 10:30:04 crc kubenswrapper[4841]: I0313 10:30:04.039646 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74933d7c-b49e-4179-8dde-97586e8d7017-kube-api-access-4cmq9" (OuterVolumeSpecName: "kube-api-access-4cmq9") pod "74933d7c-b49e-4179-8dde-97586e8d7017" (UID: "74933d7c-b49e-4179-8dde-97586e8d7017"). InnerVolumeSpecName "kube-api-access-4cmq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:30:04 crc kubenswrapper[4841]: I0313 10:30:04.044402 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74933d7c-b49e-4179-8dde-97586e8d7017-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "74933d7c-b49e-4179-8dde-97586e8d7017" (UID: "74933d7c-b49e-4179-8dde-97586e8d7017"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:30:04 crc kubenswrapper[4841]: I0313 10:30:04.135525 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cmq9\" (UniqueName: \"kubernetes.io/projected/74933d7c-b49e-4179-8dde-97586e8d7017-kube-api-access-4cmq9\") on node \"crc\" DevicePath \"\"" Mar 13 10:30:04 crc kubenswrapper[4841]: I0313 10:30:04.135559 4841 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74933d7c-b49e-4179-8dde-97586e8d7017-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 10:30:04 crc kubenswrapper[4841]: I0313 10:30:04.389213 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th"] Mar 13 10:30:04 crc kubenswrapper[4841]: I0313 10:30:04.402614 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556585-b64th"] Mar 13 10:30:04 crc kubenswrapper[4841]: I0313 10:30:04.567589 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556630-rgfvl" event={"ID":"cebc6ee1-a46a-499d-8bc3-816777c3b848","Type":"ContainerStarted","Data":"154d7619959dfe09cb92b96f84ce4399eebc9cd69567e975e0a96269fa32c7c2"} Mar 13 10:30:04 crc kubenswrapper[4841]: I0313 10:30:04.569238 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556630-cr5bl" event={"ID":"74933d7c-b49e-4179-8dde-97586e8d7017","Type":"ContainerDied","Data":"bde7314ee32f27ed975d5d2bebda84ae1abb8011857b8ef752edaaec9b2a4376"} Mar 13 10:30:04 crc kubenswrapper[4841]: I0313 10:30:04.569318 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556630-cr5bl" Mar 13 10:30:04 crc kubenswrapper[4841]: I0313 10:30:04.569322 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bde7314ee32f27ed975d5d2bebda84ae1abb8011857b8ef752edaaec9b2a4376" Mar 13 10:30:04 crc kubenswrapper[4841]: I0313 10:30:04.590084 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556630-rgfvl" podStartSLOduration=1.944720883 podStartE2EDuration="4.59006704s" podCreationTimestamp="2026-03-13 10:30:00 +0000 UTC" firstStartedPulling="2026-03-13 10:30:01.521727958 +0000 UTC m=+4684.251628139" lastFinishedPulling="2026-03-13 10:30:04.167074105 +0000 UTC m=+4686.896974296" observedRunningTime="2026-03-13 10:30:04.588817262 +0000 UTC m=+4687.318717453" watchObservedRunningTime="2026-03-13 10:30:04.59006704 +0000 UTC m=+4687.319967231" Mar 13 10:30:05 crc kubenswrapper[4841]: I0313 10:30:05.579353 4841 generic.go:334] "Generic (PLEG): container finished" podID="cebc6ee1-a46a-499d-8bc3-816777c3b848" containerID="154d7619959dfe09cb92b96f84ce4399eebc9cd69567e975e0a96269fa32c7c2" exitCode=0 Mar 13 10:30:05 crc kubenswrapper[4841]: I0313 10:30:05.579424 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556630-rgfvl" event={"ID":"cebc6ee1-a46a-499d-8bc3-816777c3b848","Type":"ContainerDied","Data":"154d7619959dfe09cb92b96f84ce4399eebc9cd69567e975e0a96269fa32c7c2"} Mar 13 10:30:06 crc kubenswrapper[4841]: I0313 10:30:06.005939 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68684695-4cc8-4cf4-8c9c-ef7502600c1e" path="/var/lib/kubelet/pods/68684695-4cc8-4cf4-8c9c-ef7502600c1e/volumes" Mar 13 10:30:06 crc kubenswrapper[4841]: I0313 10:30:06.926060 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556630-rgfvl" Mar 13 10:30:07 crc kubenswrapper[4841]: I0313 10:30:07.003410 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhpgq\" (UniqueName: \"kubernetes.io/projected/cebc6ee1-a46a-499d-8bc3-816777c3b848-kube-api-access-rhpgq\") pod \"cebc6ee1-a46a-499d-8bc3-816777c3b848\" (UID: \"cebc6ee1-a46a-499d-8bc3-816777c3b848\") " Mar 13 10:30:07 crc kubenswrapper[4841]: I0313 10:30:07.009831 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cebc6ee1-a46a-499d-8bc3-816777c3b848-kube-api-access-rhpgq" (OuterVolumeSpecName: "kube-api-access-rhpgq") pod "cebc6ee1-a46a-499d-8bc3-816777c3b848" (UID: "cebc6ee1-a46a-499d-8bc3-816777c3b848"). InnerVolumeSpecName "kube-api-access-rhpgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:30:07 crc kubenswrapper[4841]: I0313 10:30:07.106146 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhpgq\" (UniqueName: \"kubernetes.io/projected/cebc6ee1-a46a-499d-8bc3-816777c3b848-kube-api-access-rhpgq\") on node \"crc\" DevicePath \"\"" Mar 13 10:30:07 crc kubenswrapper[4841]: I0313 10:30:07.603848 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556630-rgfvl" event={"ID":"cebc6ee1-a46a-499d-8bc3-816777c3b848","Type":"ContainerDied","Data":"791c40030c3e13046828ef2e3009c2ac08603c08b4c1737626c0462da73ed158"} Mar 13 10:30:07 crc kubenswrapper[4841]: I0313 10:30:07.603898 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556630-rgfvl" Mar 13 10:30:07 crc kubenswrapper[4841]: I0313 10:30:07.603906 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="791c40030c3e13046828ef2e3009c2ac08603c08b4c1737626c0462da73ed158" Mar 13 10:30:07 crc kubenswrapper[4841]: I0313 10:30:07.645768 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556624-5x2jt"] Mar 13 10:30:07 crc kubenswrapper[4841]: I0313 10:30:07.655121 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556624-5x2jt"] Mar 13 10:30:07 crc kubenswrapper[4841]: I0313 10:30:07.837361 4841 scope.go:117] "RemoveContainer" containerID="3a791bc73d694965e9885725672be0c75b9baaf7a7e3487ca1bb13ae33508f68" Mar 13 10:30:08 crc kubenswrapper[4841]: I0313 10:30:08.011013 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e65a6577-0c06-4d17-8399-8847b7c795cc" path="/var/lib/kubelet/pods/e65a6577-0c06-4d17-8399-8847b7c795cc/volumes" Mar 13 10:31:07 crc kubenswrapper[4841]: I0313 10:31:07.904509 4841 scope.go:117] "RemoveContainer" containerID="c1690b1c1975618d5688398180eb92ef3c17d06179cd7f59a8fa40daed01fc7c" Mar 13 10:32:00 crc kubenswrapper[4841]: I0313 10:32:00.154599 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556632-wqrbp"] Mar 13 10:32:00 crc kubenswrapper[4841]: E0313 10:32:00.155896 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74933d7c-b49e-4179-8dde-97586e8d7017" containerName="collect-profiles" Mar 13 10:32:00 crc kubenswrapper[4841]: I0313 10:32:00.156037 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="74933d7c-b49e-4179-8dde-97586e8d7017" containerName="collect-profiles" Mar 13 10:32:00 crc kubenswrapper[4841]: E0313 10:32:00.156074 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cebc6ee1-a46a-499d-8bc3-816777c3b848" 
containerName="oc" Mar 13 10:32:00 crc kubenswrapper[4841]: I0313 10:32:00.156107 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cebc6ee1-a46a-499d-8bc3-816777c3b848" containerName="oc" Mar 13 10:32:00 crc kubenswrapper[4841]: I0313 10:32:00.156607 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="cebc6ee1-a46a-499d-8bc3-816777c3b848" containerName="oc" Mar 13 10:32:00 crc kubenswrapper[4841]: I0313 10:32:00.156667 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="74933d7c-b49e-4179-8dde-97586e8d7017" containerName="collect-profiles" Mar 13 10:32:00 crc kubenswrapper[4841]: I0313 10:32:00.160440 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556632-wqrbp" Mar 13 10:32:00 crc kubenswrapper[4841]: I0313 10:32:00.162685 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 10:32:00 crc kubenswrapper[4841]: I0313 10:32:00.162799 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-pbqn7" Mar 13 10:32:00 crc kubenswrapper[4841]: I0313 10:32:00.163826 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 10:32:00 crc kubenswrapper[4841]: I0313 10:32:00.170672 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556632-wqrbp"] Mar 13 10:32:00 crc kubenswrapper[4841]: I0313 10:32:00.235543 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltrpz\" (UniqueName: \"kubernetes.io/projected/c53bd3dc-7e14-4279-a826-0824fc7a9c43-kube-api-access-ltrpz\") pod \"auto-csr-approver-29556632-wqrbp\" (UID: \"c53bd3dc-7e14-4279-a826-0824fc7a9c43\") " pod="openshift-infra/auto-csr-approver-29556632-wqrbp" Mar 13 10:32:00 crc kubenswrapper[4841]: I0313 10:32:00.337605 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltrpz\" (UniqueName: \"kubernetes.io/projected/c53bd3dc-7e14-4279-a826-0824fc7a9c43-kube-api-access-ltrpz\") pod \"auto-csr-approver-29556632-wqrbp\" (UID: \"c53bd3dc-7e14-4279-a826-0824fc7a9c43\") " pod="openshift-infra/auto-csr-approver-29556632-wqrbp" Mar 13 10:32:00 crc kubenswrapper[4841]: I0313 10:32:00.361814 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltrpz\" (UniqueName: \"kubernetes.io/projected/c53bd3dc-7e14-4279-a826-0824fc7a9c43-kube-api-access-ltrpz\") pod \"auto-csr-approver-29556632-wqrbp\" (UID: \"c53bd3dc-7e14-4279-a826-0824fc7a9c43\") " pod="openshift-infra/auto-csr-approver-29556632-wqrbp" Mar 13 10:32:00 crc kubenswrapper[4841]: I0313 10:32:00.480828 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556632-wqrbp" Mar 13 10:32:00 crc kubenswrapper[4841]: I0313 10:32:00.949314 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556632-wqrbp"] Mar 13 10:32:01 crc kubenswrapper[4841]: I0313 10:32:01.663972 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556632-wqrbp" event={"ID":"c53bd3dc-7e14-4279-a826-0824fc7a9c43","Type":"ContainerStarted","Data":"1b980a6451b9116b16e2ba797f5a1e1e5089d6632f668c14c69d511de5da98ae"} Mar 13 10:32:02 crc kubenswrapper[4841]: I0313 10:32:02.675867 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556632-wqrbp" event={"ID":"c53bd3dc-7e14-4279-a826-0824fc7a9c43","Type":"ContainerStarted","Data":"e55cea03d5c6e85403404431d9fc6a1e27bb4e39f3cd33c1376e24ebb62abe1c"} Mar 13 10:32:02 crc kubenswrapper[4841]: I0313 10:32:02.699668 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556632-wqrbp" podStartSLOduration=1.353961055 
podStartE2EDuration="2.699643054s" podCreationTimestamp="2026-03-13 10:32:00 +0000 UTC" firstStartedPulling="2026-03-13 10:32:00.957400147 +0000 UTC m=+4803.687300338" lastFinishedPulling="2026-03-13 10:32:02.303082146 +0000 UTC m=+4805.032982337" observedRunningTime="2026-03-13 10:32:02.687752371 +0000 UTC m=+4805.417652562" watchObservedRunningTime="2026-03-13 10:32:02.699643054 +0000 UTC m=+4805.429543245" Mar 13 10:32:03 crc kubenswrapper[4841]: I0313 10:32:03.686851 4841 generic.go:334] "Generic (PLEG): container finished" podID="c53bd3dc-7e14-4279-a826-0824fc7a9c43" containerID="e55cea03d5c6e85403404431d9fc6a1e27bb4e39f3cd33c1376e24ebb62abe1c" exitCode=0 Mar 13 10:32:03 crc kubenswrapper[4841]: I0313 10:32:03.686962 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556632-wqrbp" event={"ID":"c53bd3dc-7e14-4279-a826-0824fc7a9c43","Type":"ContainerDied","Data":"e55cea03d5c6e85403404431d9fc6a1e27bb4e39f3cd33c1376e24ebb62abe1c"} Mar 13 10:32:04 crc kubenswrapper[4841]: I0313 10:32:04.409060 4841 patch_prober.go:28] interesting pod/machine-config-daemon-h227v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 10:32:04 crc kubenswrapper[4841]: I0313 10:32:04.409125 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-h227v" podUID="e49b836b-f6cf-4cee-b1be-6bd7864fb7f2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 10:32:05 crc kubenswrapper[4841]: I0313 10:32:05.079126 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556632-wqrbp" Mar 13 10:32:05 crc kubenswrapper[4841]: I0313 10:32:05.158729 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltrpz\" (UniqueName: \"kubernetes.io/projected/c53bd3dc-7e14-4279-a826-0824fc7a9c43-kube-api-access-ltrpz\") pod \"c53bd3dc-7e14-4279-a826-0824fc7a9c43\" (UID: \"c53bd3dc-7e14-4279-a826-0824fc7a9c43\") " Mar 13 10:32:05 crc kubenswrapper[4841]: I0313 10:32:05.165209 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c53bd3dc-7e14-4279-a826-0824fc7a9c43-kube-api-access-ltrpz" (OuterVolumeSpecName: "kube-api-access-ltrpz") pod "c53bd3dc-7e14-4279-a826-0824fc7a9c43" (UID: "c53bd3dc-7e14-4279-a826-0824fc7a9c43"). InnerVolumeSpecName "kube-api-access-ltrpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:32:05 crc kubenswrapper[4841]: I0313 10:32:05.261515 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltrpz\" (UniqueName: \"kubernetes.io/projected/c53bd3dc-7e14-4279-a826-0824fc7a9c43-kube-api-access-ltrpz\") on node \"crc\" DevicePath \"\"" Mar 13 10:32:05 crc kubenswrapper[4841]: I0313 10:32:05.713130 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556632-wqrbp" event={"ID":"c53bd3dc-7e14-4279-a826-0824fc7a9c43","Type":"ContainerDied","Data":"1b980a6451b9116b16e2ba797f5a1e1e5089d6632f668c14c69d511de5da98ae"} Mar 13 10:32:05 crc kubenswrapper[4841]: I0313 10:32:05.713188 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b980a6451b9116b16e2ba797f5a1e1e5089d6632f668c14c69d511de5da98ae" Mar 13 10:32:05 crc kubenswrapper[4841]: I0313 10:32:05.713164 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556632-wqrbp" Mar 13 10:32:05 crc kubenswrapper[4841]: I0313 10:32:05.769752 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556626-525sx"] Mar 13 10:32:05 crc kubenswrapper[4841]: I0313 10:32:05.787081 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556626-525sx"] Mar 13 10:32:06 crc kubenswrapper[4841]: I0313 10:32:06.006701 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0869bd1f-0261-4f36-8bb2-de2338d1d6fc" path="/var/lib/kubelet/pods/0869bd1f-0261-4f36-8bb2-de2338d1d6fc/volumes" Mar 13 10:32:07 crc kubenswrapper[4841]: I0313 10:32:07.972084 4841 scope.go:117] "RemoveContainer" containerID="65558eab5d03a53db73913970f1fe54d498c44b6c62f18ec741a928af57043c6" Mar 13 10:32:10 crc kubenswrapper[4841]: I0313 10:32:10.954781 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pnfg7"] Mar 13 10:32:10 crc kubenswrapper[4841]: E0313 10:32:10.955802 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c53bd3dc-7e14-4279-a826-0824fc7a9c43" containerName="oc" Mar 13 10:32:10 crc kubenswrapper[4841]: I0313 10:32:10.955823 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c53bd3dc-7e14-4279-a826-0824fc7a9c43" containerName="oc" Mar 13 10:32:10 crc kubenswrapper[4841]: I0313 10:32:10.956123 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c53bd3dc-7e14-4279-a826-0824fc7a9c43" containerName="oc" Mar 13 10:32:10 crc kubenswrapper[4841]: I0313 10:32:10.957957 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pnfg7" Mar 13 10:32:10 crc kubenswrapper[4841]: I0313 10:32:10.976544 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pnfg7"] Mar 13 10:32:11 crc kubenswrapper[4841]: I0313 10:32:11.071314 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4-utilities\") pod \"certified-operators-pnfg7\" (UID: \"3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4\") " pod="openshift-marketplace/certified-operators-pnfg7" Mar 13 10:32:11 crc kubenswrapper[4841]: I0313 10:32:11.071570 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md8vr\" (UniqueName: \"kubernetes.io/projected/3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4-kube-api-access-md8vr\") pod \"certified-operators-pnfg7\" (UID: \"3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4\") " pod="openshift-marketplace/certified-operators-pnfg7" Mar 13 10:32:11 crc kubenswrapper[4841]: I0313 10:32:11.072008 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4-catalog-content\") pod \"certified-operators-pnfg7\" (UID: \"3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4\") " pod="openshift-marketplace/certified-operators-pnfg7" Mar 13 10:32:11 crc kubenswrapper[4841]: I0313 10:32:11.174020 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md8vr\" (UniqueName: \"kubernetes.io/projected/3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4-kube-api-access-md8vr\") pod \"certified-operators-pnfg7\" (UID: \"3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4\") " pod="openshift-marketplace/certified-operators-pnfg7" Mar 13 10:32:11 crc kubenswrapper[4841]: I0313 10:32:11.174110 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4-catalog-content\") pod \"certified-operators-pnfg7\" (UID: \"3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4\") " pod="openshift-marketplace/certified-operators-pnfg7" Mar 13 10:32:11 crc kubenswrapper[4841]: I0313 10:32:11.174241 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4-utilities\") pod \"certified-operators-pnfg7\" (UID: \"3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4\") " pod="openshift-marketplace/certified-operators-pnfg7" Mar 13 10:32:11 crc kubenswrapper[4841]: I0313 10:32:11.174701 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4-catalog-content\") pod \"certified-operators-pnfg7\" (UID: \"3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4\") " pod="openshift-marketplace/certified-operators-pnfg7" Mar 13 10:32:11 crc kubenswrapper[4841]: I0313 10:32:11.174743 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4-utilities\") pod \"certified-operators-pnfg7\" (UID: \"3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4\") " pod="openshift-marketplace/certified-operators-pnfg7" Mar 13 10:32:11 crc kubenswrapper[4841]: I0313 10:32:11.200876 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md8vr\" (UniqueName: \"kubernetes.io/projected/3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4-kube-api-access-md8vr\") pod \"certified-operators-pnfg7\" (UID: \"3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4\") " pod="openshift-marketplace/certified-operators-pnfg7" Mar 13 10:32:11 crc kubenswrapper[4841]: I0313 10:32:11.283533 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pnfg7" Mar 13 10:32:11 crc kubenswrapper[4841]: I0313 10:32:11.601583 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pnfg7"] Mar 13 10:32:11 crc kubenswrapper[4841]: I0313 10:32:11.775361 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pnfg7" event={"ID":"3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4","Type":"ContainerStarted","Data":"5cbd6bfb05bf5dc66e327a3d4e0c8effd01b25876bcf045535451e5659ab9b23"} Mar 13 10:32:12 crc kubenswrapper[4841]: I0313 10:32:12.795634 4841 generic.go:334] "Generic (PLEG): container finished" podID="3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4" containerID="b4ab1ade98e0c421652786c306b5be11d1dc1c7381a491e121a5e9b189808453" exitCode=0 Mar 13 10:32:12 crc kubenswrapper[4841]: I0313 10:32:12.795743 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pnfg7" event={"ID":"3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4","Type":"ContainerDied","Data":"b4ab1ade98e0c421652786c306b5be11d1dc1c7381a491e121a5e9b189808453"} Mar 13 10:32:13 crc kubenswrapper[4841]: I0313 10:32:13.813743 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pnfg7" event={"ID":"3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4","Type":"ContainerStarted","Data":"6b1ffa07469c6c05f3ae4f97f8d55944362e9c35716e76aa068ba28641ba050a"} Mar 13 10:32:15 crc kubenswrapper[4841]: I0313 10:32:15.831962 4841 generic.go:334] "Generic (PLEG): container finished" podID="3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4" containerID="6b1ffa07469c6c05f3ae4f97f8d55944362e9c35716e76aa068ba28641ba050a" exitCode=0 Mar 13 10:32:15 crc kubenswrapper[4841]: I0313 10:32:15.832129 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pnfg7" 
event={"ID":"3388ca7a-e64d-44ef-bfba-3f1eee3cb9a4","Type":"ContainerDied","Data":"6b1ffa07469c6c05f3ae4f97f8d55944362e9c35716e76aa068ba28641ba050a"}